“…Vector techniques are based on the automatic processing of large text corpora representing a language, either general or domain-specific, although some studies recommend domain-specific corpora (Kwantes et al., 2016). There are different computational models in which word occurrences are algebraically vectorized, such as LSA, word2vec, or BEAGLE (for a review of vector space models see Günther, Rinaldi, & Marelli, 2019; Jorge-Botana, Olmos, & Luzón, 2020; Jones, Willits, & Dennis, 2015; or McNamara, 2011). All of them coincide in representing the lexicon in a reduced-dimensionality vector space.…”
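The core idea shared by these models can be illustrated with an LSA-style sketch: count word occurrences across documents, then reduce the dimensionality of the count matrix with truncated SVD so that each word becomes a short dense vector. The toy vocabulary, documents, and counts below are invented for illustration; they are not from any of the cited studies.

```python
import numpy as np

# Toy term-by-document count matrix (rows = words, columns = documents).
# All words and counts are hypothetical, chosen only to show the mechanics.
words = ["doctor", "nurse", "hospital", "guitar", "piano"]
X = np.array([
    [3, 2, 0, 0],   # doctor
    [2, 3, 1, 0],   # nurse
    [4, 1, 0, 0],   # hospital
    [0, 0, 3, 2],   # guitar
    [0, 0, 2, 4],   # piano
], dtype=float)

# LSA-style reduction: truncated SVD keeps only the k strongest dimensions,
# giving each word a compact vector in a reduced-dimensionality space.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
word_vecs = U[:, :k] * s[:k]

def cosine(a, b):
    """Cosine similarity, the usual proximity measure in these spaces."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

i, j, m = words.index("doctor"), words.index("nurse"), words.index("guitar")
print(cosine(word_vecs[i], word_vecs[j]))  # related terms: higher similarity
print(cosine(word_vecs[i], word_vecs[m]))  # unrelated terms: lower similarity
```

Word2vec and BEAGLE derive their vectors differently (prediction and holographic binding, respectively), but the end product is the same kind of reduced-dimensionality lexical space in which proximity reflects semantic relatedness.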