Vector-Space Models of Semantic Representation From a Cognitive Perspective: A Discussion of Common Misconceptions

Volume: 14, Issue: 6, Pages: 1006 - 1033
Published: Sep 10, 2019
Abstract
Models that represent meaning as high-dimensional numerical vectors, such as latent semantic analysis (LSA), hyperspace analogue to language (HAL), bound encoding of the aggregate language environment (BEAGLE), topic models, global vectors (GloVe), and word2vec, have been introduced as extremely powerful machine-learning proxies for human semantic representations and have seen an explosive rise in popularity over the past 2 decades. However,...
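As a minimal illustration of the shared idea behind these models (a sketch for this summary, not code from the paper), the snippet below compares hypothetical word vectors with cosine similarity, the standard relatedness measure in vector-space semantics; the words and vector values are invented for the example.

    import numpy as np

    def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
        # Cosine of the angle between two word vectors:
        # 1 = same direction, 0 = orthogonal (unrelated), -1 = opposite.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Hypothetical low-dimensional vectors standing in for the high-dimensional
    # representations produced by models such as LSA, GloVe, or word2vec.
    vectors = {
        "dog":    np.array([0.8, 0.1, 0.6]),
        "cat":    np.array([0.7, 0.2, 0.5]),
        "violin": np.array([0.1, 0.9, 0.2]),
    }

    print(cosine_similarity(vectors["dog"], vectors["cat"]))     # high: related words
    print(cosine_similarity(vectors["dog"], vectors["violin"]))  # low: unrelated words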