I've been playing with embeddings and wanted to see what an embedding layer produces from plain word-by-word input combined with addition and subtraction, beyond the examples most videos and papers mention (like the obvious king - man + woman = queen). So I built something that doesn't just return the single best answer, but ranks the candidate matches by distance / cosine similarity. I polished it a bit so that others can try it out, too.
For now, the dataset only contains nouns (and some proper nouns), and for homographs I pick the most common interpretation. Also, it's case-sensitive.
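To illustrate the idea (this is only a minimal sketch, not the project's actual code, and the tiny 3-dimensional "embeddings" below are made up for the example), the core operation is: add/subtract word vectors, then rank every remaining word in the vocabulary by cosine similarity to the result.

    import numpy as np

    # Toy vectors purely for illustration; real embeddings (word2vec, GloVe, etc.)
    # are much higher-dimensional and learned from large corpora.
    vectors = {
        "king":  np.array([0.8, 0.9, 0.1]),
        "queen": np.array([0.8, 0.1, 0.9]),
        "man":   np.array([0.2, 0.9, 0.1]),
        "woman": np.array([0.2, 0.1, 0.9]),
        "apple": np.array([0.9, 0.5, 0.5]),
    }

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def analogy(positive, negative, top_n=3):
        # Add the "positive" vectors, subtract the "negative" ones,
        # then rank all other words by cosine similarity to the result.
        target = sum(vectors[w] for w in positive) - sum(vectors[w] for w in negative)
        exclude = set(positive) | set(negative)
        ranked = [(w, cosine(target, v)) for w, v in vectors.items() if w not in exclude]
        return sorted(ranked, key=lambda x: x[1], reverse=True)[:top_n]

    print(analogy(positive=["king", "woman"], negative=["man"]))
    # With these toy vectors, "queen" ranks first.

The point of returning a ranked list rather than a single nearest neighbor is that the second- and third-place matches are often the interesting part of the result.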
Comments URL: https://news.ycombinator.com/item?id=43988533
Points: 31
# Comments: 34