I've been playing with embeddings and wanted to see what results an embedding layer produces from simple word-by-word input with addition and subtraction, beyond the examples most videos and papers mention (like the obvious king - man + woman = queen). So I built something that doesn't just give the first answer, but ranks the matches by distance / cosine similarity. I polished it a bit so that others can try it out, too.
For now, the dataset only contains nouns (and some proper nouns), and for homographs I pick the most common interpretation. Also, it's case-sensitive.
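If you're wondering what the ranking amounts to, the core is just a few lines: sum the word vectors you add, subtract the ones you remove, then sort the whole vocabulary by cosine similarity to the result. Here's a rough sketch in Python (using gensim's pretrained GloVe vectors as a stand-in, not my actual noun dataset):

    import numpy as np
    import gensim.downloader as api

    # Any pretrained word vectors will do; this downloads ~130 MB of GloVe vectors.
    vectors = api.load("glove-wiki-gigaword-100")

    def analogy(positive, negative, topn=10):
        # Add the "positive" word vectors, subtract the "negative" ones...
        query = sum(vectors[w] for w in positive) - sum(vectors[w] for w in negative)
        query = query / np.linalg.norm(query)
        # ...then rank the whole vocabulary by cosine similarity to the result.
        sims = vectors.vectors @ query / np.linalg.norm(vectors.vectors, axis=1)
        ranked = np.argsort(-sims)
        return [(vectors.index_to_key[i], float(sims[i]))
                for i in ranked
                if vectors.index_to_key[i] not in positive + negative][:topn]

    print(analogy(["king", "woman"], ["man"]))  # "queen" should be near the top

The real tool differs mainly in the vocabulary (nouns only, one sense per homograph) and in showing the full ranked list rather than just the top hit.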
Comments URL: https://news.ycombinator.com/item?id=43988533
Points: 31
# Comments: 34