Word2vec-Style Vector Arithmetic on Docs Embeddings
Source: technicalwriting.dev (Tech story)
Key topics
- Embeddings
- NLP
- Machine Learning
The article explores applying Word2Vec-style vector arithmetic to document embeddings, sparking discussion on the technique's potential applications and limitations.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion. First comment: 6d after posting. Peak period: 5 comments in the 156-168h window. Average per period: 3 comments.
Key moments
- Story posted: Oct 31, 2025 at 3:49 PM EDT (2 months ago)
- First comment: Nov 7, 2025 at 2:29 AM EST (6d after posting)
- Peak activity: 5 comments in the 156-168h window, the hottest stretch of the conversation
- Latest activity: Nov 7, 2025 at 12:30 PM EST (2 months ago)
ID: 45775988 · Type: story · Last synced: 11/20/2025, 1:30:03 PM
Would be cool to add together the vectors for Harry Potter and The Lord of the Rings and then decode that into a new book about Frodo going to wizard school to collect the ring and help push Voldemort into Mount Doom.
https://news.ycombinator.com/item?id=45784455
https://news.ycombinator.com/item?id=45756599
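For anyone who wants to play with the retrieval half of that idea, here is a minimal sketch of word2vec-style arithmetic on document embeddings using sentence-transformers. The model name (all-MiniLM-L6-v2) and the toy plot summaries are my own assumptions for illustration, not anything from the article:

    import numpy as np
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")

    docs = [
        "A young wizard attends a school of magic and battles a dark lord.",
        "A hobbit journeys to a volcano to destroy a ring of power.",
        "A detective solves crimes in Victorian London.",
        "A hobbit enrolls at a school of magic to destroy a dark lord's ring.",
    ]
    # Unit-length embeddings, so the dot products below are cosine similarities
    emb = model.encode(docs, normalize_embeddings=True)

    # "Add" two plots the way word2vec adds word vectors
    query = emb[0] + emb[1]
    query /= np.linalg.norm(query)

    # Find the stored document closest to the blended vector,
    # excluding the two source documents themselves
    scores = emb @ query
    scores[[0, 1]] = -1.0
    print(docs[int(np.argmax(scores))])

On a toy corpus like this the blended vector lands nearest the mash-up plot; whether a decoder could turn that vector into a readable book is exactly the open question.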
It turns out you can embed arbitrary information into a fixed-size vector, which is really useful for later processing. vec2text (https://github.com/vec2text/vec2text) is an excellent tool if you want to invert the embeddings back to text. This lets you encode arbitrary data into standardized vectors and decode it all the way back.
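Combining the two ideas, here is a sketch of the round trip: embed two texts with a GTR encoder, mix the vectors, and decode the blend back to text. The load_pretrained_corrector and invert_embeddings calls follow the vec2text README; the mean pooling and the two example sentences are my assumptions, and a CUDA GPU is assumed:

    import torch
    import vec2text
    from transformers import AutoModel, AutoTokenizer

    # GTR encoder whose embeddings the pretrained "gtr-base" corrector can invert
    encoder = AutoModel.from_pretrained("sentence-transformers/gtr-t5-base").encoder.to("cuda")
    tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/gtr-t5-base")
    corrector = vec2text.load_pretrained_corrector("gtr-base")

    def embed(texts):
        # Mean-pooled GTR embeddings over non-padding tokens
        inputs = tokenizer(texts, return_tensors="pt", max_length=128,
                           truncation=True, padding="max_length").to("cuda")
        with torch.no_grad():
            hidden = encoder(**inputs).last_hidden_state
        mask = inputs["attention_mask"].unsqueeze(-1)
        return (hidden * mask).sum(1) / mask.sum(1)

    emb = embed(["A young wizard studies magic at a hidden school.",
                 "A hobbit carries a ring to a volcano to destroy it."])

    # Word2vec-style mixing, then decode the blended vector back into text
    blended = emb.mean(dim=0, keepdim=True)
    print(vec2text.invert_embeddings(embeddings=blended, corrector=corrector))

Temper expectations: vec2text reconstructs short passages, not whole books, so the decoded blend reads more like one sentence mixing imagery from both inputs than Frodo-at-Hogwarts.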