Sep 29, 2025 at 1:56 AM EDT
We won't get to AGI if we don't get models with both larger context and dreaming (aka distilling the important parts of their 'day-long' context and folding them back into their weights) at roughly the same effort/cost as inference. LLMs cannot do this and won't be able to, so unless someone comes up with a better model, AGI cannot be reached no matter how much money is invested, and we will get an AI winter. So many smart minds are on this now that if anyone had an idea for how to 'learn during inference', someone would have released something by now. No one has a clue, so I am betting the downfall will come soon. Still, we got incredible progress out of this AI boom, so it is not all bad, just money sloshing around.
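For concreteness, here is a minimal toy sketch of the kind of "dreaming" step the comment is asking for: after a session, score snippets of the day's context by how surprising they still are to the model, then take a few gradient steps on the most surprising ones so they end up in the weights rather than the context window. The tiny byte-level model, the surprise-based selection, and the hyperparameters are all illustrative assumptions, not how any real LLM works; the hard open problem (doing this at roughly inference cost without forgetting everything else) is exactly what the comment says is missing.

```python
# Toy sketch of offline "dreaming": consolidate surprising snippets from the
# day's context into the model's weights with a few gradient steps.
# Everything here is illustrative; it is not an existing system.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, CTX = 256, 64, 32  # byte-level toy vocabulary and context length

class TinyLM(nn.Module):
    """A deliberately tiny next-byte predictor standing in for an LLM."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, x):                      # x: (batch, seq) of byte ids
        h, _ = self.rnn(self.emb(x))
        return self.head(h)                    # logits: (batch, seq, VOCAB)

def snippet_loss(model, text: str) -> torch.Tensor:
    """Next-byte cross-entropy on one snippet; high loss = 'surprising'."""
    ids = torch.tensor([list(text.encode("utf-8"))[:CTX]])
    logits = model(ids[:, :-1])
    return F.cross_entropy(logits.reshape(-1, VOCAB), ids[:, 1:].reshape(-1))

def dream(model, day_context: list[str], keep: int = 2, steps: int = 20):
    """Offline consolidation: pick the most surprising snippets from the
    day's context and fine-tune the weights on them for a few steps."""
    with torch.no_grad():
        scored = sorted(day_context,
                        key=lambda s: snippet_loss(model, s).item(),
                        reverse=True)
    selected = scored[:keep]
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        for s in selected:
            opt.zero_grad()
            snippet_loss(model, s).backward()
            opt.step()
    return selected

if __name__ == "__main__":
    torch.manual_seed(0)
    model = TinyLM()
    day = ["the cafe on 5th street closes at noon on sundays",
           "the user prefers terse answers with no emoji",
           "hello hello hello hello hello hello hello"]
    before = {s: snippet_loss(model, s).item() for s in day}
    kept = dream(model, day)
    after = {s: snippet_loss(model, s).item() for s in day}
    for s in kept:
        print(f"loss {before[s]:.3f} -> {after[s]:.3f} : {s[:40]}")
```

The toy shows the mechanics (select, fine-tune, retain), but not the part the comment argues is unsolved: doing this continuously, cheaply, and without degrading everything the model already knows.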