The Brain Navigates New Spaces by 'darting' Between Reality and Mental Maps
Key topics
The article discusses a study on how the brain navigates new spaces by switching between reality and mental maps, sparking discussions on its implications for understanding human cognition and potential connections to AI and consciousness.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 5h after posting
- Peak period: 66 comments (Day 8)
- Avg / period: 26.7
Based on 80 loaded comments
Key moments
- Story posted: Oct 8, 2025 at 1:51 AM EDT (3 months ago)
- First comment: Oct 8, 2025 at 6:26 AM EDT (5h after posting)
- Peak activity: 66 comments in Day 8 (hottest window of the conversation)
- Latest activity: Oct 16, 2025 at 2:53 PM EDT (3 months ago)
> Before the rats encountered the detour, the research team observed that their brains were already firing in patterns that seemed to "imagine" alternate unfamiliar mental routes while they slept. When the researchers compared these sleep patterns to the neural activity during the actual detour, some of them matched.
> “What was surprising was that the rats' brains were already prepared for this novel detour before they ever encountered it,”
Suppose further that all events are a draw of type 1, 2, 3, or 4, and that our memory keeps a count and updates the distribution over types - essentially a frequency distribution.
When we encounter a stimulus, we have to (1) recognize it and (2) assign a reward valence to it. If we only ever observed '3', the distribution would become very peaked. Correspondingly, this suggests that we would recognize '3' events faster and be better at assigning a reward valence to those events.
Then if we ever encounter a non-3 event, we would recognize it more slowly - it is well-established that recognition is tied to encounter frequency - and do a poorer job assigning reward valence to it. Together this means that we would do a bad job selecting the appropriate response.
Perhaps this scenario-based dreaming keeps us (and rats) primed so we're not flat-footed in new scenarios.
The question then becomes - if these scenarios are purely imagined, where are they being sampled from? If we never observe 1, 2, and 4...how do we know that these are the true list of alternative scenarios?
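A minimal sketch of that frequency-count framing, for concreteness (the event types, the uniform prior, and the latency-as-inverse-frequency rule are all illustrative assumptions, not anything from the article or the study):

```python
import random
from collections import Counter

# Hypothetical sketch: memory as a running frequency count over event types 1-4.
# Recognition latency is assumed to shrink as an event type becomes more familiar.
counts = Counter({1: 1, 2: 1, 3: 1, 4: 1})  # uniform prior: every type "seen" once

def observe(event_type):
    """Update the frequency distribution after encountering an event."""
    counts[event_type] += 1

def recognition_latency(event_type):
    """Toy rule: latency is inversely related to how often we've seen this type."""
    p = counts[event_type] / sum(counts.values())
    return 1.0 / p  # rarer events take longer to recognize

# A world that only ever produces type-3 events makes the distribution very peaked.
for _ in range(100):
    observe(3)

print(recognition_latency(3))  # small: fast recognition of the familiar event
print(recognition_latency(1))  # large: slow recognition of the rarely seen event

# "Scenario-based dreaming" could be read as injecting imagined samples of 1, 2, 4
# so their counts never collapse toward zero; where those samples come from is
# exactly the open question raised above.
for imagined in random.choices([1, 2, 4], k=30):
    observe(imagined)
print(recognition_latency(1))  # latency drops after offline "rehearsal"
```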
https://kemendo.com/Deja-Vu-Experiment.html
I think it supports my three loops hypothesis as well:
https://kemendo.com/ThreeLoops.html
In effect, my position is that biological systems maintain a synchronized processing pipeline, in which the hippocampal prediction system operates slightly “ahead” of sensory processing, like a cache buffer.
If the processing falls “behind” the sensory input, you feel like you’re accessing memory, because the electrical signal reaches memory and sensory distribution simultaneously or with a slight lag.
So it means you’re constantly switching between your world map and the input and comparing them just to stabilize a “linear” experience - something which is a necessity for corporeal prediction and reaction.
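A toy sketch of that cache-buffer framing (the running-average "world model", the one-step lookahead, and the lag check are illustrative assumptions, not claims about the linked pieces):

```python
class ToyWorldModel:
    """Stand-in for the hippocampal map: predicts the next input as a running average."""
    def __init__(self):
        self.estimate = 0.0

    def predict(self):
        return self.estimate

    def update(self, observed):
        self.estimate += 0.3 * (observed - self.estimate)


def run(stream):
    model = ToyWorldModel()
    pending_prediction = None  # the prediction issued "ahead" of the next input
    for observed in stream:
        if pending_prediction is None:
            # Prediction has fallen behind the input: in this framing,
            # that lag is what feels like accessing memory.
            print(f"lagging: observed {observed} with no prediction ready")
        else:
            error = abs(pending_prediction - observed)
            print(f"predicted {pending_prediction:.2f}, observed {observed}, error {error:.2f}")
        # Fold the new input into the world map, then queue the next prediction.
        model.update(observed)
        pending_prediction = model.predict()


run([1.0, 1.0, 1.2, 5.0, 1.1])  # a mostly stable stream with one surprise
```

The point is only the constant compare-and-resynchronize loop between the internal map and the incoming stream; everything interesting in the comment above is about what happens when the two fall out of step.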
It's inherent to the meaning of the word.
You can train a computer to correspond to an individual's idiosyncratic brain state for their word voxels, but no one has yet reduced the material to a single repeatable voxel state.
“We refute (based on empirical evidence) claims that humans use linguistic representations to think.” Ev Fedorenko Language Lab MIT 2024
The problem with the materialist POV is that it doesn't solve the most basic question of brain states. No, not everything is material.
There clearly are processes, like oscillations, that require material to some extent, but are not material themselves. And that's the problem with the materialist camp. If the oscillations, dynamically integrated, are the source of intel/consciousness, then material may not even be a requirement of life. We may just be material sinks.
I understand.
There is, however, a flaw in that thinking.
There is no oscillation that exists outside of some material/medium to oscillate. I agree it is important to distinguish the water from the wave. There is no light wave without the photon. Thus - I strongly suspect - there is no consciousness without the brain (or similar medium).
As all our explanations are immaterial - they are post hoc observations - to claim any direction for the role of material is to sportscast the existence of material. There is no consciousness without the process; the material may be secondary, as its explanation is a process as well.
We haven't yet found the framework that puts the material in its place, whether it's eliminative materialism or another state-process pairing that cuts materialism down to a partner role. The jury is still out, but materialism isn't the answer.
While I hold a similar view as Sean Carroll that it is basically hand-waving to say we'll never understand consciousness, I can't discount Donald Hoffman's Interface theory of perception and that evolutionary fitness requires we only perceive four dimensions (but there could be more as hypothesised in string theory).
Wondering if you have any ideas on this, which can be quite jarring when it happens:
You are thinking about something, and then walk through a doorway into another room and suddenly completely lose track of what you were thinking.
The closest idea I've seen for that is from Jeff Hawkins, who in his Thousand Brains Theory of Intelligence made the point that learning is a function of navigation, and that the world models we construct are set in the context of the location where we create them.
--------
Edit: Just read your piece on Faith: "Faith, as it’s traditionally understood, is trivial bullshit compared to the towering, unseen faith we place in the empirical all day everyday."
Absolutely correct, and I don't believe the traditional understanding of Hebrews 11:1 reflects what the author (supposedly Paul) was trying to convey.
Ἔστιν δὲ πίστις ἐλπιζομένων ὑπόστασις, πραγμάτων ἔλεγχος οὐ βλεπομένων ("Now faith is the substance of things hoped for, the evidence of things not seen.")
πίστις: Pistis can be translated as confidence, as in: I'm confident this chair won't collapse when I sit on it. Much stronger than belief or faith.
ὑπόστασις: Hupostasis is also a much stronger word than assurance, it conveys substance, as in your past experience backs up your confidence.
There is a fine line between this and wisdom. The Default Mode Network (DMN) is the brain's "simulation machine". When you're not focused on a specific task, the DMN fires up, allowing you to daydream, remember the past, plan for the future, and contemplate others' perspectives.
Wisdom is not about turning the machine off; it's about becoming the director of the movie it's playing. What separates a creative genius envisioning a new world from a person trapped in a state of torment isn't the hardware, but the learned software of regulation, awareness, and perspective.
Wisdom is the process of learning to aim this incredible, imaginative power toward flourishing instead of suffering. Saying it can "trap us in intrusive memories or hallucinations" captures the negative side, but there is also a positive side to it all.
No, it's hardware. There is no amount of 'wisdom' bootstrap-pulling that will make you not schizophrenic.
Maybe we need a new acronym. Self Programmable Neuron Array. SPNA.
Or is the brain not mathematical?
https://plato.stanford.edu/entries/mind-indian-buddhism/
https://en.wikipedia.org/wiki/Prat%C4%ABtyasamutp%C4%81da
Consciousness is an attention mechanism. That inward regard, evaluating how the self reacts to the world, is attention being paid to the body's feelings. The outward regard then maps those feelings onto local space. Consciousness is watching your feelings as a kind of HUD on the world. It correlates feelings to things.
Orchestrated objective reduction, or just an emergent property of:
Our 86 billion neurons, every single one a deafeningly complex molecular machine with hundreds of millions of receptors of hundreds of different types, monoamine oxidases, (reuptake) transporters, and connections to other neurons.
Anthropic principle: because it does. If it didn't feel like anything, it wouldn't. But it does, so it does.
https://en.wikipedia.org/wiki/Anthropic_principle
Explain the first part of this sentence.
If we had all those animals, especially those around the time of the Cambrian explosion, to experiment on as they developed, it would probably make more sense in the 'but it does' department. This is also why your math teacher wants you to show your work.
They said it clearly amplified the internal part of some visual perception loop, in fairly straightforward ways. For example, intentionally trying to see something as it wasn't (like a shadow as a snake) would make it be seen that way (the shadow would take on a clear snake appearance, and even move a bit).
Some simple examples are all the face optical illusions (Thatcher, reverse mask, etc), that show our perception of a face is in no way direct.
Consciousness could still be the self reaction to this sub-conscious predictive/generative function.
Who's doing the looking?
The gist from my memory of 15+ years ago is that the brain needs to model the world and then itself within the world, creating a model that is transparent to itself, situated in the world.
It was about replacing backprop with a mechanism that checked outcomes against predictions, and just adjusted parameters that deviated from the predictions rather than the entire path. It wasn't suitable for digital machines (because it isn't any more efficient on digital machines) but it worked on analog models. If anybody remembers this, I'd appreciate the link.
I might be garbling the paper because it's from memory and I'm not an expert, but hopefully it's recognizable.
A recent paper posted here also looked at recurrent neural nets and how simplifying the design to its core amounted to just having a latent prediction and repeatedly adjusting that prediction.
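Since neither comment names the paper, here is only a hedged, generic sketch of that shape of learning rule - check outcomes against a latent prediction and nudge just the parameters that produced the error, with no full backward pass. The feature map, learning rate, and toy task are all made up for illustration:

```python
import numpy as np

# Illustrative delta-rule style update: only the weights responsible for the
# prediction error get adjusted, and the adjusted prediction is carried forward
# as a latent state. This is NOT the half-remembered paper's algorithm.

rng = np.random.default_rng(0)

W = rng.normal(scale=0.1, size=4)   # parameters of the predictor
latent = np.zeros(4)                # latent state carried between steps
lr = 0.05

def features(x, latent):
    # Hypothetical feature map mixing the current input with the latent state.
    return np.array([x, latent[0], latent[1], 1.0])

# Toy task: predict the next value of a noisy sine wave.
xs = np.sin(np.linspace(0, 12, 300)) + rng.normal(scale=0.05, size=300)

for t in range(len(xs) - 1):
    phi = features(xs[t], latent)
    prediction = W @ phi            # latent prediction of the next observation
    error = xs[t + 1] - prediction  # check the outcome against the prediction
    W += lr * error * phi           # adjust only along the direction that erred
    latent = np.roll(latent, 1)
    latent[0] = prediction          # carry the adjusted prediction forward

print(f"final absolute prediction error: {abs(error):.3f}")
```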
When I'm driving I'm constantly making predictions about the future state of the highway and acting on that. For example before most people change lanes, even without using a signal they'll look and slightly move the car in that direction, up to a full second before they actually do it. Or I see two cars that are going to end up in a conflict state (trying to take the same location on the highway) so I pivot away from them and the recovery they will have to make.
Self-driving cars, for all I know, are purely reactive. They can't pick up on these things beforehand at this time and preemptively put themselves in a safer position. Bad/distracted/unaware drivers are not only reactive, they'll have a much slower reaction time than a self-driving car.
It no more requires reasoning about the future as such than does stopping when someone or something is actually in the way (and thus the car will hit it in the future).
Clicking on it brought me to a 404. I found afterwards that the "correct" article does not mention "darting" but "flickering", even in the URL. Why are some people mentioning "darting" when it appears nowhere in the text except on HN? Am I having a Mandela effect? /S
My take on this, especially in regard to debugging IT issues, is that you have to constantly verify and update your mental model (check your premises!) in order to better weed out problems.
I often find myself lost in my mental maps in daily life (Living inside my head) unless I'm in a nice novel environment. Meditation helps, however.
And dreams are simulation-based training to make life easier, decision-making more efficient?
What kind of next level machinery is this?! ;D
Anecdotally it is striking to see the contrast as a member of the former group talking to people of the latter. They have truly no idea where places are or how close they are to other places. It is like these network connections aren't being made by them at all. No sense of scale either of how large a place is or how far away another place might be. I imagine this dependency on turn by turn navigation with no spatial awareness leads to quite different outcomes in terms of modes of thinking.
I mean, when I think about going to a place I am constructing a mental map of the actual city map. I am considering geography, cardinal directions, major corridors and their connectivity along the route, rough estimates of distance, etc. My CPUs are being used, no doubt. For others, though, it is like a blankness in that wake. CPUs idle. Follow the arrows. Who knows where north is? What is a mile?