What Comes After Science?
Mood: thoughtful
Sentiment: neutral
Category: science
Key topics: philosophy of science, scientific progress, future of research
The article 'What Comes After Science?' explores the potential future directions and implications of scientific inquiry, sparking discussion on the nature of scientific progress.
Snapshot generated from the HN discussion
Discussion Activity
- Status: active discussion
- First comment: 3h after posting
- Peak period: Day 1 (18 comments)
- Avg / period: 18 comments
- Based on 18 loaded comments
Key moments
- Story posted: 11/14/2025, 2:32:49 PM (4d ago)
- First comment: 11/14/2025, 5:52:12 PM (3h after posting)
- Peak activity: 18 comments in Day 1 (hottest window of the conversation)
- Latest activity: 11/15/2025, 2:28:03 AM (4d ago)
A while ago I read "Against Method" by Paul Feyerabend, and there's a section that really stuck with me, where he talks about the "myth" of Galileo. His point is that Galileo serves as a sort of mythological prototype of a scientist, and that by picking at the loose ends of the myth one can identify some contradictory elements in the popular conception of the "scientific method". One of his main points of contention is Galileo's faith in the telescope, his novel implementation of bleeding-edge optics technology. Feyerabend argues that Galileo developed the telescope primarily as a military instrument: it revolutionized the capabilities of artillery (and specifically naval artillery). Having secured his finances with some wealthy patrons, he then began to hunt for nobler uses of his new tool, and landed on astronomy.
Feyerabend's point (and what I'm slowly working up to) is that applying this new (and untested) military tool to what was a very ancient and venerable domain of inquiry was actually kind of scandalous. Up until that point, all human knowledge of astronomy had been generated by direct observation of the phenomena; by introducing this new tool between the human and the stars, Galileo was creating a layer of separation which had never been there before, and this was the source of much of the contemporary controversy that led to his original censure. It was one thing to base your cosmology on what could be detected by the human eye, but it seemed very "wrong" (especially to the church) to insert an unfeeling lump of metal and glass into what had before been a very "pure" interaction, one totally comprehensible to the typical educated human.
I feel like this article is expressing a very similar fear, and I furthermore think that it's kind of "missing the point" in the same way. Human comprehension is frequently augmented by technology; no human can truly "understand" a gravitational wave experientially. At best we understand the nth-order 'signs' that the phenomenon imprints on the tools we construct. I'd argue that LLMs play a similar role in their application to math, for example. It's about widening our sensor array more than it is delegating the knowledge work to a robot apprentice.
Though there is a key difference – Galileo could see through his telescope the same way, every time. He also understood what the telescope did to deliver his increased knowledge.
Compare this with LLMs, which provide different answers every time, and whose internal mechanisms are poorly understood. This introduces another level of uncertainty, one which further reduces our agency.
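To make the nondeterminism concrete, here is a toy sketch of sampled decoding; the vocabulary and logits are invented for illustration and bear no relation to any real model:

    import numpy as np

    # Toy sketch of temperature sampling: identical input, different outputs.
    # The vocabulary and logits here are invented for illustration.
    vocab = ["yes", "no", "maybe"]
    logits = np.array([2.0, 1.5, 0.5])

    def sample(temperature=1.0):
        # Softmax over temperature-scaled logits, then draw one token.
        probs = np.exp(logits / temperature)
        probs /= probs.sum()
        return np.random.default_rng().choice(vocab, p=probs)

    print([sample() for _ in range(5)])  # varies run to run, e.g. ['yes', 'no', 'yes', ...]

The same prompt yields different tokens on each call; determinism only returns if you sample greedily (temperature approaching zero).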
Gradient descent is not a total black box, although it works so well as to be unintuitive. There is ongoing "interpretability" research too, which has already produced several key results.
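For what it's worth, the update rule itself is fully explicit; here is a minimal sketch of gradient descent on a one-dimensional loss (the function and learning rate are arbitrary illustrative choices):

    # Minimal gradient descent on f(x) = (x - 3)^2, minimized at x = 3.
    # The loss function and learning rate are arbitrary illustrative choices.
    def f(x):
        return (x - 3) ** 2

    def grad_f(x):
        return 2 * (x - 3)  # analytic derivative of f

    x, lr = 0.0, 0.1  # starting point and learning rate
    for _ in range(50):
        x -= lr * grad_f(x)  # step against the gradient
    print(f"x = {x:.4f}, f(x) = {f(x):.6f}")  # converges toward x = 3

What's opaque isn't this mechanism but the emergent behavior of billions of such updates interacting.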
Actually, this is a really critical error: a core point of contention at the time was that he didn't see the same thing every time. Small variations in lens quality, weather conditions, and user error all contributed to the discovery of what we now call "instrument noise" (not to mention natural variation in the astronomical system which we simply couldn't detect with the naked eye, for example the rings of Saturn). Indeed, this point was so critical that it led to the invention of least-squares curve fitting (which, ironically, is how we got to where we are today). OLS allowed us to "tame" the parts of the system that we couldn't comprehend, but it was emphatically not a given that telescopes had inter-measurement reliability when they first debuted.
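As a concrete sketch of how least squares "tames" noise (with synthetic data, not historical measurements): many noisy readings still pin down a stable underlying trend.

    import numpy as np

    # Synthetic "noisy instrument": a true linear signal plus per-reading noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    y = 2.0 * x + 1.0 + rng.normal(0, 1.5, size=x.size)  # true slope 2, intercept 1

    # Ordinary least squares: choose the line minimizing the sum of squared residuals.
    A = np.column_stack([x, np.ones_like(x)])
    (slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)

    print(f"slope={slope:.2f}, intercept={intercept:.2f}")  # lands near (2.0, 1.0)

No individual reading is trustworthy, yet the fit recovers the signal — which is exactly the move that made unreliable instruments scientifically usable.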
Did Feyerabend not also argue that Galileo's claim that Copernicus's theory was proven was false, given that it was not the hypothesis best supported by the evidence available at the time?
I very much agree with your last paragraph. Telescopes are comprehensible.
My reading of AM was that it's less about what's "true" or "false" and more about how the actual structure of the scientific argument compares to what's claimed about it. The (rough) point (as I understand it) is that Galileo's scientific "findings" were motivated by human desires for wealth and success (what we might call historically contingent or "political" factors) as much as they were by "following the hard evidence".
> Telescopes are comprehensible.
"Comprehensible" is a relative measure, I think. Incomprehensible things become comprehensible with time and familiarity.
I found it intellectually reprehensible then, and now.
That's the primary difference between science and engineering.
In science, understanding how it works is critical, and doing something with that understanding is optional. In engineering, getting the desired outcome is critical, and understanding why it works is optional.
Anyway, the "we have AI, so there will soon be no more things to discover" idea is similar to what was thought at the end of the XIX century: that everything had been discovered and only increasing precision was left. At the very least, we have a lot to learn about ourselves and how we understand reality, in light of what AI could uncover with methods different from the traditional ones.
You can also view science as a rejection of the ability to predict (arbitrary) things. Any illusion otherwise is simply seemingly reliable knowledge of the past and present. The rise of, e.g., disinformation and misinformation, siloed communication, and the replication crisis could presage a future where confidence is generally lower than in the past and predictive power is more limited.
I caution heavily against the idea that what you perceive as "progress" is inevitable or will follow past trends.
I’m curious. Why did you write it this way vs. “19th”? People from 400 AD to 1400 AD used to write it that way. I’m assuming you’re either very old or a history buff.
This is not entirely new. For example, we had working (if inefficient) steam engines and pumps long before the development of thermodynamics. We had beer and cheese long before microbiology.
3 more comments available on Hacker News