History of Declarative Programming (2021)
Key topics
Delving into the history of declarative programming, a fascinating document has sparked a lively discussion around Church numerals and lambda calculus. Commenters are dissecting the mathematical concepts, with some questioning the precision of the explanations and others drawing parallels with Peano's definition. The conversation takes a practical turn as the author reveals that accessing related academic resources, like Church's notes, may require academic credentials. Meanwhile, others are struggling with the tedious format of the referenced book, with some pleading for a PDF version.
Snapshot generated from the HN discussion
Discussion Activity
Active discussion
- First comment: 1h
- Peak period: 11 comments in 0-6h
- Avg / period: 3
Based on 24 loaded comments
Key moments
- 01 Story posted: Dec 14, 2025 at 5:46 PM EST (20 days ago)
- 02 First comment: Dec 14, 2025 at 6:57 PM EST (1h after posting)
- 03 Peak activity: 11 comments in 0-6h (hottest window of the conversation)
- 04 Latest activity: Dec 18, 2025 at 5:21 AM EST (16 days ago)
Related to this: does anyone know if there's any document that delves into how Church landed on Church numerals in particular? I get how they work, etc, but at least the papers I saw from him seem to just drop the definition out of thin air.
Were Church numerals capturing some canonical representation of the naturals in logic that was just well known in the domain at the time? Are there any notes or the like that provide more insight?
I forgot the name of this, but they seem equivalent to successors in math. In low-level mathematical theory you represent numbers as sequences of successors from 0 (or 1, I forget).
Basically you have one, then the successor of one, which is two, then the successor of two, and so on. So a number n is n successor operations from one.
To me it seems Church numerals replace this successor operation with a function, but it's the same idea.
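That correspondence can be made concrete with a minimal Python sketch (my own illustration; the thread itself contains no code for this):

```python
# Church numerals: the number n is "apply a function f, n times".
zero = lambda f: lambda x: x                     # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))  # one more application of f

# Decode back to an ordinary int by choosing f = (+1) and x = 0.
to_int = lambda n: n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
```

Decoding `three` walks 0 → 1 → 2 → 3, which is exactly the Peano picture: count how many successor steps you are from zero.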
While defining numbers in terms of their successors is decently doable, this logical jump (that works super well, all things considered!) to making numbers take _both_ the successor _and_ the zero just feels like a great idea, and it's a shame to me that the papers I read from Church don't hint at how he got there.
After the fact, with all the CS reflexes we have, it might be ... easier to reach this definition if you start off "knowing" you could implement everything using just functions and with some idea of not having access to a zero, but even then I think most people would expect these objects to be some sort of structure rather than a process.
There is, of course, the other possibility which is just that I, personally, lack imagination and am not as smart as Alonzo Church. That's why I want to know the thought process!
Zero is not the identity function. Zero takes a function and calls it zero times. The end result of this is that it returns the identity function. In Haskell it would be `const id` instead of `id`.
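A quick Python sketch (mine, not from the thread) of why the bare identity function can't serve as zero: decoded as a numeral, the identity behaves like *one*, because Church one is λf.λx. f x, which eta-reduces to λf. f, i.e. the identity.

```python
# Church zero ignores f and returns the identity ("const id" in Haskell):
zero = lambda f: lambda x: x
ident = lambda x: x                      # the plain identity function

# Decode a numeral: apply it to (+1), then to 0.
to_int = lambda n: n(lambda k: k + 1)(0)
```

Here `to_int(zero)` gives 0, but `to_int(ident)` gives 1: the identity passes `(+1)` straight through and applies it once.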
I suspect that this minor misconception may lead you to an answer to your original question! Why isn't the identity function zero? Given that everything in lambda calculus is a function, and the identity function is the simplest function possible, it would make sense to at least try!
If you try, I suspect you'll quickly find that it starts to break down, particularly when you start trying to treat your numerals as functions (which is, after all, their intended purpose).
Church numerals are a minimal encoding. They are as simple as it possibly gets. This may not speak to Church's exact thought process, but I think it does highlight that there exists a clear process that anyone might follow in order to get Church's results. In other words, I suspect that his discovery was largely mechanical, rather than a moment of particularly deep insight. (And I don't think this detracts from Church's brilliance at all!)
¹https://chatgpt.com/share/693f575d-0824-8009-bdca-bf3440a195...
The jump from "there is a successor operator" to "numbers take a successor operator" is interesting to me. I wonder if it was the first computer science-y "oh I can use this single thing for two things" moment! Obviously not the first in all of science/math/whatever but it's a very good idea
> It is rather well-known, through Peano's own acknowledgement, that Peano […] made extensive use of Grassmann's work in his development of the axioms. It is not so well-known that Grassmann had essentially the characterization of the set of all integers, now customary in texts of modern algebra, that it forms an ordered integral domain in which each set of positive elements has a least member. […] [Grassmann's book] was probably the first serious and rather successful attempt to put numbers on a more or less axiomatic basis.
Unfortunately, I don’t think one can be linked given the author’s note.
Images of text, even at a size I'd be comfortable with, are something that just breaks how I read online.
What you wrote makes me think more of the point-free style.
Here's a functional example:
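The snippet itself didn't survive extraction; a plausible stand-in (my own reconstruction, in Python) is a pure map over a list, where no multiplication depends on any other:

```python
# Hypothetical reconstruction of the lost snippet: a pure functional
# expression. Each element's doubling is independent of the rest, so
# the order the multiplications run in is semantically unobservable.
nums = [1, 2, 3, 4]
doubled = list(map(lambda x: x * 2, nums))
```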
Are those multiplications run in sequence or in parallel?
Here's a fancier functional one:
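Again the original block is missing; a hypothetical reconstruction (names like `fetch_name` are my own invention) builds a record from several independent fetches:

```python
# Hypothetical reconstruction: three independent "fetches" feeding one
# record; no data dependency fixes the order they run in.
def fetch_name():  return "ada"
def fetch_age():   return 36
def fetch_email(): return "ada@example.com"

user = {"name": fetch_name(), "age": fetch_age(), "email": fetch_email()}
```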
What order are the fields fetched? If you answered "unspecified" then you're right! The compiler will happily parallelize both of these expressions!
But that's also true for imperative languages.
The problem is pretty easy to see in a C-style for loop:
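The loop itself is missing from the snapshot; here is a sketch of the kind of loop meant (transliterated into Python, with the C shape noted in a comment):

```python
# The C shape in question:  for (i = 0; i < n; i++) { ...may modify i... }
nums = [3, 4, 5, 6]
total = 0
i = 0
while i < len(nums):
    total += nums[i]
    # The body can mutate the index, so iteration k depends on every
    # earlier iteration -- no safe automatic parallelization here.
    i += 2 if nums[i] % 2 == 0 else 1
```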
The index variable depends on its value from previous iterations of the loop and can be modified in the loop body! Can't parallelize that!
Side effects and mutation break many of the promises provided by functional constructs. Hybrid languages can illustrate this:
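A small Python illustration (my own, standing in for the missing snippet) of a side effect undermining a functional construct:

```python
# map looks pure, but the mapped function smuggles in a side effect:
log = []
def double(x):
    log.append(x)        # mutation: evaluation order is now observable
    return x * 2

result = list(map(double, [1, 2, 3]))
```

A runtime that were free to parallelize `map` could fill `log` in any order; the hidden effect is what forces sequential, left-to-right semantics.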
Anybody discouraged from buying by the very limited hurdle of getting the book will completely fail at the far more substantial hurdle of understanding it.
Expecting everything for free, and creators giving in to that demand, shapes character. Systems which reward minimal effort, maximal demand, and zero reciprocity end up selecting for the worst traits in both readers and communities.