Lisp From Nothing, Second Edition
Key topics
The Lisp programming language is once again under the spotlight with the release of "Lisp from Nothing, Second Edition," sparking a lively discussion about its syntax and structure. Commenters weighed in on the perennial debate about Lisp's use of S-expressions, with some defending its elegance and others recalling attempts to replace it with alternative syntaxes, such as John McCarthy's proposed M-expressions. Interestingly, some contributors shared their own experiences implementing Lisp-like languages, with one noting that having a pre-existing notation made it surprisingly easy to reinvent Lisp for scripting a CRDT database. As the conversation unfolded, a consensus emerged that, despite efforts to change it, Lisp's unique syntax has a certain charm that makes it instantly understandable to some, while others appreciate the judicious use of additional syntax elements, like Racket's square brackets, to aid visual parsing.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: N/A
Peak period: 67 comments (Day 3)
Avg / period: 14.9 comments
Based on 104 loaded comments
Key moments
- Story posted: Aug 27, 2025 at 5:50 AM EDT (4 months ago)
- First comment: Aug 27, 2025 at 5:50 AM EDT (0s after posting)
- Peak activity: 67 comments in Day 3 (hottest window of the conversation)
- Latest activity: Sep 4, 2025 at 9:51 PM EDT (4 months ago)
For example, print change-dir make-dir; is equivalent to (print (change-dir (make-dir))) in the old money. I wonder if I am reinventing too much here.
Did LISPers try to get rid of the brackets in the past?
Probably the best example of a “Lisp without parentheses” is Dylan. Originally, Dylan was developed as a more traditional Lisp with sexprs, but they came up with a non-sexpr “surface syntax” before launching it to avoid scaring the public.
In general a sequence of expressions of which only the value of the last is used, like C's comma operator or the "implicit progn" of conventional cond and let bodies, is only useful for imperative programming where the non-last expressions are executed for their side effects.
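A minimal sketch of that "implicit progn" in Common Lisp (the body forms are placeholders): only the last expression's value is returned, so the earlier ones exist purely for their side effects.

    (let ((x 41))
      (format t "logging a side effect~%")  ; executed for effect, value discarded
      (+ x 1))                              ; value of the whole LET form => 42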
Clojure's HAMTs can support a wider range of operations efficiently, so Clojure code, in my limited experience, tends to be more purely applicative than code in most other Lisps.
Incidentally, a purely applicative finite map data structure I recently learned about (in December 02023) is the "hash trie" of Chris Wellons and NRK: https://nullprogram.com/blog/2023/09/30/. It is definitely less efficient than a hash table, but, in my tests so far, it's still about 100ns per hash lookup on my MicroPC and 250ns on my cellphone, compared to maybe 50ns or 100ns respectively for an imperative hash table without FP-persistence. It uses about twice as much space. This should make it a usable replacement for hash tables in many applications where either FP-persistence, probabilistically bounded insertion time, or lock-free concurrent access is required.
This "hash trie" is unrelated to Knuth's 01986 "hash trie" https://www.cs.tufts.edu/~nr/cs257/archive/don-knuth/pearls-..., and I think it's a greatly simplified HAMT, but I don't yet understand HAMTs well enough to be sure. Unlike HAMTs, it can also support in-place mutating access (and in fact my performance measurements above were using it).
______
† sometimes called "functional", though that can alternatively refer to programming with higher-order functions
Thanks.
In C/C++ most functions return error codes, forcing the latter form.
And then there are functional languages allowing: x -> h -> g -> f but I think the implicit parameter passing doesn’t sit well with a lot of programmers either.
More likely than not it's a matter of what a person gets used to. I've enjoyed working in Lisp/Scheme and C, but not so much in primarily functional languages. No doubt programmers have varied histories that explain their preferences.
As you imply, in C one could write nested functions as f (g (h (x))) if examining return values is unnecessary. OTOH in Lisp return values are also often needed, prompting use of (let ...) forms, etc., which can make function nesting unclear. In reality programming languages are all guilty of potential obscurity. We just develop a taste for what flavor of obscurity we prefer to work with.
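To make the contrast concrete, a small hypothetical Common Lisp sketch (F, G, H, and X are placeholders): plain nesting when intermediate values need no inspection, LET* when each return value has to be checked or reused.

    (f (g (h x)))               ; fine when the intermediate results don't matter

    (let* ((a (h x))            ; name each result so it can be examined
           (b (g a)))
      (when b                   ; e.g. bail out on a "failed" return value
        (f b)))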
These works are something I both understand and would never achieve myself. These are cultural artifacts, like deeply personal poetry, made purely for the process of it. Not practically useful, not state of the art, not research level, but... a personal journey?
If the author is reading this... can you share your vision? Motivation?
I do not usually talk much about "myself". I tried, but with no-one asking, I find it difficult to say anything.
Cage.
Thanks for An Introduction to Mental Development, I've thoroughly enjoyed it!
And thanks for the Cage quote. I enjoyed that, too!
As a fan of word plays, that got a genuine chortle out of me. Thank you.
EDIT: https://usesthis.com/interviews/nils.m.holm/
It is always interesting to spot a person on the interwebs who seems to actually have managed to turn buddhist or some other teachings into real world deeds. Living really modestly (IIRC, he/you also uses modest, underclocked laptops?), publishing for the benefit of many, and doing all this for years and years. Like, there seems to be no "overhead" in this way of living. Hugely inspirational.
I would also point out the "Essays" section on nmh's webpage, especially the ones discussing sensitivity and high IQ: https://t3x.org/#essays
Having purchased several of your books, thanks for your work, nmh!
Turning the Buddhist (or other) teachings into deeds is not too hard once you have understood who you are, and, maybe more importantly, who you are not. Figuring /that/ out can be tough and require a lot of practice.
What people perceive as modest is really an acceptance or even appreciation of what is. My apartment has not been renovated in decades, I repair what needs repair and otherwise leave things to themselves. I wear clothes until they disintegrate, and my hardware is already old when I buy it. This is the course of things. Things age and change and at some point disappear. Why prefer the new over the old? Why the old over the new? It is just that things and beings get old on their own, and it is much more joyful to witness this than trying to resist it.
I can't speak for the author but this is exactly how I look at the lisp I'm developing. It's a lifetime project. I had some kind of vision depicting how different things could be, and at some point I started trying to make it happen. I want to convince myself I'm not insane for thinking it was possible in the first place.
Maybe less embarrassing than talking about Rock the Cashbar by The Clash (though that one was corrected the first time I saw the back of the album).
doesn't seem to fit with:
"INTENDED AUDIENCE This is not an introduction to LISP."
on page 10.
Lisp from Nothing - https://news.ycombinator.com/item?id=24809293 - Oct 2020 (29 comments)
Lisp from Nothing - https://news.ycombinator.com/item?id=24798941 - Oct 2020 (5 comments)
The book is basically a modern and more complete version of the "Small C Handbook" of the 1980s. It goes through all the stages of compilation, including simple optimizations, but keeps complexity to a minimum. So if you just want to learn about compiler writing and see what a complete C compiler looks like under the hood, without investing too much into theory, then this is probably one of very few books that will deliver.
Edit: and then Warren Toomey has written "A Compiler Writing Journey" based on PCC, which may shed a bit more light on the book: https://github.com/DoctorWkt/acwj
I sure find them beautiful and all, but why do they take center stage so often? Beside the aesthetics and instructional value, I don't get the appeal. Also I feel that a bunch of the heavy lifting behind metacircular evaluators is actually done by the Polish notation syntax as well as the actual implementation, and these concepts don't get nearly as much love.
Any Lisper who can enlighten me?
And in a way it’s like Maxwell’s equations. A simple proof of computation that also somehow implements a very neat language.
I know this is a classic analogy, but now you've got me wondering: originally Maxwell wrote a messy pile of equations in scalars, and later someone (Gibbs?) gave them the familiar vector calculus form. Nowadays we have a marvellously general and terse form, like (using the differential of the Hodge dual in naturalised units) d⋆F = J, alongside dF = 0.
My question is, when are we going to get some super-compact unified representation of `eval`? [1]

[1] https://tromp.github.io/cl/cl.html
It was Oliver Heaviside (https://en.wikipedia.org/wiki/Oliver_Heaviside) that rewrote Maxwell's original equations (20 of them in differential form) into the notation used today (4 of them in vector calculus form).
Here's a nice comparison: https://ddcolrs.wordpress.com/2018/01/17/maxwells-equations-...
There's also a version of the metacircular interpreter written entirely in M-exprs, but that kinda breaks the spirit of things.
I think the version of eval that we have is already pretty terse for what it is. You could maybe code-golf it into something smaller, or you could code-golf it into something fully immutable.
My only gripe is that they all rely on an already existing reader that parses the expressions for you and represents them. Which is exactly what the book is about.
Finding a small enough interpretation that does ALL of it would be a dream, but I doubt it could be anywhere near as concise as the (modern) Maxwell equations.
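For readers wondering what "the version of eval that we have" roughly looks like, here is a compressed sketch of a McCarthy-style metacircular evaluator in Common Lisp. As noted above, it leans entirely on the host reader and on alist environments, and the names MEVAL and MAPPLY are mine for the example, not the book's.

    (defun meval (e env)
      (cond ((symbolp e) (cdr (assoc e env)))          ; variable lookup
            ((atom e) e)                               ; self-evaluating datum
            ((eq (car e) 'quote)  (cadr e))
            ((eq (car e) 'if)     (if (meval (cadr e) env)
                                      (meval (caddr e) env)
                                      (meval (cadddr e) env)))
            ((eq (car e) 'lambda) (list 'closure e env))
            (t (mapply (meval (car e) env)
                       (mapcar (lambda (a) (meval a env)) (cdr e))))))

    (defun mapply (f args)
      (if (and (consp f) (eq (car f) 'closure))
          (destructuring-bind (tag (lam params body) env) f
            (declare (ignore tag lam))
            (meval body (pairlis params args env)))    ; extend the environment
          (apply f args)))                             ; primitives supplied via env

Under those assumptions, (meval '((lambda (x) (if x 'yes 'no)) (quote t)) '()) returns YES.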
Where does one — who has no knowledge of these prerequisites or about LISP (except that the latter has been heard in programming circles as something esoteric, extremely powerful, etc.) — start, before reading this book?
But learning the basics of Lisp is more like a side effect; the focus is on program design.
Another source of awe is about Lisp being more of a programming system than a language, and Common Lisp was the standardization of a lot of efforts towards that by companies making large and industrial pieces of software like operating systems, word processors, and 3D graphics editors. At the language level, "compile", "compile-file", "disassemble", "trace", "break", "step" are all functions or macros available at runtime. When errors happen, if there's not an explicit handler for it (like an exception handler) then the default behavior isn't to crash but to trigger the built-in debugger. And the stack isn't unwound yet, you can inspect the local variables at every layer. (There's very good introspection in general for everything.) Various restarts will be offered at different parts of the stack -- for example, a value was unknown, so enter it now and continue. Or you can recompile your erroneous function and restart execution at one of the stack frames with the original arguments to try again. Or you can apt-get install some foreign dependency and try reloading it without having to redo any of the effort the program had already made along the way.
Again, all part of the language at runtime, not a suite of separate tools. Implementations may offer things beyond this too, like SBCL's code coverage or profiling features. All the features of the language are designed with this interactivity and redefinability in mind though -- if you redefine a class definition, existing objects will be updated, but you can control that more finely if you need to by first making a new update-instance-for-redefined-class method. (Methods aren't owned by classes, unlike other OOP languages, which I think eliminates a lot of the OOP design problems associated with those other languages.)
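A tiny illustration of the restart machinery described above (the function, message, and values are made up for the example): RESTART-CASE attaches a USE-VALUE restart to the error, the debugger offers it interactively, and HANDLER-BIND can invoke it programmatically.

    (defun lookup (key table)
      (or (gethash key table)
          (restart-case (error "Unknown key: ~S" key)
            (use-value (v)
              :report "Supply a replacement value and continue."
              v))))

    ;; Interactively this drops into the debugger with USE-VALUE on the restart menu;
    ;; non-interactively a handler can choose it:
    (handler-bind ((error (lambda (c)
                            (declare (ignore c))
                            (invoke-restart 'use-value 42))))
      (lookup 'missing (make-hash-table)))   ; => 42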
I like the book Successful Lisp as a tour of Common Lisp, it's got a suggested reading order in ch 2 for different skill levels: https://dept-info.labri.fr/~strandh/Teaching/MTP/Common/Davi... It's dated in parts as far as tooling goes but if you're mostly interested in reading about some bits rather than actively getting into programming with Lisp that's not so bad. If you do want to get into it, https://lispcookbook.github.io/cl-cookbook/ has some resources on getting started with a Lisp implementation and text editor (doesn't have to be emacs).
https://www.cs.cmu.edu/~dst/LispBook/book.pdf
I love it so much, and seeing your bibliography makes me feel like a kid in a candy store. The confluence of Asian philosophy and computing is delightful.
To put you in the correct headspace this Saturday morning: https://t3x.org/whoami.html
> • a list of the books referred to in a scholarly work, typically printed as an appendix.
> • a list of the books of a specific author or publisher, or on a specific subject. "a bibliography of his publications"
> • the history or systematic description of books, their authorship, printing, publication, editions, etc. "he regarded bibliography as a science"
In this context clearly the second meaning is meant (https://t3x.org/index.html#books); this also corresponds to meaning 2 at Wiktionary (https://en.wiktionary.org/w/index.php?title=bibliography&old...) or 2b at Merriam-Webster (https://www.merriam-webster.com/dictionary/bibliography).
> a list of works written by an author or printed by a publishing house “compiled a complete bibliography of John Donne”
Enjoy your stay!
Or just the possibility of doing syscalls to do something, which is more important than new syntax and sugar over basic instructions.
Our objectives might, and most probably will, be different.
(They are probably “useful” in the dissemination of what the real essence of computation can reduce to, in practical terms.)
Not everything needs to be useful in fact: certain things can be just enjoyed in their essence, just looked at and appreciated. A bit like… art?
I am implementing my own Scheme as well. Why? I don’t know, one needs to do things that serve no apparent purpose, sometimes.
And then, at least for the compiler books, there is: http://t3x.org/files/whichbook.pdf
Another valid question downvoted into oblivion.
The environment in (lexically scoped) LISP is an implementation detail. Lambda calculus does not need an environment, because variables are substituted on a sheet of paper. So lambda calculus equals lexically scoped LAMBDA in LISP.
Sure, you could view LISP as LC plus some extra functions (that are not easily implemented in LC).
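A small Common Lisp sketch of the point above: on paper the lambda-calculus term is reduced by substituting 3 for X, while a lexically scoped implementation instead closes over an environment recording X = 3 (the + is one of those "extra functions" not present in pure LC).

    (funcall ((lambda (x)             ; applying the outer LAMBDA captures X = 3
                (lambda (y) (+ x y)))
              3)
             4)                       ; => 7, as if (+ 3 y) had been substituted in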
(credit to https://aphyr.com/posts/340-reversing-the-technical-intervie..., I always get a kick out of that and the follow up https://aphyr.com/posts/341-hexing-the-technical-interview).
https://t3x.org/s9book/index.html