Compiling with Continuations
Posted 4 months ago · Active 3 months ago
swatson555.github.io · Tech · story
calm, mixed
Debate: 60/100
Key topics
Compilers
Continuations
Programming Languages
The article discusses the book 'Compiling with Continuations' and its relevance to compiler design, sparking a discussion on the book's historical context, technical merits, and practical applications.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 3d after posting
Peak period: 21 comments (Day 4)
Avg / period: 6.3
Comment distribution: 25 data points
Key moments
- Story posted: Sep 16, 2025 at 11:56 PM EDT (4 months ago)
- First comment: Sep 20, 2025 at 4:51 AM EDT (3d after posting)
- Peak activity: 21 comments in Day 4 (hottest window of the conversation)
- Latest activity: Sep 29, 2025 at 8:58 PM EDT (3 months ago)
ID: 45271493 · Type: story · Last synced: 11/20/2025, 12:41:39 PM
If nobody ever implemented it in C, it's questionable whether it ever happened at all.
Not exactly a common language...
I don't have a math or computer science background so the more academic publications are almost always quite unreadable to me. I learn a lot by exploring the source code of existing virtual machines instead, and most are written in systems languages.
Sometimes much smarter people than I randomly decide to write articles that democratize access to very complex areas of programming language development. Examples:
https://journal.stuffwithstuff.com/2013/12/08/babys-first-ga...
https://www.wingolog.org/archives/2010/02/26/guile-and-delim...
https://github.com/abeln/cc
Blogger just didn't look hard enough (or doesn't know that search on GitHub is crippled, for reasons)
For C you can look at the old Perl 6 VM, Parrot, which compiled to CPS until v2.7, when they destroyed CPS and went conventional and slow.
From the review (discussing "practical applications of continuations"):
> Was it influential to the field of computer science?
Call/cc was fairly popular in the Northwest of the US (maybe California too) among Lisp and Scheme people. Compiling them well on real machines was definitely a topic worth investigating at the time.
https://en.wikipedia.org/wiki/Call-with-current-continuation
Call/cc has since (thankfully) become a lot less popular.
I liked the book a lot when I read it (2-3 decades ago) but it didn't seem relevant in the slightest if you just didn't care about call/cc.
CPS is an intermediate form that a compiler can use to reason about a program. You could argue that using this form makes it easier to implement call/cc, but I don’t think that’s really true.
I vaguely recall that CPS makes it possible to reason about stack disciplined calls, if you want to compile to call/ret instructions.
But whatever, it really doesn’t matter. CPS is quite dead as an IR. SSA won.
It’s a continuation. You don’t need to express your program in continuation-passing style to use continuations, which is why call/cc exists.
The idea of a continuation is interesting in and of itself, independently of whether your compiler uses CPS, because the continuation as a concept is useful. It appears in effect systems, for example.
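A minimal sketch of that idea in plain OCaml (my own illustration, not from the comment): the continuation is just a function value that you can call, or deliberately drop, to decide where control goes next.

    (* The continuation [k] is an ordinary value; dropping it gives a
       call/cc-style early exit. Names here are made up for illustration. *)
    let product xs =
      let rec go xs k =
        match xs with
        | [] -> k 1
        | 0 :: _ -> 0                            (* discard k: skip all pending multiplies *)
        | x :: rest -> go rest (fun r -> k (x * r))
      in
      go xs (fun r -> r)

    let () =
      assert (product [2; 3; 4] = 24);
      assert (product [2; 0; 4] = 0)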
Appel's book is very good, by the way. It gives a very hands-on overview of how to implement a compiler in a functional style and neatly introduces some quite complex ideas. To me it's among the books you can't regret reading; it's also quite short and easy, which helps. It's a timeless classic, like Peyton Jones's The Implementation of Functional Programming Languages, which is a great introduction to the lambda calculus and a presentation of how to implement a type checker in Miranda.
The book shows how to turn a program into CPS, why you can, and how that enables compilation. That's interesting in and of itself as a way to conceptualise how a program's computation flows and what that means for the construction of functional programs.
It was never a popular way to write compilers, but academic books are not tutorials. That was never the point.
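To give a concrete feel for what "turning a program into CPS" looks like, here is a minimal one-pass CPS conversion for a toy lambda calculus, sketched in OCaml. The datatypes and names are my own illustration, not Appel's actual IR.

    (* Source language. *)
    type expr =
      | Int of int
      | Var of string
      | Lam of string * expr
      | App of expr * expr

    (* CPS target: every intermediate result is named, every call is a tail call. *)
    type value =
      | VInt of int
      | VVar of string
      | VLam of string * string * cps      (* fun x k -> body *)
      | VKont of string * cps              (* continuation: fun r -> body *)
    and cps =
      | Call of value * value * value      (* f arg k : never returns *)
      | Ret of value * value               (* k v : jump to the continuation *)

    let gensym =
      let n = ref 0 in
      fun base -> incr n; Printf.sprintf "%s%d" base !n

    (* [convert e k] converts [e]; the meta-level continuation [k] receives
       the value of [e] and builds the rest of the CPS term around it. *)
    let rec convert (e : expr) (k : value -> cps) : cps =
      match e with
      | Int n -> k (VInt n)
      | Var x -> k (VVar x)
      | Lam (x, body) ->
          let kv = gensym "k" in
          k (VLam (x, kv, convert body (fun v -> Ret (VVar kv, v))))
      | App (f, a) ->
          convert f (fun fv ->
            convert a (fun av ->
              let r = gensym "r" in
              Call (fv, av, VKont (r, k (VVar r)))))

    (* (fun x -> x) 42, converted with a top-level "halt" continuation. *)
    let _program =
      convert (App (Lam ("x", Var "x"), Int 42)) (fun v -> Ret (VVar "halt", v))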
https://bernsteinbear.com/assets/img/kelsey-ssa-cps.pdf
CPS is an AST form with first class functions that don’t return. CPS is highly opinionated about how control flow is represented and not very opinionated about how data flow is represented (maybe it’s just variable names you resolve, maybe it’s data flow).
SSA is a data flow graph with a way to express looping. SSA isn’t very opinionated about control flow and different SSA implementations do it differently.
That paper is just saying that you can transform between the two. So what. Of course you can; you can transform between any Turing complete languages and that doesn’t make them “not that different” and definitely not the same.
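To make the contrast concrete, here is a small OCaml sketch (mine, not from the paper) of a counting loop written continuation-style as local functions that only tail-call each other, next to pseudo-SSA for the same control flow.

    (* Pseudo-SSA for comparison (illustrative only):
         entry:  br loop(0)
         loop:   i = phi [0, entry], [i2, body]
                 c = i < 10
                 br c, body, exit
         body:   i2 = i + 1
                 br loop(i2)
         exit:   ret i
    *)
    let count_to_ten () =
      let rec loop i =                  (* block "loop"; the parameter i plays the phi *)
        if i < 10 then loop (i + 1)     (* back edge: a tail call that never returns here *)
        else exit_block i               (* branch to the exit block *)
      and exit_block i = i              (* block "exit" *)
      in
      loop 0                            (* entry: jump to loop with i = 0 *)

    let () = assert (count_to_ten () = 10)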
As far as functional IRs go, I would say SSA corresponds most directly to (a first-order restriction of) ANF w/ join points. The main difference being the use of dominance-based scoping rules, which is certainly more convenient than juggling binders when writing transformations. The first-order restriction isn't even essential, e.g. https://compilers.cs.uni-saarland.de/papers/lkh15_cgo.pdf.
If you're interested in an IR that can express continuations (or evaluation contexts) in a first-class way, a much better choice than CPS is an IR based on the sequent calculus. As I'm sure you know (since you work with one of the coauthors), this was first experimented with in a practical way in GHC (https://pauldownen.com/publications/scfp_ext.pdf), but there is a paper in this year's OOPSLA (https://dl.acm.org/doi/10.1145/3720507) that explores this more fundamentally, without the architectural constraint of being compatible with all of the other decisions already made in GHC. Of course, one could add dominance scoping etc. to make an extension of SSA with more symmetry between producers and consumers like the classical sequent calculus.
Delimited continuations are the state of the art now, aren't they?
On x86, the use of paired call/return is conflated with usage of the stack to store activation records. On AArch64, indirect branches can be marked with a hint bit indicating that they are a call or return, so branch prediction can work exactly the same with activation records elsewhere.
> Three Implementation Models for Scheme
This was a good read, thank you for the reference.
> call/cc which I find too powerful
Why do you think that? Do you have the same opinion about delimited continuations?
Alternatively, OP might have a bit of learning still to do.
I bought it because it demonstrated how to cps-transform code, which was my goal. I think it delivered.
For a "broader perspective", see the follow-up papers:
(...which TFA characterised as "parody")

FWIW, after I implemented CPS this way, I eventually switched to ANF for typing reasons. I'm a bit of a beginner at type inference and could not assign a meaningful type to the extra term ("continuation") that CPS produces (and which ANF does not).
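For contrast, a tiny OCaml sketch (illustrative datatypes only, not TFA's) of what ANF looks like: every intermediate result is let-bound to a name, and there is no separate continuation term left over to assign a type to.

    type atom =
      | AInt of int
      | AVar of string

    type prim =
      | Add of atom * atom
      | Call of atom * atom

    type anf =
      | Ret of atom                     (* tail position: return an atom *)
      | Let of string * prim * anf      (* bind one primitive step, then continue *)

    (* (1 + 2) + (3 + 4) in ANF: *)
    let _example =
      Let ("t1", Add (AInt 1, AInt 2),
      Let ("t2", Add (AInt 3, AInt 4),
      Let ("t3", Add (AVar "t1", AVar "t2"),
      Ret (AVar "t3"))))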
CPS is quite useful for abstract interpretation when doing flow-based type-inference systems since the continuation just becomes another value to track.
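A rough sketch of what "the continuation becomes another value to track" can mean: the abstract domain of a flow analysis over CPS terms simply gains a constructor for continuation closures next to the ordinary ones. Names and types below are my own illustration, not a real 0-CFA.

    module StrMap = Map.Make (String)

    (* Abstract values the analysis tracks; continuations are just one
       more kind of closure. *)
    type abs_value =
      | AnyInt                          (* some integer *)
      | Clo of string * string          (* user function: parameter, continuation parameter *)
      | Kont of string                  (* continuation closure: parameter *)

    (* Flow map: variable name -> abstract values that may reach it. *)
    type flow = abs_value list StrMap.t

    let add_flow x v (m : flow) : flow =
      let old = match StrMap.find_opt x m with Some vs -> vs | None -> [] in
      if List.mem v old then m else StrMap.add x (v :: old) m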