Dynamically Patch a Python Function's Source Code at Runtime
Posted 4 months ago · Active 4 months ago
Source: ericmjl.github.io · Tech story · High profile
Key topics
Python
Dynamic Code Modification
Self-Modifying Code
Lisp
The article discusses dynamically patching a Python function's source code at runtime, sparking a discussion on the use cases, benefits, and potential drawbacks of this technique.
Snapshot generated from the HN discussion
Discussion Activity
- Very active discussion; first comment after 23m
- Peak period: 53 comments in 0-6h
- Avg per period: 10.9
- Comment distribution: 76 data points (based on 76 loaded comments)
Key moments
1. Story posted: Aug 24, 2025 at 8:28 AM EDT (4 months ago)
2. First comment: Aug 24, 2025 at 8:52 AM EDT (23m after posting)
3. Peak activity: 53 comments in 0-6h (hottest window of the conversation)
4. Latest activity: Aug 26, 2025 at 1:07 PM EDT (4 months ago)
ID: 45003750 · Type: story · Last synced: 11/20/2025, 5:39:21 PM
Wasn't SMC one of the LISP-associated AI fields a few decades ago? iirc it's been mostly abandoned due to security issues, but some of it survives in dynamic compilation.
That's what Lisp is!
Once you see how cool that is, then you can begin to appreciate why Lisp was the de facto standard for AI programming all the way back in the 1960s!
So whenever you want, you can start using "normal code" for manipulating the "normal code" itself, and hopefully now we have yet another perspective on the same thing, for why Lisps are so awesome :)
[1] https://peps.python.org/pep-0750/
Say I want `what-code(1 + 1)` to receive `1 + 1` as the argument, not `2`; would either of those things let me do that? At a glance, and without diving deeper, the `ast` module could make it so `what-code("1 + 1")` would let the function parse the string and get an AST, which we could do stuff with, but it pretty much ends at that and is non-ideal for many reasons.
So what you could do is something like this:
Instead of inserting the addition as an exact string, producing "1 + 1 * 7" (which is 8), it would parse as an AST and insert the addition node as a unit, so that you get the intended result. You could also do an alpha conversion if the injected code declares variables, to avoid name clashes.

It wouldn't let you do true macros, but it would make codegen easier and less error-prone.
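The contrast described above can be sketched with the standard `ast` module. This is a hypothetical illustration, not code from the article: the string version suffers from operator precedence, while splicing the parsed node keeps the addition grouped as a unit.

```python
import ast

# String splicing: "1 + 1" dropped textually into "_ * 7"
spliced = "1 + 1" + " * 7"
print(eval(spliced))  # 8, because precedence swallows the addition

# AST splicing: insert the parsed addition as a single node
addition = ast.parse("1 + 1", mode="eval").body  # a BinOp(Add) node
tree = ast.Expression(
    body=ast.BinOp(left=addition, op=ast.Mult(), right=ast.Constant(7))
)
ast.fix_missing_locations(tree)  # fill in line/col info for compile()
print(eval(compile(tree, "<ast>", "eval")))  # 14, grouping preserved
```

The node version behaves like inserting "(1 + 1) * 7", which is what the writer of the injected fragment presumably intended.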
That's not how I understand homoiconicity; "everything is a list" or not has nothing to do with it. The point is that the structure the compiler uses to understand the code (speaking slightly loosely) is the same structure you write, since code is represented as data. Whether it's maps, lists, booleans or whatever isn't the important takeaway.
In your example, how would I, after the multiplication line but before the result line, manipulate "1 + 1" to read something else like "2 + 1"? In Clojure, for example, it would be (+ 1 1), so if you want to change 1 to 2, you use the typical "replace item in list by index" function, but with Python you're either stuck with AST or "code-as-strings". I think that's where a lot of the complexity and error-proneness gets in.
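For comparison, here is a hypothetical sketch of what that one-literal swap takes on the Python side using `ast.NodeTransformer` (the class name `SwapFirstOne` is invented for illustration):

```python
import ast

class SwapFirstOne(ast.NodeTransformer):
    """Replace the first literal 1 with 2, the analogue of
    "replace item in list by index" on an S-expression."""
    def __init__(self):
        self.done = False

    def visit_Constant(self, node):
        if not self.done and node.value == 1:
            self.done = True
            return ast.copy_location(ast.Constant(2), node)
        return node

tree = SwapFirstOne().visit(ast.parse("1 + 1", mode="eval"))
ast.fix_missing_locations(tree)
print(eval(compile(tree, "<ast>", "eval")))  # 3, i.e. 2 + 1
```

A whole visitor class plus a compile step, versus indexing into a list: that size difference is roughly the complexity gap the comment is pointing at.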
> ITA Software by Google Airfare search engine and airline scheduling software. Cambridge, MA. Common Lisp is used for the core flight search engine. The larger Flights project is roughly equal parts CL, C++, and Java.
Read the last sentence AND this company got acquired by Google like 15 years ago. So ya my question still stands.
> So ya my question still stands.
That list has 100 companies using lisp today. Were you actually asking if any new companies write in Lisp? Cuz those exist as well - in the same list...
Not sure you know what your own question is!
https://en.m.wikipedia.org/wiki/Wikipedia:Spot_checking_sour...
But that aside, if you want a fresh look at what people are thinking about with Lisp, maybe check out the talks that were given this year at the 2025 European Lisp Symposium [1,2]. Or perhaps look at how someone shipped a platformer game on Steam with Common Lisp [3,4], and is in the final stretch of porting it to the Nintendo Switch [5].
I realize, though, that this kind of "debate" (?) is never satisfying to the instigator. If it does satisfy though, I will agree with you that—despite all of the claims of alleged productivity and power the language offers—Common Lisp remains far less popular than Python, which I assume is your only real point here.
[1] A presentation about how adding a static type system to Common Lisp à la Haskell helps write mission critical programs in defense and quantum computing: https://youtu.be/of92m4XNgrM
[2] A talk from employees of Keepit, a company that supplies a SaaS backup service, discussing how they train people on Common Lisp when employing them: https://youtu.be/UCxy1tvsjMs?t=66m51s
[3] Discusses technical details of how Lisp was used to implement a game that was actually shipped: https://reader.tymoon.eu/article/413
[4] The actual game that you can buy: https://store.steampowered.com/app/1261430/Kandria/ (This is not intended to be an advertisement and I'm unaffiliated. It's just a demonstration of a recently "shipped" product written in Common Lisp where you might not expect it.)
[5] Technical discussion of the Nintendo Switch port: https://youtu.be/kiMmo0yWGKI?t=113m20s
Such a wide net though, what exactly constitutes a "remotely useful project" in your mind? Maybe if we figure out what the exact requirements are, we'll be able to help you with your search :)
It definitely can, no doubt about it. But used sparingly and only when there is no other way, it can help you remove an enormous amount of boilerplate and other things, in a relatively simple, fast and safe way. In my codebases it does lead to a lot less code, even though most projects have 2 or 3 macros at most.
Just as one basic example that comes to mind just because I had to do it today: imagine you have a testing suite. When some assertion fails, you'd like to display what value was expected, what value it actually got, and what the exact code was. In JavaScript, I think the most you'd be able to get without involving 3rd party compilers, reading source code from disk or whatnot, would be some functions name (`myfn.toString()`), while in Clojure your macro could capture the entire source code within it, and print it, trivially.
Basically, if you want a function that can take the arguments without evaluating them before executing it, you can do so with macros but without macros you cannot do that. Personally, being able to do so leads to me finding simpler solutions, and expressing them in better ways, compared to if I didn't have them available.
We can see from OP that it's actually quite annoying to specify code in a non-S-expression language (or really any language lacking a meta-syntax), usually requiring either
- stuffing and interpolating strings
- building ASTs with API functions provided by the implementation in a tiresome or verbose manner
But you're right that there are more aspects to hot-reloading than just the syntax and data structure.
Common Lisp gets away with it because it actually defines the semantics of redefining functions and classes. For instance, the standard says what will happen to all existing FOO object instances in memory if I change one class definition to another, and it even lets the programmer customize the behavior of such a redefinition.

Few if any other languages go to the trouble of actually defining these semantics, especially in the context of a compiled language (like Common Lisp), which leaves all but the simplest instances of code reloading ill-defined.
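To see what "undefined redefinition semantics" looks like in practice, here is a small Python sketch (an illustration I'm adding, not from the thread): rebinding a class name creates a brand-new class object, and existing instances are simply left behind, with no standard hook comparable to Common Lisp's instance-update protocol.

```python
# Redefining a class in Python leaves existing instances attached to the
# old class object; there is no defined "update existing instances" step.
class Foo:
    def __init__(self):
        self.x = 1

old = Foo()

class Foo:  # redefinition: this is a brand-new class object
    def __init__(self):
        self.x = 1
        self.y = 2

new = Foo()
print(type(old) is type(new))  # False: old instance is orphaned
print(hasattr(old, "y"))       # False: it never ran the new __init__
```

Hot-reload tools have to paper over exactly this gap by hand, which is the point being made about Common Lisp defining it in the standard instead.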
In Python in relation to ast, it does seem so yeah.
If you add two numbers in Python code, it looks like `1 + 1`, but if you use the module from `Lib/ast.py` linked above, what would it look like? I think it would be something like `Expression(body=BinOp(left=Constant(value=1), op=Add(), right=Constant(value=1)))`, which at a glance certainly looks different from `1 + 1` in my eyes :)
In lisps, `(+ 1 1)` is just `(+ 1 1)` regardless if it's source code or AST, they're just the same thing.
Dynamic metaprogramming is flexible but dangerous. Python is also "dynamic", meaning that code can be changed at runtime instead of only being able to accidentally pass null function pointers.
Python's metaclasses function similarly to Lisp's macros but in a consistent way: most Python code uses the standard metaclasses so that the macros don't vary from codebase to codebase. The Django "magic removal" story was about eliminating surprise and non-normative metaclassery, for example.
Does this tool monkey patch all copies of a function, or just the current reference? There are many existing monkey patching libraries with tests.
Someone explain to me, an old, aging programmer old enough to know UML, why this isn't some (we presume very young) person who has no idea how to write OOP coming up with some horribly convoluted way to do something routine?
Oh don't worry... they still cram that down our throats in CS undergrad in one of the courses.... Forget which one. I did my UG from 2020-24
;)
I also use it in a multiple dispatch library (https://github.com/breuleux/ovld) to replace the entry point by specialized dispatch code in order to cut some overhead.
It's fun.
Also, why is every damn post these days somehow framed in an AI context? It's exhausting.
Every 5/10-year segment of my life has somehow had one or two "This is the future"-hypes running concurrently with my life. Previously it was X, now it's Y. And most of the times, everything else is somehow connected to this currently hyped subject, no matter if it's related or not.
The only thing I've found to be helpful is thinking about and changing my perspective and framing about it. I read some article like this which is just tangentially related to AI, but the meat is about something else. So mentally I just ignore the other parts, and frame it in some other way in my head.
Suddenly people can write their articles with attachments to the hyped subject but I don't mind, I'm reading it for other purposes and get other takeaways that are still helpful. A tiny Jedi mind-trick for avoiding that exhaustion :)
Some of it gets really annoying on the business side, because companies like Gartner jump on the trends, and they have enough influence that businesses have to pay attention. When serverless was a thing, every cloud provider effectively had to add serverless things even if it made zero sense and no customers were asking for it, simply to satisfy Gartner (and their ilk) and be seen as innovating and ahead of the curve. Same thing happened with block chain, and is currently happening with AI.
And yet, it took us (humans) a long time to turn the wheel around and use it for transportation, so maybe we need to reinvent even more things, turn them around and such :)
(idk if this author is “junior” per se, mostly just agreeing the shift in perspective is helpful to not get burnt out by things like this)
Some minor details: you currently aren't updating functions if their freevars have changed. You can actually do that by using the C API to update `__closure__`, which is a read-only attribute from Python:

Also, I think you should update the `__annotations__`, `__type_params__`, `__doc__`, and `__dict__` attributes for the function.

Rather than using gc.get_referrers, I just maintain a set for each function containing all the old versions (using weakref so they go away if that old version isn't still referenced by anything). Then when a function updates I don't need to find all references: all references will be to some old version of the function, so I just update that set of old functions, and all references will be using the new code. I took this from IPython autoreload. I think it is both more efficient than gc.get_referrers, and more complete, as it solves the issue of references "decorated or stashed in some data structure that Jurigged does not understand". The code for that is here: https://codeberg.org/sczi/swanky-python/src/commit/365702a6c...
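The weakref-registry approach described above can be sketched roughly like this. This is my own simplified reconstruction (the names `register` and `hot_swap` are invented; the real implementation is in the linked swanky-python source):

```python
import weakref

# For each (module, qualname), keep weak references to every version of
# a function, so a reload can patch all of them in place.
_versions: dict[tuple, weakref.WeakSet] = {}

def register(func):
    key = (func.__module__, func.__qualname__)
    _versions.setdefault(key, weakref.WeakSet()).add(func)
    return func

def hot_swap(new_func):
    key = (new_func.__module__, new_func.__qualname__)
    for old in _versions.get(key, ()):
        old.__code__ = new_func.__code__          # patch code in place
        old.__defaults__ = new_func.__defaults__  # keep defaults in sync
        old.__dict__.update(new_func.__dict__)    # and function attributes
    register(new_func)

@register
def greet():
    return "hi"

ref = greet  # a reference stashed somewhere in the program

def greet_v2():
    return "hello"
greet_v2.__qualname__ = "greet"  # simulate a reload of the same function

hot_swap(greet_v2)
print(ref())  # "hello": the stashed reference now runs the new code
```

Because every stale reference points at *some* registered version, patching the versions covers all references without a `gc.get_referrers` scan. Note that assigning `__code__` only works when the old and new code objects have the same number of free variables, which is exactly the `__closure__`/C API caveat raised above.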
Hot reload for Python is quite tricky to fully get right. I'm still missing plenty of parts that I know about and plan on implementing, and surely plenty more that I don't even know about. If you or anyone else who's worked on hot reload in Python wants to talk about it, I'm happy to; just reach out, my email is visible on Codeberg if you're signed in.
I'm not sure what you mean by "maintaining a set of old versions". It's possible I missed something obvious, but the issue here is that I have the code objects (I can snag them from module evaluation using an import hook)... but I do not have the function objects. I never had them in the first place. Take this very silly and very horrible example:
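The example itself isn't reproduced in this snapshot; a hypothetical reconstruction of the shape being described (a dict accumulating distinct closures over one inner function) might look like:

```python
# Hypothetical sketch: each call creates a distinct closure object over
# the same inner function, stashed in a dict the reloader never sees.
adders = {}

def register_adder(n):
    def add(x):
        return x + n   # editing this line should update every stored closure
    adders[n] = add

for n in range(3):
    register_adder(n)

print(adders[2](10))  # 12
```

Each `adders[n]` is a separate function object with its own `__code__` and `__closure__`, which is why updating the source of `add` requires finding every one of them.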
The adders dictionary is dynamically updated with new closures. Each is a distinct function object with a __code__ field. When I update the inner function, I want all of these closures to be updated. Jurigged is able to do it -- it can root them out using get_referrers. I don't see how else to do it. I quickly tested in a Jupyter notebook, and it didn't work: new closures have the new code, but the old ones are not updated.

Yes, mine doesn't handle that; it is the same as Jupyter there. Smalltalk is supposed to be best at interactive development, I wonder if it will update the old closures. I don't know it to try, but I do know Common Lisp, which is also supposed to be quite good, and fwiw it behaves the same: new closures have the new code, but the old ones are not updated:
Here's another fun complication: what if I have a decorator that performs a code transform on the source code of the decorated function? If the code is changed, I would like to automatically re-run the transform on the new code. I made a (kind of awkward) protocol for this: https://github.com/breuleux/jurigged?tab=readme-ov-file#cust.... It's a little confusing even for me, so I should probably review it.
For example in your elephant:main.py test, in swanky python I run do(3): ['Paint 3 canvasses', 'Sing 3 songs', 'Dance for 3 hours']
change songs to songz, and now do(3) is: ['Paint 3 canvasses', 'Sing 3 songs', 'Dance for 3 hours', 'Sing 3 songz']
Rather than changing the earlier songs to songz as jurigged manages to. But any lisp environment would behave the same, we don't have the idea of:
> 3. When a file is modified, re-parse it into a set of definitions and match them against the original, yielding a set of changes, additions and deletions.
We are just evaling functions or whatever sections of code you say to eval, not parsing files and seeing what was modified. So in some cases we might need to make a separate unregister function and call that on the old one. Like in emacs if you use advice-add (adds before, after, and other kinds of hooks to a function), you can't just change the lines adding an advice and save the file to have it modify the old advice, you need to explicitly call advice-remove to unset the old advice, then advice-add with your new advice, if you want to modify it while running without restarting.
When I eval a function again I am evaling all decorators again, in your readme you write the downsides of that:
> %autoreload will properly re-execute changed decorators, but these decorators will return new objects, so if a module imports an already decorated function, it won't update to the new version.
But I think I am handling that, or maybe you have other cases in mind I am missing? ie in a.py:
Then in b.py, from a import reload_me, change reload_me in a and slime-eval-defun it, and b is using the new version of reload_me. Basically, for all (__module__, __qualname__) I am storing the function object, and all old versions of the function object. Then when there is a new function object with that name, I update the code, closure and other attributes for all the old function objects to be the same as the new one.

I'll look into maybe just integrating Jurigged for providing the reloading within swanky-python. I was using the IPython autoreload extension at first, but ran into various problems with it, so ended up doing something custom, still mostly based on IPython, which is working for me in practice for now. So as long as I don't run into problems with it I'll focus on the many other parts of swanky-python that need work, but sooner or later, when I inevitably run into reloading problems, I'll evaluate whether to just switch reloading to use Jurigged.
yea i've been meaning to do this for a while as well...
Though the author says they wrote it as a joke and that it is probably not possible to do robustly in pure Python, I assume it can be done robustly as a patch to CPython, or possibly even as just a native C extension that gets loaded without people needing a patched build of CPython. If you know any good resources or information about how to approach this, or start working on it yourself, let me know.
I do wish there were callbacks I could subscribe to that would notify me whenever my file changed or whenever any code changed, so I could re-run some init.
My other feature request would be a way to replace function implementations even when they are currently on the stack, as some other hot reload implementations can. But I certainly understand why this would be difficult.
It’s even in the real world now - most of my conversations with people in tech end up at AI eventually.
It kind of reminds me of the 2010s when non-tech people would ask me about crypto at social events.
Thank you a bazillion for making it. It works quietly in the background without fuss, and I'm grateful for it every time I use it.
For example, there's java.lang.reflect.Proxy [1] and System.Reflection.Emit [2].
[1]: https://docs.oracle.com/javase/8/docs/technotes/guides/refle...
[2]: https://learn.microsoft.com/en-us/dotnet/api/system.reflecti...
I once inherited a nice library which made heavy use of such compile/exec pairs. The code looked very convoluted and was very bug prone. I replaced all such pairs with lambda wrappers. This was much more readable: my colleagues who were also working on this code were like "oh is that what that was meant to be doing" after this change.
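The refactor being described can be sketched like this (a made-up minimal example, not the inherited library's actual code): the `compile`/`exec` pair builds a function out of a string, while the replacement is an ordinary closure.

```python
# Before: codegen via compile/exec -- the reader has to mentally run the
# string to figure out what function is being built.
ns = {}
exec(compile("def scale(x): return x * 3", "<generated>", "exec"), ns)
scale_exec = ns["scale"]

# After: the same behavior as a plain closure ("lambda wrapper") --
# the intent is visible directly in the source.
def make_scaler(factor):
    return lambda x: x * factor

scale_lambda = make_scaler(3)

print(scale_exec(4), scale_lambda(4))  # 12 12
```

The closure version also survives refactoring tools, linters, and debuggers, none of which can see inside the string handed to `exec`.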
Just cause you can do something doesn't mean you should. I send thoughts and prayers for the people debugging programs where this is in place.
Next step, here's how to load modules, resolve a dependency. Handle capabilities and dynamically inject more functionality 'live'.
Patching running machine code in memory for compiled objects is the same; you just need to work around the abstraction introduced by languages trying to make the whole stack human-parseable.
https://en.m.wikipedia.org/wiki/Monkey_patch
setattr(mod, name, new_func)
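A self-contained sketch of that one-liner and its classic limitation (my own example, using a throwaway module object): rebinding the module attribute updates lookups through the module, but not references grabbed earlier.

```python
import types

# A stand-in module so the example is self-contained.
mod = types.ModuleType("fake_module")

def old():
    return "old"
mod.f = old

direct_ref = mod.f  # someone imported the function directly

def new_func():
    return "new"
setattr(mod, "f", new_func)  # the monkey patch

print(mod.f())       # "new": lookups through the module see the patch
print(direct_ref())  # "old": stale direct reference, the classic pitfall
```

This stale-reference problem is exactly what the reloading tools discussed above (gc.get_referrers scans, weakref version registries) exist to solve.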
https://github.com/eidorb/ubank/blob/master/soft_webauthn_pa...
Creating a debugger for an LLM (not a human) is something I haven't really seen, but seems super useful...?
https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule