Go Away Python
Key topics
The debate around ditching Python for Go gained traction with a post titled "Go away Python," sparking a lively discussion on the merits of using Go for scripting tasks. Some commenters, like throw-12-16, were inspired to port their utilities to Go, while others, such as llmslave2, shared their experiences using Go for building full-stack JavaScript applications. The conversation took a humorous turn with ahartmetz's witty remarks about the scope of "full stack" development, and the thread devolved into a playful exchange between ahartmetz and other commenters, with dangoodmanUT even spotting a "Dwight Schrute" vibe. Meanwhile, others, like age123456gpg, steered the conversation back on track by sharing the official Go stance on supporting interpreter mode.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 2h after posting
- Peak period: 141 comments (Day 1)
- Avg / period: 32 comments
Based on 160 loaded comments
Key moments
1. Story posted: Dec 30, 2025 at 3:50 AM EST (11 days ago)
2. First comment: Dec 30, 2025 at 5:26 AM EST (2h after posting)
3. Peak activity: 141 comments in Day 1, the hottest window of the conversation
4. Latest activity: Jan 7, 2026 at 5:41 PM EST (1d ago)
That being said...use Go for scripting. It's fantastic. If you don't need any third party libraries this approach seems really clean.
Device drivers, task switching, filesystem, memory management and all?
I make computers do things, but I never act like my stuff is the only stuff that makes things happen. There is a huge software stack of which my work is just the final pieces.
And it's okay. It doesn't mean it should be this way for everyone else.
It is pretty common (and has been for at least two decades) for web devs to differentiate like so: backend, frontend, or both. That "both" is almost always called "full stack".
When people say this they just mean they do both parts of a web app and have no ill will or neglect towards systems programmers or engineers working on a power plant.
The term “full stack” works fine within its usual context, but when viewed more broadly, it becomes misleading and, in my opinion, problematic.
But it is already established in the industry, and fighting it is unlikely to yield any positive outcomes.
Something like `//usr/bin/gcc -o main "$0"; ./main "$@"; exit`
The main reason was to do all this without any dependencies beyond a C compiler and some POSIX standard library.
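A minimal runnable sketch of that trick, assuming a POSIX shell and gcc on the path (the file name `hello.c` is hypothetical):

```c
//usr/bin/gcc -o "${0%.c}" "$0" && exec "${0%.c}" "$@"; exit $?
/*
 * Executed directly (./hello.c), the file has no shebang, so the shell runs
 * it as a script: the first line compiles this very file ($0) and execs the
 * resulting binary with the original arguments, so the shell never reads the
 * C code below. To gcc, that same first line is just a // comment.
 */
#include <stdio.h>

int main(int argc, char **argv)
{
    printf("compiled and run with %d argument(s)\n", argc - 1);
    return 0;
}
```

Make it executable with `chmod +x hello.c`, then run `./hello.c args...` like any script.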
If you've never used Clojure and start a Clojure project, you will almost definitely find advice telling you to use Leiningen.
For Python, if you search online you might find someone saying to use uv, but also potentially venv, poetry, or hatch. I definitely think uv is taking over, but it's not yet ubiquitous.
Ironically, I actually had a similar thing installing Go the other day. I'd never used Go before, and installed it using apt only to find that version was too old and I'd done it wrong.
Although in that case, it was a much quicker resolution than I think anyone fighting with virtual environments would have.
Over the years, I've used setup.py, pip, pipenv (which kept crashing, though it was an official recommendation), and manual venv+pip (or virtualenv? I vaguely remember there were two similar tools and neither was part of a minimal Python install). Does uv work in all of these cases? The uv doc pointed out by the GP is vague about legacy projects, though I've only skimmed the long page.
IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times. I've also seen projects with incomplete dependencies (installed through Conda, IIRC) which were a major pain to get working. For many years, the only simple and sane way to run some Python code was in a Docker image, which has its own drawbacks.
Yes. The goal of uv is to defuck the python ecosystem and they're doing a very good job at it so far.
Should we port npm “back” to node js?
But I don't see this happening in Python.
Yes, we should bring the package manager back, if it is so awesome and solves a real problem.
You keep using words like "we" and "us" so I assume you'll be kicking off writing the PEP to make this happen?
I only work a little bit with python.
I get that installing to the site-packages is a security vulnerability. Installing to my home directory is not, so why can't that be the happy path by default? Debian used to make this easy with the dist-packages split, leaving site-packages as a safe sandbox, but they caved.
The brilliant part about venvs is that A and B can have their completely separate mutually incompatible environments.
The thing that makes Python different is that it was never designed with any kind of per-project isolation in mind and this is the best way anyone's come up with to hack that behaviour into the language.
uv has replaced that for me, and has replaced most other tools that I used with the (tiny amount of) Python that I write for production.
1. pip isn't entirely to blame for all of Python's bad package management - distutils & setuptools gave us the awful PEP 518 - but either way, uv does away with that in favour of a modern, consistent, declarative, parseable PEP 508-based manifest spec, along with their own well-designed lockfile (there was no accepted lockfile PEP at the time uv was created - since PEP 751 was accepted, uv has added support, though that PEP is still limited so there's more work to do here).
2. pyenv works fine but uv is faster & adds some nice extra features with uvx
3. venv has always been a pain - ensuring you're always in the right venv, shell support, etc. uv handles this invisibly & automatically - because it's one tool you don't need to worry about running pip in the right venv or whatever.
Sometimes it's things like updating to Fedora 43 and every tool you installed with `pipx` breaking because it was doing things that got wiped out by the system upgrade, sometimes it's `poetry update --only dep1` silently updating dep2 in the background without telling you because there was an update available and even though you specified `--only` you were wrong to do that and Poetry knows best.
Did you know that when you call `python -m venv` you should always pass `--upgrade-deps`, because otherwise it intentionally installs an out-of-date version of pip and setuptools as a joke? Maybe you're not using `python -m venv` because you ran the pyenv installer and it automatically installed `pyenv-virtualenv` without asking, which overrides a bunch of virtualenv features because the pyenv team think you should develop things the same way they do, regardless of how you want to develop things. I hate pyenv.
So far the only problem I've had with `uv` is that if you run `uv venv` it doesn't install pip in the created virtualenv because you're supposed to run `uv pip install` instead of `pip install`. That's annoying but it's not a dealbreaker.
Outside of that, I feel very confident that I could give a link to the uv docs to a junior developer and tell them to run `uv python install 3.13` and `uv tool install ruff` and then run `uv sync` in a project and everything will work out and I'm not going to have to help them recover their hard drive because they made the foolish mistake of assuming that `brew install python` wouldn't wreck their macbook when the next version of Python gets released.
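For what it's worth, that junior-developer workflow, plus the `uv venv`/`uv pip` pairing from the comment above, looks something like this (`requests` is just a placeholder package):

```sh
uv python install 3.13    # a managed CPython, independent of the OS python
uv tool install ruff      # a CLI tool in its own isolated environment
uv venv                   # create .venv (note: no pip inside it)
uv pip install requests   # install into .venv with uv's own installer
uv sync                   # in a project: make .venv match the lockfile
```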
One of the neatest features of uv is that it uses clever symlinking tricks so if you have a dozen different Python environments all with the same dependency there's only one copy of that dependency on disk.
For pip to do this, first it would have to organize its cache in a sensible manner, such that it could work as an actual download cache. Currently it is an HTTP cache (except for locally-built wheels), where it uses a vendored third-party library to simulate the connection to files.pythonhosted.org (in the common PyPI case). But it still needs to connect to pypi.org to figure out the URI that the third-party library will simulate accessing.
Before uv came along I was starting to write stuff in Go that I’d normally write in Python.
Python's always been a pretty nice language to work in, and uv makes it one of the most pleasant to deal with.
It's just so useful: uv is great and there are decent quality packages for everything imaginable.
Unlike something like Rust, which has far fewer users (though growing) and requires PhDs in Compiler Imprecation and Lexical Exegetics.
Or C++ which has a much larger installed base but also no standard distribution method at all, and an honorary degree in Dorsal Artillery.
This is sort of like saying "You might find someone saying to drive a Ford, but also potentially internal combustion engine, Nissan or Hyundai".
Pip is slow, far slower than it needs to be in almost everything that it does, regardless of being written in Python. It's "standard" but not part of the standard library (so that it can be developed independently), and was never designed to install cross-environment properly (the current best approach, since 22.3, is a hack that incurs a significant delay and expects everyone to move in lock-step with the CPython EOL schedule). It wastes disk space, both by re-copying packages into new environments (rather than hard-linking them as uv does) and by spawning copies of itself in those environments (the original work-around to avoid needing cross-environment installation support, which a few people have also come to rely on in other ways).
> If it is so good, ok, i get it, it might be - and all that good stuff needs to go back to python as python.
I like these threads because they encourage me to work on my main project.
But with much more detail: it seems complicated because
* People refuse to learn basic concepts that are readily explained by many sources; e.g. https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... [0].
* People cling to memories of long-obsolete issues. When people point to XKCD 1987 they overlook that Python 2.x has been EOL for almost six years (and 3.6 for over four, but whatever)[1]; only Mac users have to worry about "homebrew" (which I understand was directly interfering with stuff back in the day) or "framework builds" of Python; easy_install is similarly a long-deprecated dinosaur that you also would never need once you have pip set up; and fewer and fewer people actually need Anaconda for anything[2][3].
* There is never just one way to do it, depending on your understanding of "do". Everyone will always imagine that the underlying functionality can be wrapped in a more user-friendly way, and they will have multiple incompatible ideas about what is the most user-friendly.
But there is one obvious "way to do it", which is to set up the virtual environment and then launch the virtual environment's Python executable. Literally everything else is window dressing on top of that. The only thing that "activating" the environment does is configure environment variables so that `python` means the virtual environment's Python executable. All your various alternative tools are just presenting different ways to ensure that you run the correct Python (under the assumption that you don't want to remember a path to it, I guess) and to bundle up the virtual environment creation with some other development task.
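A minimal sketch of that one obvious way, with no wrapper tooling at all (`requests` and `my_script.py` are placeholders):

```sh
python -m venv .venv                      # create the environment
.venv/bin/python -m pip install requests  # install into it; no activation needed
.venv/bin/python my_script.py             # run using the venv's own interpreter
```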
The Python community did explicitly provide for multiple people to provide such wrappers. This was not by providing the "15th competing standard". It was by providing the standard (really a set of standards designed to work together: the virtual environment support in the standard library, the PEPs describing `pyproject.toml`, and so on), which replaced a Wild West (where Setuptools was the sheriff and pip its deputy).
[0]: By the way, this is by someone who doesn't like virtual environments and was one of the biggest backers of PEP 582.
[1]: Of course, this is not Randall Munroe's fault. The comic dates to 2018, right in the middle of the period where the community was trying to sort things out and figure out how to not require the often problematic `setup.py` configuration for every project including pure-Python ones.
[2]: The SciPy stack has been installable from wheels for almost everyone for quite some time and they were even able to get 3.12 wheels out promptly despite being hamstrung by the standard library `distutils` removal.
[3]: Those who do need it, meanwhile, can generally live within that environment entirely.
The way I teach, I would start there; then you always have it as a fallback, and understand the system better.
I generally sort users into aspirants who really should learn those things (and will benefit from it), vs. complete end users who just want the code to run (for whom the developer should be expected to provide, if they expect to gain such a following).
I think this properly kicked off with RVM, which needed to come into existence because you had this situation where the Ruby interpreter was going through incompatible changes, the versions on popular distributions were lagging, and Rails, the main reason people were turning to Ruby, was relatively militant about which interpreter versions it would support. Also, building the interpreter such that it would successfully run Rails wasn't trivial. Not that hard, but enough that a convenience wrapper mattered. So you had a whole generation of web devs coming up in an environment where the core language wasn't the first touchpoint, and there wasn't an assumption that you could (or should) rely on what you could apt-get install on the base OS.
This is broadly an extremely good thing.
But the critical thing that RVM did was that it broke the circular dependency at the core of the problem: it didn't itself depend on having a working ruby interpreter. Prior to that you could observe a sort of sniffiness about tools for a language which weren't implemented in that language, but RVM solved enough of the pain that it barged straight past that.
Then you had similar tools popping up in other languages - nvm and leiningen are the first that spring to mind, but I'd also throw (for instance) asdf into the mix here - where the executable that you call to set up your environment has a '#!/bin/bash' shebang line.
Go has sidestepped most of this because of three things: 1) rigorous backwards compatibility; 2) the simplest possible installation onramp; 3) being timed with the above timeline so that having a pre-existing `go` binary provided by your OS is unlikely unless you install it yourself. And none of those are true of Python. The backwards compatibility breaks in this period are legendary, you almost always do have a pre-existing Python to confuse things, and installing a new python without breaking that pre-existing Python, which your OS itself depends on, is a risk. Add to that the sniffiness I mentioned (which you can still see today on `uv` threads) and you've got a situation where Python is catching up to what other languages managed a decade ago.
Again.
^mostly, some defs might have StackOverflow copy/pasta
I'm sure the documentation of this featureset highlights what I'm about to say, but if you're attracted to the simplicity of writing Python projects that are initialized using this method: do not use this code in staging/prod.
If you don't see why this is not production friendly, the simple reason is that when you create deployable artifacts packaging a project (or a dependency of a project) that uses this method, reproducible builds become impossible.
This will also lead to builds that pass your CI but fail to run in their destination environment, and vice versa, because they download their dependencies on the fly.
There may be workarounds and I know nothing of this feature so investigate yourself if you must.
My two cents.
(this is a joke btw)
On a serious note, it's brilliant that something like this is now possible when you think about it. It's maddening to think through the whole process, but in the end you get a system / Linux ISO whose hash you can trust and independently verify, and then you can use it and spread it around the world. It definitely makes me feel the sky's the limit, or at least it's very pleasant to think about.
You can find it via `uv cache dir`
See: https://docs.astral.sh/uv/reference/cli/#uv-cache-dir
[1] https://paulw.tokyo/standalone-python-script-with-uv/
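For anyone who hasn't seen the feature behind [1]: a script can declare its own dependencies in a PEP 723 inline-metadata block, and `uv run` resolves them on the fly. A minimal sketch (the `requests` dependency and URL are placeholders):

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests

# uv creates (and caches) an ephemeral environment with requests installed,
# then runs this file inside it.
print(requests.get("https://example.com").status_code)
```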
What you meant was, "you don't need Python pre-installed". This does not solve the problem of not wanting to have (or being prevented from having) Python installed.
This is more of a pip issue than a uv issue, though, and `uv pip` is still preferable in my mind. But it seems Python package management will forever be a mess; not even the bandaid that is uv can fix things like these.
And regardless of whether you use only uv, pip-via-uv, or straight-up pip, dependencies you install later step over dependencies you installed earlier, and no tool so far seems to try to solve this, which leads me to conclude it's a Python problem, not a package-manager problem.
In the end I went back to good old virtualenvwrapper.sh and setting PYTHONPATH: full control over what goes into the venv and how. I guess people like writing new tools; I can understand that.
Maybe for more complex projects and use cases it's harder, but it's a lot faster than just pip and pyproject.toml is a lot nicer to manage than `requirements.txt`, so that's two easy enough wins for me to move over.
The standard recommendation for this is `tomli`, which became the basis of the standard library `tomllib` in 3.11.
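A minimal example, assuming a `pyproject.toml` exists in the current directory:

```python
import tomllib  # standard library since 3.11; "pip install tomli" on older versions

with open("pyproject.toml", "rb") as f:  # tomllib requires a binary file object
    data = tomllib.load(f)

print(sorted(data))  # top-level TOML tables, e.g. "build-system", "project"
```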
First off, in my mind the kinds of things that are "scripts" don't have dependencies outside the standard library, or if they do are highly specific to my own needs on my own system. (It's also notable that one of the advantages the author cites for Go in this niche is a standard library that avoids the need for dependencies in quick scripts! Is this not one of Python's major selling points since day 1?)
Second, even if you have dependencies you don't have to learn differences between these tools. You can pick one and use it.
Third, virtual environments are literally just a place on disk for those dependencies to be installed, that contains a config file and some stubs that are automatically set up by a one-liner provided by the standard library. You don't need to go into them and inspect anything if you don't want to. You don't need to use the activation script; you can just specify the venv's executable instead if you prefer. None of it is conceptually difficult.
Fourth, sharing an environment for these quick scripts actually just works fine an awful lot of the time. I got away with it for years before proper organization became second nature, and I would usually still be fine with it (except that having an isolated environment for the current project is the easiest way to be sure that I've correctly listed its dependencies). In my experience it's just not a thing for your quick throwaway scripts to be dependent on incompatible Numpy versions or whatever.
... And really, to avoid ever having to think about the dependencies you provide dynamically, you're going to switch to a compiled language? If it were such a good idea, nobody would have thought of making languages like Python in the first place.
And uh...
> As long as the receiving end has the latest version of go, the script will run on any OS for tens of years in the future. Anyone who's ever tried to get python working on different systems knows what a steep annoying curve it is.
The pseudo-shebang trick here isn't going to work on Windows any more than a conventional one is. And no, when I switched from Windows to Linux, getting my Python stuff to work was not a "steep annoying curve" at all. It came more or less automatically with acclimating to Linux in general.
(I guess referring to ".pyproject" instead of the actually-meaningful `pyproject.toml` is just part of the trolling.)
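For context, the pseudo-shebang under discussion looks roughly like this (a sketch assuming the file has a `.go` extension, e.g. `script.go`, and `go` is on PATH):

```go
//usr/bin/env go run "$0" "$@"; exit "$?"

// Run directly on a POSIX system (./script.go), the file has no real shebang,
// so the shell executes the first line: `go run` compiles and runs this file
// with the original arguments, and the shell then exits with its status,
// never reading the Go source below. The Go compiler sees that same line as
// an ordinary comment. On Windows there is no such shell fallback, hence the
// caveat above.
package main

import (
	"fmt"
	"os"
)

func main() {
	fmt.Println("hello from a Go script; args:", os.Args[1:])
}
```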
I had a recent conversation with a colleague. I said how nice it is using uv now. They said they were glad because they hated messing with virtualenvs so much that preferred TypeScript now. I asked them what node_modules is, they paused for a moment, and replied “point taken”.
Uv still uses venvs because it’s the official way Python stores all the project packages in one place. Node/npm, Go/go, and Rust/cargo all do similar things, but I only really hear people grousing about Python’s version, which, as you say, you can totally ignore and never ever look at.
The very long discussion (https://discuss.python.org/t/pep-582-python-local-packages-d...) of PEP 582 (https://peps.python.org/pep-0582/ ; the "__pypackages__" folder proposal) seems relevant here.
It'll be interesting to see how this all plays out with __pypackages__ and friends.
Yep. And so does the pyenv approach (which I understand involves permanently adding a relative path to $PATH, wherein the system might place a stub executable that invokes the venv associated with the current working directory).
And so do hand-made subshell-based approaches, etc. etc.
In "development mode" I use my activation-script-based wrappers. When just hacking around I generally just give the path to the venv's python explicitly.
At the time, Poetry and Pipenv were the popular tools, but I found they were not sufficient; they did a good job abstracting dependencies, but not venvs and Python version.
That, in retrospect, was what made rye temporarily attractive and popular.
I think Java can run uncompiled source files directly now too (single-file source launch, since Java 11).
However... scripting requires (in my experience), a different ergonomic to shippable software. I can't quite put my finger on it, but bash feels very scriptable, go feels very shippable, python is somewhere in the middle, ruby is closer to bash, rust is up near go on the shippable end.
Good scripting is a mixture of OS-level constructs available to me in the syntax I'm in (bash obviously is just using OS commands with syntactic sugar to create conditional, loops and variables), and the kinds of problems where I don't feel I need a whole lot of tooling: LSPs, test coverage, whatever. It's languages that encourage quick, dirty, throwaway code that allows me to get that one-off job done the guy in sales needs on a Thursday so we can close the month out.
Go doesn't feel like that. If I'm building something in Go I want to bring tests along for the ride, I want to build a proper build pipeline somewhere, I want a release process.
I don't think I've thought about language ergonomics in this sense quite like this before, I'm curious what others think.
It's really a huge pain point in python. Pure python dependencies are amazingly easy to use, but there's a lot of packages that depend on either c extensions that need to be built or have OS dependencies. It's gotten better with wheels and manylinux builds, but you can still shoot your foot off pretty easily.
No, bash is technically not "more" OS than e.g. Python. It just happens that bash is (often) the default shell in the terminal emulator.
In python, doing math or complex string or collection operations is usually a simple oneliner, but calling shell commands or other OS processes requires fiddling with the subprocess module, writing ad-hoc streaming loops, etc - don't even start with piping several commands together.
Bash is the opposite: As long as your task can be structured as a series of shell commands, it absolutely shines - but as soon as you require custom data manipulation in any form, you'll run into awkward edge cases and arbitrary restrictions - even for things that are absolutely basic in other languages.
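To make that concrete, here is roughly what one trivial shell pipeline costs in Python's subprocess module (the `ps aux | grep python` pipeline is just an example):

```python
import subprocess

# Shell equivalent: ps aux | grep python
ps = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "python"], stdin=ps.stdout,
                        stdout=subprocess.PIPE, text=True)
ps.stdout.close()  # so ps receives SIGPIPE if grep exits first
output, _ = grep.communicate()
print(output, end="")
```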
You inspired me to throw something simpler together - https://pypi.org/project/shell-pilot/
It’s not like you can’t do it in Python, but it’s a whole lot more work than typing <10 characters directly into shell.
226 more comments available on Hacker News