uv Is the Best Thing to Happen to the Python Ecosystem in a Decade
Key topics
The HN community discusses the emergence of 'uv' as a potential game-changer in the Python ecosystem, with many praising its speed and ease of use, while others raise concerns about its complexity and potential corporate influence.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 6m after posting
- Peak period: 138 comments in the 0-12h window
- Average per period: 26.7 comments
Based on 160 loaded comments
Key moments
- Story posted: Oct 29, 2025 at 2:57 PM EDT (2 months ago)
- First comment: Oct 29, 2025 at 3:03 PM EDT (6m after posting)
- Peak activity: 138 comments in the 0-12h window, the hottest period of the conversation
- Latest activity: Nov 2, 2025 at 11:24 PM EST (2 months ago)
UV is great but I use it as a more convenient pip+venv. Maybe I'm not using it to its full potential.
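For readers who haven't tried it, a minimal sketch of that drop-in pip+venv workflow (the package name is just an example):

```
uv venv                  # create .venv, analogous to `python -m venv .venv`
uv pip install requests  # drop-in replacement for `pip install`
uv pip list              # the familiar pip subcommands work the same way
```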
uv is probably much more of a game changer for beginner python users who just need to install stuff and don't need to lint. So it's a bigger deal for the broader python ecosystem.
But where it isn't a matter of opinion is speed. I've never met anyone who, given the same interface, would prefer a process taking 10x longer to execute.
You aren't, but that's fine. Everyone has their own idea about how tooling should work and come together, and I happen to be in your camp (from what I can tell). I actively don't want an all-in-one tool to do "project management".
What strikes me about uv is that it seems to understand that not everyone launching a Python-based project has a CS degree. That accessibility matters—especially in the era where more non-engineers are building products.
Curious: for those who've switched to uv, did you notice any friction when collaborating with team members who were still on traditional setups? I'm thinking about adoption challenges when you're not a solo builder.
No, the same uv that people have been regularly (https://hn.algolia.com/?q=uv) posting about on HN since its first public releases in February of 2024 (see e.g. https://news.ycombinator.com/item?id=39387641).
> How many are there now?
Why is this a problem? The ecosystem has developed usable interoperable standards (for example, fundamentally uv manages isolated environments by using the same kind of virtual environment created by the standard library — because that's the only kind that Python cares about; the key component is the `pyvenv.cfg` file, and Python is hard-coded to look for and use that); and you don't have to learn or use more than one.
There are competing options because people have different ideas about what a "package manager" should or shouldn't be responsible for, and about the expectations for those tasks.
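To illustrate the earlier point about `pyvenv.cfg`: it is a small key-value file at the root of every venv, along these lines (the path and version here are made up):

```
home = /usr/local/bin
include-system-site-packages = false
version = 3.12.4
```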
I don't really get it; uv solves all these problems I've never encountered. Just making a venv and using it seems to work fine.
I don't love that UV is basically tied to a for-profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.
The PSF is busy with social issues and doesn't concern itself with trivia like this.
Edit: or was it ruff? Either way. I thought they created the tools first, then the company.
I just... build from source and make virtual environments based off them as necessary. Although I don't really understand why you'd want to keep older patch versions around. (The Windows installers don't even accommodate that, IIRC.) And I can't say I've noticed any of those "significant improvements and differences" between patch versions ever mattering to my own projects.
> I don't love that UV is basically tied to a for-profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.
In my book, the less under the PSF's control, the better. The meager funding they do receive now is mostly directed towards making PyCon happen (the main one; others like PyCon Africa get a pittance) and to certain grants, and to a short list of paid staff who are generally speaking board members and other decision makers and not the people actually developing Python. Even without considering "politics" (cf. the latest news turning down a grant for ideological reasons) I consider this gross mismanagement.
Wonderful project
For me package installation is way, way faster with uv, and I appreciate not needing to activate the virtual environment.
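A hedged sketch of that no-activation workflow (`myscript.py` is a hypothetical file, and pytest is assumed to be installed in the project):

```
uv run python -m pytest   # runs inside the project's environment
uv run myscript.py        # no `source .venv/bin/activate` required
```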
I'm interested if you have any technical documentation about how conda environments are structured. It would be nice to be able to interact with them. But I suspect the main problem is that if you use a non-conda tool to put something into a conda environment, there needs to be a way to make conda properly aware of the change. Fundamentally it's the same issue as with trying to use pip in the system environment on Linux, which will interfere with the system package manager (leading to the PEP 668 protections).
uv has implemented experimental support, which they announced here [3].
[0] https://wheelnext.dev/proposals/pepxxx_wheel_variant_support...
[1] https://us.pycon.org/2025/schedule/presentation/100/
[2] https://www.youtube.com/watch?v=1Oki8vAWb1Q
[3] https://astral.sh/blog/wheel-variants
Or by asyncio.
https://peps.python.org/pep-0703/
The "install things that have complex non-Python dependencies using pip" story is much better than several years ago, because of things like pip gaining a new resolver in 2020, but in large part simply because it's now much more likely that the package you want offers a pre-built wheel (and that its dependencies also do). A decade ago, it was common enough that you'd be stuck with source packages even for pure-Python projects, which forced pip to build a wheel locally first (https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-...).
Another important change is that for wheels on PyPI the installer can now obtain separate .metadata files, so it can learn what the transitive dependencies are for a given version of a given project from a small plain-text file rather than having to speculatively download the entire wheel and unpack the METADATA file from it. (This is also possible for source distributions that include PKG-INFO, but they aren't forced to do so, and a source distribution's metadata is allowed to have "dynamic" dependencies that aren't known until the wheel is built (worst case) or a special metadata-only build hook is run (requires additional effort for the build system to support and the developer to implement)).
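As a rough illustration of that mechanism (standardized in PEP 658): for a wheel hosted on PyPI, an installer can append `.metadata` to the wheel's download URL and fetch just the metadata file. The URL below is abbreviated and illustrative, not a real path:

```
curl -s https://files.pythonhosted.org/packages/.../requests-2.32.3-py3-none-any.whl.metadata
```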
Wake me up when pip can do any of that.
This is a matter of opinion. Pip exists to install the packages and their dependencies. It does not, by design, exist to manage a project for you.
If anything, pip is a dependency installer, while working with even trivial projects requires a dependency manager. Parent's point was that pip is actually good enough that you don’t even need uv anymore, but as long as pip doesn’t satisfy 80% of the requirements, that’s just plain false.
Some people don't have, or don't care about, the additional requirements you have in mind.
A majority of HN users might agree with you, but I'd guess that a majority of developers, to paraphrase Don Draper, don't think about it at all.
With uv it just works. With pip, technically you can make it work, and I bet you'll screw something up along the way.
This is different as of Python 3.11. Please see https://peps.python.org/pep-0668/ for details. Nowadays, to install a package globally, you first have to have a global copy of pip (Debian makes you install that separately), then you have to intentionally bypass a security marker using --break-system-packages.
Also, you don't have to activate the venv to use it. You can specify the path to the venv's pip explicitly; or you can use a different copy of pip (e.g. a globally-installed one) passing it the `--python` argument (you have been able to do this for about 3 years now).
(Pedantically, yes, you could use a venv-installed copy of pip to install into the system environment, passing both --python and --break-system-packages. I can't prove that anyone has ever done this, and I can't fathom a reason beyond bragging rights.)
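A hedged sketch of what this looks like in practice on a PEP 668-protected system (the package name is an example, and the exact error text varies by distro):

```
pip install requests                            # refused: externally-managed-environment
pip install --break-system-packages requests    # deliberate, explicit bypass

.venv/bin/pip install requests                  # use a venv's pip without activating it
pip --python .venv/bin/python install requests  # or point another pip at the venv
```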
> - really easy to distinguish [dev] and main dependencies
As of 25.1, pip can install from dependency groups described in pyproject.toml, which is the standard way to group your dependencies in metadata.
> distinguish direct dependencies from indirect dependencies, making it easy to find when a package is not needed anymore
As of 25.1, pip can create PEP 751 standard lockfiles.
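A rough sketch of both of those newer pip capabilities (pip >= 25.1; the group name is an example, `pip lock` is still experimental, and exact invocations may vary between versions):

```
pip install --group dev   # install the PEP 735 "dev" dependency group from pyproject.toml
pip lock .                # emit a PEP 751 lockfile (pylock.toml) for the current project
```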
> easily use different python versions for different projects
If you want something to install Python for you, yes, that was never in pip's purview, by design.
If you want to use an environment based off an existing Python, that's what venv is for.
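For comparison, a minimal sketch of how uv covers both cases (the version numbers are examples):

```
uv python install 3.11   # download a managed CPython build
uv venv --python 3.11    # create a venv on that interpreter
```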
I'm still mostly on poetry
Currently they're a bit pointless. Sure, they aid documentation, but they take effort and cause you pain when making modifications (mind you, with half-arsed agentic coding it's probably less of a problem).
What would be better is a strict mode where, instead of duck typing, it's pre-declared. It would also make a bunch of things faster (along with breaking everything and the spirit of the language).
I still don't get the appeal of UV, but that's possibly because I'm old and have been using pyenv and venv for many, many years. This means that anything new is an attack on my very being.
However, if it means that conda fucks off and dies, then I'm willing to move to UV.
I've been using it professionally and it's been a big improvement for code quality.
Types save you cognitive effort and catch errors earlier, while writing code, not later when running or testing
Then again it's not so bad if you're willing to make AI add all the types and not even care.
It's the python version of fink vs macports vs homebrew. Or apt vs deb. or pkgsrc vs ports.
But I don't think "it's just another" gets the value proposition here. It's significantly simpler to deploy in practice for people like me, writing ad hoc scripts and running git-downloaded scripts and codelets.
Yes, virtualenv and pip existed. No, they turned out to be a lot more fiddly to run in practice than UV.
That UV is written in Rust is funny, but not in a terrible way. The LLVM compiler toolchain is written in C++ but compiles other languages. Using one language to do things for another language isn't such a terrible outcome.
I hope UV supplants the others. Not to disrespect their authors, but UV is better for end users. If it's worse for package maintainers, I think the UV authors should be told.
/just guessing, haven't tried it
Maybe if you trust the software, then trusting the install script isn't that big of a stretch?
Looking at the install script or at a release page (eg. https://github.com/astral-sh/uv/releases/tag/0.9.6 ) shows they have pretty broad hardware support in their pre-compiled binaries. The most plausible route to being disappointed by the versatility of this install script is probably if you're running an OS that's not Linux, macOS, or Windows—but then, the README is pretty clear about enumerating those three as the supported operating systems.
Also, many of the "distribution" tools like brew, scoop, winget, and more are just "PR a YAML file with your zip file URL, the name of your EXE to add to PATH, and a checksum hash of the zip to this git repository". We're at about a minimum-effort point in software history for producing a "distribution", so it's interesting that shell scripts to install things seem to have caught on instead.
The software is not written in a scripting language where forgetting quote marks regularly causes silent `rm -rf /` incidents. And even then, I probably don't explicitly point the software at my system root/home and tell it to go wild.
There have also been PoCs on serving malicious content only when piped to sh rather than saved to file.
If you want to execute shell code from the internet, at the very least store it in a file first, and store that file somewhere persistent before executing it. It will make forensics easier.
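A minimal sketch of that practice applied to the uv installer (the filename is arbitrary):

```
curl -LsSf https://astral.sh/uv/install.sh -o uv-install.sh
less uv-install.sh   # read it before running it
sh uv-install.sh     # keep the file around afterwards for forensics
```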
There's no guarantee packages are actually making use of package features in any reasonable way, other than convention.
Of course unpredictability itself is also a security problem. I'm not even supposed to run partial updates that at least come from the same repository. I ain't gonna shovel random shell scripts into the mix and hope for the best.
Versioning OTOH is often more problematic with distro package managers that can't support multiple versions of the same package.
Also inability to do user install is a big problem with distro managers.
No, that's how you get malware. Make a package, add it to a distro, then we will talk.
You can `pip install uv` or manually download and extract the right uv-*.tar.gz file from github: https://github.com/astral-sh/uv/releases
Also, most reasonable developers should already be running with the ExecutionPolicy RemoteSigned, it would be nice if code signing these install script was a little more common, too. (There was even a proposal for icm [Invoke-Command] to take signed script URLs directly for a much safer alternative code-golfed version of iwr|iex. Maybe that proposal should be picked back up.)
This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.
uv installing deps is hardly more risky.
Scanning for external dependencies is common but not so much internal private libraries.
I've used Tiger/Saint/Satan/COPS in the distant past. But I think they're somewhat obsoleted by modern packaging and security like apparmor and selinux, not to mention docker and similar isolators.
uv executes http://somemirror.com/some-version
Most people like their distro to vet these things. uv et al. had a reason to exist when Python 2 and 3 were a mess; I think that time is well behind us. pip is mostly for installing libraries, and even that is mostly already done by the distros.
But it’s much harder to inspect what the imports are going to do and be sure they’re free of any unsavory behavior.
I meant it's easy to inspect your script's logic: just look at it. It's much harder to audit the code in dependencies, though…
```
curl -LsSf https://astral.sh/uv/install.sh | sh
```
also isn't great. But that's how Homebrew is installed, so... shrug?
Not to bash uv/Homebrew; they're better than most _easy_ alternatives.
I will happily copy-paste this from any source I trust, for the same reason I'll happily install their software any other way.
For anything that I want to depend on, I prefer stronger auditability to ease of install. I get it, theoretically you can do the exact same thing with curl/sh as with git download/inspecting dependencies, installing the source and so on. But in reality, I'm lazy (and per another thread, a 70s hippie) and would like to nix any temptation to cut corners in the bud.
But then I'm a weirdo that takes personal offense at tools hijacking my rc / PATH, and keep things like homebrew at arm's length, explicitly calling shellenv when I need to use it.
It’s the script contents that count, not just dependencies.
Deno-style dependency version pinning doesn’t solve this problem unless you check every hash.
If you don't care about being ecosystem-compliant (and I am sure malware does not), it's only a few lines of Python to download the code and eval it.
As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...
But, yes, Python scripts with in-script dependencies plus uv to run them doesn't change dependency distribution; it just streamlines use compared to manually setting up a venv per script.
The man page tells me:
Without that, the system may try to treat the entirety of "uv run --script" as the program name, and fail to find it. Depending on your env implementation and/or your shell, this may not be needed. See also: https://unix.stackexchange.com/questions/361794
-S causes the string to be split on spaces and so the arguments are passed correctly.
So in fact "-S" is not passed as a separate argument, but as a prefix in the first (and only) argument, and env then extracts it and acts accordingly:
```
$ /usr/bin/env "-S echo deadbeef"
deadbeef
```
It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names that you would use in a `uv pip install` command, not the names you `import` in the code, although they may align.
This is an ecosystem standard (https://peps.python.org/pep-0723/) and pipx (https://pipx.pypa.io) also supports it.
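To make that concrete, a hedged sketch of a self-contained PEP 723 script using the `env -S` shebang discussed above; `beautifulsoup4` is chosen precisely because its distribution name differs from its import name:

```
#!/usr/bin/env -S uv run --script
# /// script
# dependencies = ["beautifulsoup4"]   # distribution name, as you'd pass to `uv pip install`
# ///
from bs4 import BeautifulSoup        # the import name differs from the distribution name

print(BeautifulSoup("<p>hello</p>", "html.parser").p.text)
```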
Linux coreutils have supported this since 2018 (coreutils 8.30); amusingly, it's the same release that added `cp --reflink`. AFAIK you have to opt out by having `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` set in your environment, or by passing `--pedantic`. [1]
FreeBSD coreutils have supported this since 2008.
macOS has basically always supported this.
---
1. Amusingly, despite `POSIX_ME_HARDER` not being official, a large swath of coreutils support it. https://www.gnu.org/prep/standards/html_node/Non_002dGNU-Sta...
I want to be able to ship a bundle that needs zero network access, but will just run.
It is still frustratingly difficult to make portable Python programs.
Although several variations on this theme already exist, I'm sure. https://github.com/pex-tool/pex/ is arguably one of them, but it's quite a bit bulkier than what I'm looking for.
My current hobby language is Janet. Creating a statically linked binary from a script in Janet is trivial. You can even bring your own C libraries.
But whoever runs this has to install uv first, so not really standalone.
"Lol, no I break into computer systems I am a hacker"
"Geeze hell no I have an axe, I am an OG hacker"
I could totally see `#!/usr/bin/python723` become a thing :)
The two main runners I am aware of are uv and pipx. (Any compliant runner can be referenced in the shebang to make a script standalone where shebangs are supported.)
Small price to pay for escaping python dependency hell.
(sadly, uv cannot detect the release date of some packages. I'm looking at you, yaml!)
UV means getting more strings attached to VC-funded companies and leaning on their infrastructure. This is a high risk for any FOSS community, and history tells us how this ends…
uv is MIT licensed so if they rug pull, you can fork.
Speaking of history, I was very sympathetic to the "we are open-source volunteers, give us a break" kind of stuff for the first N years, but PyPA has a pattern of creating problems, ignoring them, ignoring criticism, ignoring people who are trying to help, and pushing talent and interest elsewhere. This has fragmented the packaging ecosystem in a way that confuses newcomers, forces a constant maintenance and training burden on experts, and damages the credibility of the language and its users. Hatch is frankly too little too late, and even if it becomes a wonderful standard, it would just force more maintenance and more confusion for a "temporary" period that lasts many, many years. Confidence is too far gone.
As mentioned elsewhere in the thread, there are tons of conflicting tools in the space already, and due to the fragmentation, poetry etc. could never get critical mass. That's partly because PyPA stuff felt most "official" and a safer long-term bet than anything else, but partly because 33% better was never good enough to encourage widespread adoption until it was closer to 200% better. But uv actually IS that much better. Just let it win.
And let PyPA be a case study in how NOT to do FOSS. Fragmentation is fine up to a point, but you know what? If it weren't for KDE/GNOME reinventing the wheel for every single kind of individual GUI, we'd have already seen the glorious "year of the Linux desktop" by now.