
Python package managers: uv vs. pixi?

1 point

1 comment

Mood

thoughtful

Sentiment

positive

Category

tech

Key topics

Python package management

uv

pixi

conda

Debate intensity: 10/100

The author compares Python package managers uv and pixi, sharing their experience and workflow, sparking a discussion on Python package tooling fragmentation.

Snapshot generated from the HN discussion

Discussion Activity

Light discussion

First comment

2m

Peak period

2 comments in Hour 1

Avg / period

1.5

Comment distribution: 3 data points

Based on 3 loaded comments

Key moments

  1. Story posted

     11/18/2025, 4:23:45 PM

     5h ago
  2. First comment

     11/18/2025, 4:25:40 PM

     2m after posting
  3. Peak activity

     2 comments in Hour 1

     Hottest window of the conversation
  4. Latest activity

     11/18/2025, 9:28:53 PM

     38m ago


Discussion (3 comments)
Showing 3 comments
jacobtomlinson
5h ago
1 reply
Author of the post here. For Python package management I use a mixture of pixi, uv and conda depending on the task I'm doing. I wrote up a long form post about the history of these tools, why each one exists, and why I settled on these choices in my workflow. I hope this is interesting to folks who want to learn more about why Python package tooling is so fragmented!
counters
4h ago
Thanks for the fantastic write-up! This is a great breakdown of how to lean into the strengths of each of these tools.
zahlman
38m ago
> One big problem with pip in the early days was that it only handled source distributions. This means it could download a gzip file of source code and put it in the right place, call some hooks, and that was it. If you wanted to package some C code that could be used from Python you would need to zip up the C code along with its Python bindings and publish it to PyPI, then when pip installed the package it would download the code and then run the compiler locally on your machine to turn the C code into something that could actually be executed. If you didn’t have all the C compilers and related tools on your machine you were in for a bad time.

This seems at least inaccurate. Pip could handle Setuptools' existing binary .egg format from the start, and these commonly even included pre-compiled .pyc files. They just weren't (often?) used to ship pre-compiled C because a) people had a more rigid security posture back then and wanted to compile for themselves; b) eggs were not wheels, and in particular wheel-specific standards like "manylinux" (https://peps.python.org/pep-0600/) didn't exist yet. (I don't think eggs even had a standardized way of specifying a platform in the filename.) And on the other hand, even today not everything is available as a wheel, and some people are on platforms that are not well covered by cross-compiled wheels.

Also, it wasn't pip "running the compiler" really. It would shell out to Setuptools for that. (Setuptools is no longer a dependency of pip, but for source distributions pip still dynamically obtains and uses either Setuptools or another build backend as specified by the package, per https://peps.python.org/pep-0517/.)
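For anyone unfamiliar, a sketch of what that hand-off looks like from the package's side: the `[build-system]` table in pyproject.toml is what pip reads, per PEP 517/518, to decide which backend to obtain and invoke (the version pin here is illustrative).

```toml
# Sketch of the PEP 517/518 hand-off: pip reads this table, installs
# the listed requirements into an isolated build environment, and asks
# the named backend to build a wheel from the source tree.
[build-system]
requires = ["setuptools>=64"]
build-backend = "setuptools.build_meta"
```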

> But it took a different road and instead of always creating your environment in the current directory

Just to nitpick, `python -m venv` lets you specify a path, it isn't just a name for the venv folder in the current directory.
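A quick sketch of that (the destination path here is made up): the stdlib `venv` module behind `python -m venv` takes any target directory.

```python
# Sketch: venv targets an arbitrary destination path, not just a
# folder name under the current directory. The path is hypothetical.
import venv

# Equivalent to: python -m venv /tmp/demo-env
venv.create("/tmp/demo-env", with_pip=True)
```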

> Because PyPI is for Python code and its dependencies it doesn’t want to think about the dependencies of those dependencies. So if your C code depends on something else then you need to bundle all of that together.

Not necessarily. Nothing in https://packaging.python.org/en/latest/specifications/binary..., as far as I can tell, actually requires a wheel to contain any Python code. It's just that interfacing C code in one installed wheel to another might be a bit tricky (I guess the simplest thing is to figure out the relative path to the .so in the other installed wheel, but probably the right thing is to mess around with `importlib_resources` and go through some thin Python wrappers — for example, uv is available as a wheel that basically just has the compiled Rust binary plus a Python script that deduces the path to that binary).
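A minimal sketch of that wrapper pattern (the package and binary names are hypothetical, not uv's actual layout):

```python
# Sketch: a thin Python entry point that locates a compiled binary
# shipped in the same wheel and hands control to it. Names are made up.
import os
import sys
from importlib.resources import files


def _binary_path() -> str:
    # files() resolves a resource relative to the installed package,
    # wherever the wheel was unpacked on this machine.
    return str(files("mytool") / "bin" / "mytool")


def main() -> None:
    # Replace the current process with the native binary (POSIX-style).
    os.execv(_binary_path(), [_binary_path(), *sys.argv[1:]])
```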

> This package manager effectively wrapped pip but also stored a lock file of the solve and allowed you to reproduce your environment easily. While I’m sure poetry has some other great features I tend to think of it as pip + locking.

It's meant for project management — in particular, a bunch of its subcommands will actually update the contents of pyproject.toml, as well as managing the poetry.lock. And it provides its own build backend (analogous to the role Setuptools plays now), called `poetry-core` as a package now but historically known as Masonry. It's meant (much like uv, and of course rye as you mention) to be an all-in-one tool for devs, rather than just a way for users writing a few lines of Python to access heavyweight dependencies. It just doesn't include the "get Python itself" step.
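For reference, the visible marker of that is the `[build-system]` table a Poetry-managed pyproject.toml typically carries, pointing at poetry-core as the PEP 517 backend:

```toml
# A Poetry-managed project declares poetry-core (the Masonry lineage
# mentioned above) as its PEP 517 build backend.
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```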

> Next-gen pip with uv

As you can imagine from the above, I think it's better termed "next-gen Poetry" than "next-gen pip". And "reimplementing pip directly" is missing a lot. The advantages come mainly from architectural changes rather than being written in Rust.

ID: 45968309 · Type: story · Last synced: 11/18/2025, 4:26:41 PM
