Why 'uv' Should Be the First Thing You Install on Your Deep Learning Machine

Posted on Jul 17, 2025

Ever had that one experiment where PyTorch 1.13.1 was your ride-or-die, but the next one needed 2.1.0 just to compile a single transformer model? Yeah, me too. Welcome to the mad, maddening world of ML environment juggling.

In machine and deep learning, you’re never really “done” installing packages. One project might want torch==1.13.1, another might scream if it doesn’t get torch==2.0.1+cu118, and God help you if you’re fiddling with tensorflow. Each model or paper reimplementation turns into a new universe of dependencies.

People Don’t Know They Hate pip Yet

Let’s be honest about what pip actually does to your productivity. You clone a promising computer vision repo, run pip install -r requirements.txt, and suddenly you’re in dependency hell. The resolver chokes on conflicting versions, downloads packages one by one like it’s 2010, and creates bloated environments that eat your SSD space faster than you can say “CUDA out of memory.”

But here’s the real kicker: people put up with this not because they want to, but because for a long time there was no viable alternative.

Sure, there have been contenders: Poetry, pipenv, Conda. I’ve tried them all briefly. All tried to solve the problem from different angles. But none were fast, simple, and frictionless enough to replace pip at scale. Most either collapsed under complexity or required compromises developers weren’t willing to make.

To quote the OG, Steve Jobs: people don’t know what they want until you show it to them. That’s the tragedy here. Developers didn’t know they could expect more, because no one built it.

So what’s uv?

If you haven’t been living under a rock, you’re probably already using uv or have at least heard of it. For the cave people: think of uv as pip and virtualenv with a rocket booster and a lot more sass. It’s built in Rust (yes, the fast and fashionable one), and frankly, it’s embarrassing how much better it is than the Python tooling we’ve been settling for.

uv isn’t just another package manager, it’s a statement that Python packaging can actually work properly.

In other words: no more watching pip spin for 4 minutes because it can’t decide between urllib3 1.26.6 and 1.26.9. No more “Collecting…” messages that make you question your life choices. No more praying to the dependency gods that your environment will actually resolve.

Why uv Obliterates pip and Others (And Why You Should Care)

Blazing Fast Dependency Resolution

The Rust-based resolver doesn’t just respect constraints, it actually understands them. While pip’s resolver is busy having an existential crisis about package versions, uv has already figured out the optimal solution and moved on with its life.

This isn’t just about speed (though uv is stupidly fast). It’s about reliability. When you’re trying to reproduce a paper’s results at 2 AM before a deadline, you need tools that work predictably, not ones that randomly fail on version conflicts.

Parallel Downloads: A Game-Changer for CV Work

Instead of fetching and installing packages one by one like pip (seriously, what is this, dial-up internet?), uv downloads everything in parallel. This is absolutely crucial for computer vision work where you’re dealing with massive packages.

When you’re installing the holy trinity of torch, torchvision, and opencv-python, you’re not stuck in “Downloading…” hell for 5 minutes. Some torch wheels are >150MB. Now imagine downloading four of those simultaneously instead of sequentially. That’s what uv does, and watching the green progress bars speed-run to completion is glorious.

Fun fact: I timed this. A typical CV environment setup that takes pip 8-12 minutes? uv does it in under 2 minutes. That’s not just convenience, that’s hours of your life back over the course of a project.

Shared Cache Across Projects: Finally, Sanity

No more duplicating the same 200MB wheel in every .venv/. If you install torch once, it’s reused across all uv environments. This isn’t just about disk space (though your SSD will thank you), it’s about mental overhead.

With pip, every new project feels like starting from scratch. With uv, you’re building on a foundation of already-cached packages. It’s the difference between feeling productive and feeling like you’re fighting your tools.
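If you want to see this for yourself, uv exposes the cache through its CLI. A quick sketch (guarded so it’s a harmless no-op when uv isn’t on your PATH; `uv cache dir` and `uv cache clean` are the relevant subcommands):

```shell
# Peek at the shared cache that every uv environment draws from.
# Guarded: does nothing if uv isn't installed.
if command -v uv >/dev/null 2>&1; then
  uv cache dir                 # prints the global cache location
  du -sh "$(uv cache dir)"     # total disk used by the cache all projects share
  # uv cache clean             # wipe it if you ever need the space back
fi
```

One `du -sh` here replaces hunting through a dozen bloated `.venv/` directories to figure out where your disk went.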

Zero Configuration, Drop-in CLI

Works just like pip, but better. You don’t have to learn anything new, rewrite your scripts, or convince your team to adopt some exotic workflow. It’s literally just adding uv in front of your existing pip commands.

This is huge for teams. No “tool evangelist” friction, no training sessions, no resistance to change. Just immediate improvement.
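To show just how mechanical the switch is, here’s a toy sketch (the `uvify` helper is hypothetical, purely for illustration): migrating a command is literally prefixing it with `uv`.

```shell
# Hypothetical helper illustrating the pip -> uv mapping: every existing
# pip invocation becomes its uv equivalent by prefixing "uv ".
uvify() { echo "uv $*"; }

uvify pip install torch torchvision    # -> uv pip install torch torchvision
uvify pip freeze                       # -> uv pip freeze
uvify pip install -r requirements.txt  # -> uv pip install -r requirements.txt
```

That one-to-one mapping is the whole migration story for most teams.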

Here’s a screenshot from my machine:
Since I had already downloaded and installed these packages in another environment, uv resolved 28 packages in 9ms and installed them in 275ms. That’s not a typo: nine milliseconds to resolve what would take pip several minutes. And that included all the nvidia packages, IYKYK.

Why This Matters More Than You Think

The Hidden Cost of Bad Tooling

Every minute you spend wrestling with pip is a minute not spent on actual machine learning. Bad tools don’t just slow you down, they fundamentally change how you approach problems. When environment setup is painful, you avoid experimenting. When package installation is unreliable, you stick with what works instead of trying new approaches.

uv removes these friction points. Suddenly, spinning up a new environment to test a different PyTorch version isn’t a 20-minute ordeal, it’s a 30-second afterthought.

Reproducibility Actually Works

In ML, reproducibility isn’t just nice to have, it’s essential. When your paper gets accepted and reviewers want to run your code, or when you’re trying to build on someone else’s work, you need environments that set up identically every time.

uv’s deterministic resolution and shared caching mean your requirements.txt actually does what it promises. No more “it worked on my machine” excuses, for the most part; I’ll concede there are still variables in niche cases (system CUDA drivers, OS-level libraries).
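For a concrete sense of what “does what it promises” means: a frozen requirements.txt is nothing but exact pins, which uv resolves identically every time. The versions below are illustrative examples, not a recommendation:

```
torch==2.1.0+cu121
torchvision==0.16.0+cu121
opencv-python==4.8.1.78
numpy==1.26.2
```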

Team Productivity Multiplier

When everyone on your team uses uv, onboarding becomes trivial. New team members can get productive in minutes, not hours. Code reviews can focus on algorithms instead of environment setup issues.

This isn’t just about individual productivity, it’s about team velocity. And in competitive ML research, velocity matters.

The Bottom Line

uv feels like what pip should’ve been all along. For computer vision folks juggling multiple models, environments, and CUDA builds, it’s not just an improvement, it’s a necessity.

No more bloated .venv/ directories eating your disk space. No more guessing which torch version works with your GPU. No more hour-long setups that make you question your career choices. You can spend your time training YOLOs, not debugging pip.

Here’s my take: If you’re still using pip for ML work in 2025, you’re choosing to work harder, not smarter. uv is free, it’s stable, and it solves real problems that waste hours of your time.

If you’re serious about repeatable experiments, reproducibility, and not going insane managing OpenCV versions, just use uv. Your future self will thank you.

It won’t give you 60 FPS inference, but it will get you there faster. And in a field where time-to-experiment matters more than ever, that’s the competitive advantage you need.

Getting Started (Because You Should, Right Now)

I don’t want to sound tutorial-ish by adding these instructions; you can find better guides on installation and usage elsewhere, and the official docs are the holy grail, so always refer to them first. But including these here serves as a quick reference for myself. Hopefully, you’ll find them useful as well.

Installation: Do This Once, Never Think About It Again

curl -LsSf https://astral.sh/uv/install.sh | sh

Or via pipx (if you’re into that):

pipx install uv

Creating & Managing Environments

uv venv

Or with a specific Python version (more on this life-changing feature later):

uv venv --python 3.10

Boom. Virtual environment created in milliseconds. And here’s the beautiful part: uv doesn’t replicate itself in every environment. It’s centralized and shared, like a proper tool should be.

Activating is the same as always:

source .venv/bin/activate

Installing packages? Just add uv as a prefix:

uv pip install torch torchvision

Need specific versions? Same familiar syntax:

uv pip install torch==2.1.0+cu121 --extra-index-url https://download.pytorch.org/whl/cu121

With pip, this might take 3–5 minutes for sequential downloads, possible resolver hang-ups, maybe a wheel build from source (the horror). With uv? You’re done before you can grab coffee.
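Once the install finishes, a quick sanity check confirms the wheel you got actually matches your GPU setup (guarded so the snippet is a harmless no-op if torch isn’t importable in the active environment):

```shell
# Sanity check: print the installed torch version and whether CUDA is visible.
# Guarded: skips silently when torch (or python) isn't available.
if python -c "import torch" 2>/dev/null; then
  python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
fi
```

If that prints `False` on a GPU box, you grabbed a CPU-only wheel; double-check the `--extra-index-url` you used.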

Lock it down:

uv pip freeze > requirements.txt

Or compile loose top-level requirements into a fully pinned file (uv’s pip-tools-style workflow):

uv pip compile requirements.in -o requirements.txt

Or go from scratch:

uv pip install -r requirements.txt

Config Files for Sanity

Create a project-level uv.toml (or use the [tool.uv] table in pyproject.toml) for project-wide configs; a user-level file at ~/.config/uv/uv.toml applies globally. For example, to point the uv pip interface at a specific index (see the official settings reference for the full schema):

[pip]
index-url = "https://pypi.org/simple"

Pro-tip: Version-control this file and never worry about onboarding teammates (or future-you) again. No more “works on my machine” debugging sessions.

Network Timeout Handling

For those inevitable network hiccups with large packages:

export UV_HTTP_TIMEOUT=600

This sets the HTTP request timeout to 10 minutes. Because sometimes downloading 500MB of CUDA libraries takes time, and that’s okay.

The Real Game-Changers

Unified Python Version Management: Death to pyenv

This is where uv stops being just “faster pip” and becomes indispensable. Managing Python interpreters is seamless:

uv python install 3.11
uv python install 3.12

Pin a per-project Python version; uv records it in a .python-version file that uv venv and uv sync respect automatically:

uv python pin 3.11

This is huge. No more pyenv slowness, no more PATH gymnastics, no more “which Python am I using?” confusion. Just specify the version you need and move on with your life.

I cannot overstate how much mental overhead this removes from project switching. Your brain should be thinking about model architectures, not interpreter management.

uv sync: The Magic Command

Already have a project defined in pyproject.toml? Just run:

uv sync

It installs all dependencies, creates the environment, and ensures parity with the uv.lock lockfile if present. One command, zero guesswork. (For a plain requirements.txt, the closest equivalent is uv pip sync requirements.txt, which makes the environment match the file exactly.)

This is especially powerful when cloning CV repos. Instead of the usual dance of “create venv, activate, install requirements, debug conflicts, repeat,” you just run uv sync and start working. It’s what software development should feel like.