this post was submitted on 04 Sep 2024
25 points (90.3% liked)

Python


I read some articles about using a virtual environment in Docker. Their argument is that the purpose of a virtual environment in Docker is to introduce isolation and limit conflicts with system packages, etc.

However, aren't Docker and Python-based images (e.g., python:*) already doing the same thing?

Can someone eli5 this whole thing?

top 20 comments
[–] [email protected] 10 points 2 months ago* (last edited 2 months ago) (4 children)

It's not necessary but there is no reason not to.

Pros:

  • production and development programs are more similar
  • upgrading your base image won't affect your python packages
  • you can use multi stage builds to create drastically smaller final images

Cons:

  • you have to type venv/bin/python3 instead of just python3 in the run line of your dockerfile
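A minimal sketch of what a multi-stage build with a venv might look like, assuming a `requirements.txt` plus an `app/` directory (the paths and file names are illustrative, not from the thread):

```dockerfile
# Build stage: create the venv and install dependencies into it
FROM python:3.12-slim AS builder
RUN python3 -m venv /opt/venv
COPY requirements.txt .
RUN /opt/venv/bin/pip install --no-cache-dir -r requirements.txt

# Final stage: copy only the venv, leaving build tools and caches behind
FROM python:3.12-slim
COPY --from=builder /opt/venv /opt/venv
COPY app/ /app/
# Putting the venv first on PATH sidesteps the venv/bin/python3 prefix
ENV PATH="/opt/venv/bin:$PATH"
CMD ["python3", "/app/main.py"]
```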
[–] [email protected] 5 points 2 months ago

Hah my base python has never seen a command.

The biggest reason for me is that local dev happens in a venv, so translating to a container is 100% 1:1 then.

[–] [email protected] 3 points 2 months ago

It’s easy to set the path to include the venv in the Dockerfile, that way you never have to activate, either in the RUN line or when you exec into the container. Also this makes all your custom entry points super easy to use. Bonus, it’s super easy to use uv to get super fast image builds like that. See this example https://gist.github.com/dwt/6c38a3462487c0a6f71d93a4127d6c73
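A rough sketch of that approach, assuming a uv-managed project with `pyproject.toml` and `uv.lock` (see the linked gist for the author's actual version; `.venv` is uv's default location):

```dockerfile
FROM python:3.12-slim
# Copy the uv binary from the official distribution image
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
WORKDIR /app
COPY pyproject.toml uv.lock ./
# Create /app/.venv and install locked dependencies into it
RUN uv sync --frozen
# Venv on PATH: no activation needed in RUN lines or `docker exec` shells
ENV PATH="/app/.venv/bin:$PATH"
```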

[–] [email protected] 2 points 2 months ago (1 children)

upgrading your base image won’t affect your python packages

Surely if upgrading python will affect your global python packages it will also affect your venv python packages?

you can use multi stage builds to create drastically smaller final images

This can also be done without venvs; you just need to copy the packages to the location where global packages are installed.
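A sketch of that no-venv variant: install into the builder's global site-packages and copy that directory across stages. Note the path embeds the Python minor version, which is the hardcoding trade-off (paths assume the `python:3.12` images):

```dockerfile
FROM python:3.12-slim AS builder
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

FROM python:3.12-slim
# Version-specific path: must be kept in sync with the base image's Python
COPY --from=builder /usr/local/lib/python3.12/site-packages /usr/local/lib/python3.12/site-packages
# Also bring over console scripts installed by pip
COPY --from=builder /usr/local/bin /usr/local/bin
```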

[–] [email protected] 3 points 2 months ago (1 children)

Upgrading the base image does not imply upgrading your Python, and even upgrading your Python does not imply upgrading your Python packages (except for the standard library, of course).

[–] [email protected] 1 points 2 months ago (1 children)

Sure, but in the case where you upgrade python and it affects python packages it would affect global packages and a venv in the same way.

[–] [email protected] 1 points 2 months ago

Sure, if that happens. But it may also not, which is usually the case. It's not 100% safe, but it is safer.

[–] [email protected] -2 points 2 months ago

If you're on an Apple silicon Mac, Docker performance can be atrocious if you're emulating x86 images. It can also be inconvenient to work with Docker volumes and networks. Python already has pyenv and tools like Poetry and Rye. Unless there's a need for Docker, I personally would generally avoid it (though I do almost all my deployments via Docker containers).

[–] [email protected] 6 points 2 months ago (1 children)

It's a bit unclear to me what you refer to with "their argument". What argument exactly?

[–] [email protected] 1 points 2 months ago

The need for isolation inside the container, even with a Python image.

[–] [email protected] 6 points 2 months ago (1 children)
[–] [email protected] -2 points 2 months ago (1 children)
[–] [email protected] 5 points 2 months ago (1 children)

Does Hynek's article convince you?

[–] [email protected] 1 points 1 month ago

Yes, but I'll need some more practical usage to fully grasp it.

[–] [email protected] 5 points 2 months ago

Could you share the article?

[–] [email protected] 3 points 2 months ago (2 children)

I can think of only two reasons to have a venv inside a container:

  • If you're running third-party services inside a container, pinned to different Python versions.

  • If you do local development without Docker and have scripts that activate the venv from inside the script. If you move those scripts inside the container, you no longer have a venv. But then it's easy to just check an environment variable and skip activation if inside Docker.

For most applications, it seems like an unnecessary extra step.
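The env-var check described above might look like this sketch, where `RUNNING_IN_DOCKER` is a hypothetical variable you would set via `ENV RUNNING_IN_DOCKER=1` in the Dockerfile:

```python
import os


def ensure_venv_on_path(venv_path: str = "venv") -> bool:
    """Prepend the venv's bin directory to PATH, mimicking `source venv/bin/activate`.

    Skipped when RUNNING_IN_DOCKER is set, since the container's interpreter
    already has the right packages. Returns True if the venv was activated.
    """
    if os.environ.get("RUNNING_IN_DOCKER"):
        return False
    os.environ["VIRTUAL_ENV"] = os.path.abspath(venv_path)
    os.environ["PATH"] = os.path.join(venv_path, "bin") + os.pathsep + os.environ["PATH"]
    return True
```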

[–] [email protected] 2 points 2 months ago

If you do multi stage builds (example here) it is slightly easier to use venvs.

If you use the global environment, you need to hardcode the path to the global packages. This path can change when base images are upgraded.

[–] [email protected] 1 points 2 months ago (1 children)

But then it’s easy to just check an environment variable and skip, if inside Docker.

How is forcing your script to be Docker-aware simpler than just always creating a venv?

[–] [email protected] 3 points 2 months ago (1 children)

One Docker env variable and one line of code. Not a heavy lift, really. And next time I shell into the container I don't need to remind everyone to activate the venv.

Creating a venv in Docker just for the hell of it is like creating a symlink to something that never changes or moves.

[–] [email protected] -1 points 2 months ago

How can you be sure it's one line of code? What if there are several codepaths, and venvs are activated in different places? And in any case, even if there is only one conditional needed, that is still one branch more than necessary to test.

Your symlink example does not make sense. There is something that is changing. In fact, it may even be the opposite: if you need to use file A in a container, and file B otherwise, it may make perfect sense to symlink the correct file to C, so that your code does not need to care about it.