
I am currently working solo on a very small simple python microservice. I started building this app, mostly by habit, in a virtual environment. As I started to get closer to the point where deployment considerations needed to be made, I started to realize that all of the dependencies I'd need to include in the container are already "contained" inside the virtual env.

In fact, as far as I've thought it through (which is admittedly miles from 'all the way'), for a Python app one could just deploy a virtual environment, with dependencies preinstalled, onto a server. As for x-axis scalability, the directory could simply be copied, no?
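For concreteness, the deployment I have in mind looks roughly like this (a sketch; the paths, `requirements.txt`, and the `app.py` entry point are placeholders, and the environment is recreated on the server rather than copied wholesale):

```shell
# On the server: recreate the environment from a pinned dependency list.
# All paths and file names here are illustrative.
python3 -m venv /srv/myapp/venv                        # create an isolated environment
/srv/myapp/venv/bin/pip install -r requirements.txt    # install the pinned dependencies
/srv/myapp/venv/bin/python app.py                      # run the service with the venv's interpreter
```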

My question, then, is this:

Is this idea even feasible, or am I overlooking something? And if it is feasible, are there any advantages to using containers over virtual environments?

Nate T

2 Answers


Absolutely, virtual envs are a great option for isolating dependencies, but only for dependencies that are Python packages. You still need configuration management on the server: for the available Python versions, for native libraries, and for any other tools the app might need. And I don't think you should copy a virtualenv to a different system: virtualenvs contain absolute paths and compiled extensions that may not work elsewhere.

In contrast, a container lets you isolate all dependencies, except for the Linux kernel and of course external services.

In practice, either variant can be fine for deployment. Personally, I prefer containers because they allow for more control, make testing easier, and provide a security barrier between the application and the host system.
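As a rough illustration of that extra control, a container image pins the interpreter and OS-level dependencies as well as the Python packages, which a virtualenv cannot do. A minimal sketch (not a production Dockerfile; the file names are placeholders):

```dockerfile
# Pin the Python version and the base OS, which a virtualenv cannot do.
FROM python:3.12-slim

WORKDIR /app

# Native libraries would be installed here with apt-get if the app needed them.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "app.py"]
```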

amon

There are limits to what a Python virtual environment can do. You cannot install the needed version of CUDA or an Oracle database just with pip.

h22