
What is the difference between venv, pyvenv, pyenv, virtualenv, virtualenvwrapper, pipenv, etc?

Python 3.3 includes in its standard library the new package venv. What does it do, and how does it differ from all the other packages that match the regex (py)?(v|virtual|pip)?env?

And to preempt the close votes, I felt this was a more general question than stackoverflow.com/questions/29950300/… , and so I didn't feel comfortable editing that question or posting an overly general answer on that post.
This guide is both useful & constantly updated as python continues to add more & more "one & only one obvious way" to do things: docs.python-guide.org/en/latest/dev/virtualenvs
As of 3.6 I found it easier to get virtualenv working in comparison to pyenv on macOS (I'm a pyNoob)
I burned an entire day wasting time with pipenv. Bottom line, it’s overmarketed. Venv and virtualenv if you need py2 are the proper tools. Conda (miniconda if you don’t need the full stack) is also very good. Very good writeup: chriswarrick.com/blog/2018/07/17/…
I think the accepted answer below has some unfortunate bias against venv, which is the correct tool to use going forward for Python 3. It should really be first on the list, followed by virtualenv. docs.python.org/3/library/venv.html

Flimm

This is my personal recommendation for beginners: start by learning virtualenv and pip, tools which work with both Python 2 and 3 and in a variety of situations, and pick up other tools once you start needing them.

Now on to the answer to the question: what is the difference between these similarly named things: venv, virtualenv, etc.?

PyPI packages not in the standard library:

virtualenv is a very popular tool that creates isolated Python environments for Python libraries. If you're not familiar with this tool, I highly recommend learning it, as it is a very useful tool. It works by installing a bunch of files in a directory (eg: env/), and then modifying the PATH environment variable to prefix it with a custom bin directory (eg: env/bin/). An exact copy of the python or python3 binary is placed in this directory, but Python is programmed to look for libraries relative to its path first, in the environment directory. It's not part of Python's standard library, but is officially blessed by the PyPA (Python Packaging Authority). Once activated, you can install packages in the virtual environment using pip.
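
For instance, a minimal sketch of a typical virtualenv session might look like this (the env/ directory name and the requests package are just examples):

$ pip install virtualenv
$ virtualenv env                # create the isolated environment in ./env
$ source env/bin/activate       # prepend env/bin to PATH
(env) $ pip install requests    # installs into env/, not system-wide
(env) $ deactivate              # restore the original PATH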

pyenv is used to isolate Python versions. For example, you may want to test your code against Python 2.7, 3.6, 3.7 and 3.8, so you'll need a way to switch between them. Once activated, it prefixes the PATH environment variable with ~/.pyenv/shims, where there are special files matching the Python commands (python, pip). These are not copies of the Python-shipped commands; they are special scripts that decide on the fly which version of Python to run based on the PYENV_VERSION environment variable, or the .python-version file, or the ~/.pyenv/version file. pyenv also makes the process of downloading and installing multiple Python versions easier, using the command pyenv install.
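
A short sketch of how that typically looks on the command line (the version numbers and project directory are just examples):

$ pyenv install 3.8.6     # download and build this version under ~/.pyenv/versions/
$ pyenv install 2.7.18
$ pyenv global 3.8.6      # default version, recorded in ~/.pyenv/version
$ cd my-project
$ pyenv local 2.7.18      # writes a .python-version file for this directory
$ python --version        # the shim resolves to 2.7.18 inside this directory
Python 2.7.18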

pyenv-virtualenv is a plugin for pyenv by the same author as pyenv, to allow you to use pyenv and virtualenv at the same time conveniently. However, if you're using Python 3.3 or later, pyenv-virtualenv will try to run python -m venv if it is available, instead of virtualenv. You can use virtualenv and pyenv together without pyenv-virtualenv, if you don't want the convenience features.

virtualenvwrapper is a set of extensions to virtualenv (see docs). It gives you commands like mkvirtualenv, lssitepackages, and especially workon for switching between different virtualenv directories. This tool is especially useful if you want multiple virtualenv directories.
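
A brief sketch of those commands in use (environment names are just examples, and the path to virtualenvwrapper.sh varies by system):

$ pip install virtualenvwrapper
$ export WORKON_HOME=~/.virtualenvs            # where all environments will live
$ source /usr/local/bin/virtualenvwrapper.sh   # adjust the path for your system
$ mkvirtualenv api-project                     # create and activate a new environment
(api-project) $ lssitepackages                 # list packages installed in it
(api-project) $ workon blog-project            # switch to another environment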

pyenv-virtualenvwrapper is a plugin for pyenv by the same author as pyenv, to conveniently integrate virtualenvwrapper into pyenv.

pipenv aims to combine Pipfile, pip and virtualenv into one command on the command-line. The virtualenv directory typically gets placed in ~/.local/share/virtualenvs/XXX, with XXX being a hash of the path of the project directory. This is different from virtualenv, where the directory is typically in the current working directory. pipenv is meant to be used when developing Python applications (as opposed to libraries). There are alternatives to pipenv, such as poetry, which I won't list here since this question is only about the packages that are similarly named.
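
A minimal sketch of the pipenv workflow (the requests package and main.py are just examples):

$ pip install pipenv
$ cd my-project
$ pipenv install requests    # creates Pipfile, Pipfile.lock and a virtualenv for this project
$ pipenv --venv              # print the path of that virtualenv
$ pipenv run python main.py  # run a command inside the environment
$ pipenv shell               # or spawn a shell with it activated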

Standard library:

pyvenv (not to be confused with pyenv in the previous section) is a script shipped with Python 3.3 to 3.7. It was removed from Python 3.8 as it had problems (not to mention the confusing name). Running python3 -m venv has exactly the same effect as pyvenv.

venv is a package shipped with Python 3, which you can run using python3 -m venv (although for some reason some distros separate it out into a separate distro package, such as python3-venv on Ubuntu/Debian). It serves the same purpose as virtualenv, but only has a subset of its features (see a comparison here). virtualenv continues to be more popular than venv, especially since the former supports both Python 2 and 3.
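
A minimal sketch of the equivalent venv workflow (the env/ directory name and the requests package are just examples; the first line is only needed on distros that split the module out):

$ sudo apt install python3-venv   # Debian/Ubuntu only, if the module is missing
$ python3 -m venv env
$ source env/bin/activate
(env) $ pip install requests
(env) $ deactivate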


This is very helpful! So why are there 8 tangled things instead of 1? (“There should be one – and preferably only one – obvious way to do it.” -- The Zen of Python)
@Jerry101, the introduction of venv is in part a response to that mess. If you want to help improve the situation, I suggest you use venv and encourage others to do the same.
"the introduction of venv is in part a response to that mess" How come when there are too many things that do 'something like X', people always think they can improve that mess by making an other thing that does 'something like X'. Its kind of funny actually. We are now 4 years later... so may be pertinent to ask, did venv actually solve that problem?
The only two tools on the list that truly cover what is arguably the same territory are virtualenv and venv, so the characterization that we're dealing with a mess caused by several competing tools is not very precise. The list does, however, consist of several virtual environment-related tools, all with similar-sounding names. That can be confusing, especially to users who are just learning about them. Did venv improve the situation? It did offer a more light-weight alternative to other virtual environment tools, benefiting from native modifications and a spot in the standard library. …
Obligatory xkcd.com/927
wisbucky

I would just avoid using virtualenv on Python 3.3+ and instead use the standard library venv. To create a new virtual environment, you would type:

$ python3 -m venv <MYVENV>  

virtualenv tries to copy the Python binary into the virtual environment's bin directory. However, it does not update the library file links embedded in that binary, so if you build Python from source into a non-system directory with relative path names, the Python binary breaks. Since this is how you make a distributable copy of Python, it is a big flaw. By the way, to inspect embedded library file links on OS X, use otool. For example, from within your virtual environment, type:

$ otool -L bin/python
python:
    @executable_path/../Python (compatibility version 3.4.0, current version 3.4.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1238.0.0)

Consequently I would avoid virtualenvwrapper and pipenv. pyvenv is deprecated. pyenv seems to be used often where virtualenv is used but I would stay away from it also since I think venv also does what pyenv is built for.

venv creates virtual environments in the shell that are fresh and sandboxed, with user-installable libraries, and it's multi-python safe.

Fresh: because virtual environments only start with the standard libraries that ship with python, you have to install any other libraries all over again with pip install while the virtual environment is active.

Sandboxed: because none of these new library installs are visible outside the virtual environment, so you can delete the whole environment and start again without worrying about impacting your base python install.

User-installable libraries: because the virtual environment's target folder is created without sudo in some directory you already own, so you won't need sudo permissions to install libraries into it.

multi-python safe: because when virtual environments activate, the shell only sees the python version (3.4, 3.5 etc.) that was used to build that virtual environment.
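
To make those four properties concrete, here is a sketch of a throw-away venv session (directory and package names are just examples):

$ python3 -m venv ~/envs/scratch       # created in a directory you own, no sudo needed
$ source ~/envs/scratch/bin/activate
(scratch) $ pip install pandas         # visible only inside this environment
(scratch) $ deactivate
$ rm -rf ~/envs/scratch                # delete it and start over, base install untouched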

pyenv is similar to venv in that it lets you manage multiple python environments. However with pyenv you can't conveniently rollback library installs to some start state and you will likely need admin privileges at some point to update libraries. So I think it is also best to use venv.

In the last couple of years I have found many problems in build systems (emacs packages, python standalone application builders, installers...) that ultimately come down to issues with virtualenv. I think python will be a better platform when we eliminate this additional option and only use venv.

EDIT: Tweet of the BDFL,

I use venv (in the stdlib) and a bunch of shell aliases to quickly switch. — Guido van Rossum (@gvanrossum) October 22, 2020
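
The tweet doesn't show the aliases themselves; as a rough illustration only, they could be something like this in ~/.bashrc (names and paths are hypothetical):

alias work='source ~/venvs/work/bin/activate'    # jump into the "work" environment
alias blog='source ~/venvs/blog/bin/activate'    # jump into the "blog" environment
alias off='deactivate'                           # leave whichever one is active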


Great answer @RiazRizvi, and it provides many insights in parallel to the accepted answer. However, I would argue that pyenv still has its place under the sun despite venv gaining traction for virtual environments. The classic reason I can think of for still using pyenv in my workflows right now is that the highest Python runtime AWS Lambda supports is 3.8, and with Python 3.9 being out, I want other non-Lambda projects to be 3.9-based. So I still need pyenv to switch between versions. Using pyenv-virtualenv allows users to use both pyenv and venv (not `virtualenv`) together.
what's wrong with virtualenvwrapper?
@riaz rizvi Multi-python safe: how do you create virtual environments for different python versions? I thought it always defaults to the (system-wide installed) python version that was used to create the venv.
somuchtolearnandshare - make an explicit call to the python you want to use: $ path/to/python3x -m venv <MYVENVx> or $ path/to/python3y -m venv <MYVENVy>. Then when you activate the environment, you will activate the python that was used to create it.
@Edison , I use conda directly most of the time, unless I am being lazy (then I might use Anaconda Navigator). If I have damaged a conda environment inadvertently by using pip interleaved with conda installation commands, then I will use conda to rollback to an earlier conda environment revision (see conda list --revisions) or use conda to remove the environment after exporting the environment.yaml file. I use Spyder, JupyterLab, VSCode, and PyCharm (in that order depending on the simplicity of what I am working on). Simpler is better. VSCode + plugins is a good full function IDE.
F1Linux

UPDATE 20200825:

Added below "Conclusion" paragraph

I went down the pipenv rabbit hole (it's a deep and dark hole indeed...) and, since the last answer is over 2 years old, felt it was useful to update the discussion with the latest developments I've found on the Python virtual environments topic.

DISCLAIMER:

This answer is NOT about continuing the raging debate about the merits of pipenv versus venv as virtual environment solutions - I make no endorsement of either. It's about PyPA endorsing conflicting standards and how future development of virtualenv promises to negate having to make an either/or choice between them at all. I focused on these two tools precisely because they are the ones anointed by PyPA.

venv

As the OP notes, venv is a tool for virtualizing environments - NOT a third-party solution, but a native tool. PyPA endorses venv for creating VIRTUAL ENVIRONMENTS: "Changed in version 3.5: The use of venv is now recommended for creating virtual environments".

pipenv

pipenv - like venv - can be used to create virtual environments, but it additionally rolls in package management and vulnerability-checking functionality. Instead of using requirements.txt, pipenv delivers package management via a Pipfile. As PyPA endorses pipenv for PACKAGE MANAGEMENT, that would seem to imply Pipfile is meant to supplant requirements.txt.

HOWEVER: pipenv uses virtualenv as its tool for creating virtual environments, NOT venv, which is endorsed by PyPA as the go-to tool for creating virtual environments.

Conflicting Standards:

So as if settling on a virtual environment solution wasn't difficult enough, we now have PyPA endorsing two different tools which use different virtual environment solutions. The raging GitHub debate on venv vs virtualenv which highlights this conflict can be found here.

Conflict Resolution:

The GitHub debate referenced in the above link has steered virtualenv development in the direction of accommodating venv in future releases:

prefer built-in venv: if the target python has venv we'll create the environment using that (and then perform subsequent operations on that to facilitate other guarantees we offer)

Conclusion:

So it looks like there will be some future convergence between the two rival virtual environment solutions, but as of now pipenv - which uses virtualenv - varies materially from venv.

Given the problems pipenv solves and the fact that PyPA has given its blessing, it appears to have a bright future. And if virtualenv delivers on its proposed development objectives, choosing a virtual envelope solution should no longer be a case of either pipenv OR venv.

Update 20200825:

An oft-repeated criticism of Pipenv I saw when producing this analysis was that it was not actively maintained. Indeed, what's the point of using a solution whose future seems questionable due to a lack of continuous development? After a dry spell of about 18 months, Pipenv is once again being actively developed. Indeed, large and material updates have since been released.


And what about pyenv? This is a good answer, because it looks at future directions, but it's not clear how it interacts with pyenv or conda or other environment managers.
Avoid ALL pip virtual environments whenever possible. Use conda instead. Conda provides a unified approach. It is maintained by teams of professional open source developers and has a reputable company providing funding and a commercially supported version. The teams that maintain pip, venv, virtualenv, pipenv, and many other pip variants have limited resources by comparison. The pip virtual environment plurality is frustrating for beginners. Use pip virtual environments and their (too) many variants as a last resort when conda packages do not exist.
@naught101 pyenv is not an alternative to virtualenv. And neither of those things are alternatives to pipenv. They do different things. Just like Django, Python and PostgreSQL are different things.
@Flimm: different how?
@naught101 See the other answers for this question post (including my own).
Lie Ryan

Let's start with the problems these tools want to solve:

My system package manager doesn't have the Python versions I want, or I want to install multiple Python versions side by side: Python 3.9.0 and Python 3.9.1, Python 3.5.3, etc.

Then use pyenv.

I want to install and run multiple applications with different, conflicting dependencies.

Then use virtualenv or venv. These are almost completely interchangeable, the difference being that virtualenv supports older python versions and has a few more minor unique features, while venv is in the standard library.

I'm developing an application and need to manage my dependencies, as well as the dependency resolution of my project's dependencies.

Then use pipenv or poetry.

I'm developing a library or a package and want to specify the dependencies that my library users need to install

Then use setuptools.

I used virtualenv, but I don't like virtualenv folders being scattered around various project folders. I want a centralised management of the environments and some simple project management

Then use virtualenvwrapper. Variant: pyenv-virtualenvwrapper if you also use pyenv.

Not recommended

pyvenv. This is deprecated, use venv or virtualenv instead. Not to be confused with pipenv or pyenv.


What about Conda? Would you recommend against it entirely? And what information would you use to decide between pipenv and poetry?
pipenv/poetry use a two-file workflow for managing dependencies. The first file specifies the logical dependencies, and the second file is a dependency lock file that's automatically generated by pipenv/poetry. requirements.txt is kind of a mix of the two files, which is simpler, but not having a separate lock file makes it less flexible and harder to maintain the dependency list.
@soMuchToLearnAndShare pipenv is built on top of virtualenv/venv, so you always use them together. Pipenv adds a number of higher-level features on top of virtualenv, namely dependency management. Virtualenv doesn't manage dependencies; all it does is provide an isolated environment to install dependencies into.
@soMuchToLearnAndShare venv is available in the standard library, and that's a major benefit over virtualenv. I don't want to put words in PyPA's mouth, but virtualenv does have a couple of extra features that venv doesn't, and it works across a larger range of Python versions. If you need the additional features that virtualenv provides over venv, then you obviously should use virtualenv. If you're satisfied with your current setup with venv, then there's no reason to choose virtualenv.
@soMuchToLearnAndShare but there's no reason to avoid virtualenv either, if you don't mind the additional install. If you want to use pipenv, it only supports virtualenv. There's no reason to avoid pipenv just because it uses virtualenv, especially as using pipenv already means that you need an additional install anyway. At the end of the day, the environment directories created by virtualenv and venv are nearly identical, so your choice of virtual environment tool mostly only matters when creating the environment and not so much when using it.
ArnuldOnData

Jan 2020 Update

@Flimm has explained all the differences very well. Generally, we want to know the difference between all tools because we want to decide what's best for us. So, the next question would be: which one to use? I suggest you choose one of the two official ways to manage virtual environments:

Python Packaging now recommends Pipenv

Python.org now recommends venv


Note that pipenv and venv aren't alternatives to each other, just like Django and Python aren't alternatives to each other. With venv alone, you can't install packages, for instance, whereas pipenv does offer a mechanism to install packages.
I did not get you when you said that with venv you can't install packages. I mean, I can install everything available through pip in a virtual environment created using venv. E.g., I have 4 different virtual environments in 4 different directories with different python and pandas versions but the same jupyter lab version, all through venv.
azec-pdx

pyenv - manages different python versions,

all others - create virtual environments (which have an isolated python version and installed "requirements"),

pipenv wants to combine all of the above; in addition to the previous, it installs "requirements" (into the active virtual environment, or creates its own if none is active)

So maybe you will be happy with pipenv only.

But I use: pyenv + pyenv-virtualenvwrapper + pipenv (pipenv for installing requirements only).

In Debian:

apt install libffi-dev

Install pyenv based on https://www.tecmint.com/pyenv-install-and-manage-multiple-python-versions-in-linux/, but instead of pyenv-virtualenv, install pyenv-virtualenvwrapper (which can be a standalone library or a pyenv plugin; here the 2nd option):

$ pyenv install 3.9.0
$ git clone https://github.com/pyenv/pyenv-virtualenvwrapper.git $(pyenv root)/plugins/pyenv-virtualenvwrapper
# inside ~/.bashrc add:
# export VIRTUALENVWRAPPER_PYTHON="/usr/bin/python3"
$ source ~/.bashrc
$ pyenv virtualenvwrapper

Then create virtual environments for your projects (workingdir must exist):

pyenv local 3.9.0  # to prevent 'interpreter not found' in mkvirtualenv
python -m pip install --upgrade pip setuptools wheel
mkvirtualenv <venvname> -p python3.9 -a <workingdir>

and switch between projects:

workon <venvname>
python -m pip install --upgrade pip setuptools wheel pipenv

Inside a project I have the file requirements.txt, without pinning the versions inside (unless some version limitation is necessary). You have 2 possible tools to install them into the current virtual environment: pip-tools or pipenv. Let's say you will use pipenv:

pipenv install -r requirements.txt

This will create the Pipfile and Pipfile.lock files; the pinned versions are in the 2nd one. If you want to reinstall exactly the same versions somewhere else, then (Pipfile.lock must be present):

pipenv install

Remember that Pipfile.lock is tied to a specific Python version and needs to be recreated if you use a different one.

As you can see, I write requirements.txt. This has a problem: you must remove a removed package from the Pipfile too. So writing the Pipfile directly is probably better.

So you can see I use pipenv only minimally. Maybe if you use it fully, it can replace everything?

EDIT 2021.01: I have changed my stack to pyenv + pyenv-virtualenvwrapper + poetry. I.e. I use no apt or pip installation of virtualenv or virtualenvwrapper, and instead install pyenv's plugin pyenv-virtualenvwrapper. This is the easier way.

Poetry is great for me:

poetry add <package>   # install single package
poetry remove <package>
poetry install   # if you remove poetry.lock poetry will re-calculate versions
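
One possible way to have poetry use a specific interpreter installed via pyenv (a sketch, assuming that version has already been installed with pyenv):

pyenv local 3.9.0                      # select the interpreter for this project directory
poetry env use $(pyenv which python)   # have poetry build its virtual environment with it
poetry install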

Can you please elaborate on your current stack, I mean pyenv + pyenv-virtualenvwrapper + poetry, especially how you instruct poetry to use a specific version installed via pyenv, and whether you are disabling virtual environment creation in poetry?
Sounds like a mess to understand and keep track of! Couldn't you eliminate all the rigamarole with the conda package, using its environment and package manager features? conda is a single tool that does the work of all three or four packages mentioned in your answer.
Rich Lysakowski PhD

As a Python newcomer this question frustrated me endlessly and confused me for months. Which virtual environment and package manager(s) should I invest in learning when I know that I will be using it for years to come?

The best article answering this vexing question is https://jakevdp.github.io/blog/2016/08/25/conda-myths-and-misconceptions/ by Jake Vanderplas. Although a few years old, it provides practical answers and the history of Python package and virtual environment managers from the trenches as this state of the art was developing.

It was particularly frustrating for me in the data science and "big data cloud computing" communities, because conda is widely used as a virtual environment manager and full function package manager for Python and JavaScript, SQL, Java, HTML5, and Jupyter Notebooks.

So why use pip at all, when conda does everything that pip and venv variants do?

The answer is, "because you MUST use pip if a conda package is simply not available." Many times a required package is only available in pip format and there is no easy solution but to use pip. You can learn to use conda build but if you are not the package maintainer, then you must convince the package owner to generate a conda package for each new release (or do it yourself.)

These pip-based packages differ along many important and practical dimensions:

stability

maturity

complexity

active support (versus dying or dead)

levels of adoption near the Python ecosystem "core" versus "on the fringes" (i.e., integrated into Python.org distro)

easy to figure out and use (for beginners)

I will answer your question for two packages from the dimensions of package maturity and stability.

venv and virtualenv have the most maturity, stability, and community support. From the online documentation you can see that virtualenv is at version 20.x as of today. From the virtualenv documentation:

virtualenv is a tool to create isolated Python environments. Since Python 3.3, a subset of it has been integrated into the standard library under the venv module. The venv module does not offer all features of this library, to name just a few more prominent: is slower (by not having the app-data seed method), is not as extendable, cannot create virtual environments for arbitrarily installed python versions (and automatically discover these), is not upgrade-able via pip, does not have as rich programmatic API (describe virtual environments without creating them).

virtualenvwrapper is a set of scripts to help people use virtualenv (it is a "wrapper") that is not well maintained; its last update was in 2019 (see the virtualenvwrapper documentation).

My recommendation is to avoid ALL pip virtual environments whenever possible. Use conda instead. Conda provides a unified approach. It is maintained by teams of professional open source developers and has a reputable company providing funding and a commercially supported version. The teams that maintain pip, venv, virtualenv, pipenv, and many other pip variants have limited resources by comparison. The pip virtual environment plurality is frustrating for beginners. The complexity, fragmentation, fringe and unsupported packages, and wildly inconsistent support of the pip-based virtual environment tools drove me to use conda. For data science work, my recommendation is to use a pip-based virtual environment manager only as a last resort, when conda packages do not exist.

The differences between the venv variants still scare me because my time to learn new packages is limited. pipenv, venv, pyvenv, pyenv, virtualenv, virtualenvwrapper, poetry, and others have dozens of differences and complexities that take days to understand. I hate going down a path and finding that support for a package goes belly-up when a maintainer resigns (or gets too busy to maintain it). I just need to get my job done.

In the spirit of being helpful, here are a few links to help you dive in over your head, but not get lost in Dante's Inferno (re: pip).

A Guide to Python’s Virtual Environments

Choosing "core" Python packages to invest in for your career (long-term), versus getting a job done short term) is important. However, it is a business analysis question. Are you trying to simply get a task done, or a professional software engineer who builds scalable performant systems that require the least amount of maintenance effort over time? IMHO, conda will take you to the latter place more easily than dealing with pip-plurality problems. conda is still missing 1-step pip-package migration tools that make this a moot question. If we could simply convert pip packages into conda packages then pypi.org and conda-forge could be merged. Pip is necessary because conda packages are not (yet) universal. Many Python programmers are either too lazy to create conda packages, or they only program in Python and don't need conda's language-agnostic / multi-lingual support.

conda has been a god-send for me, because it supports cloud software engineering and data science's need for multilingual support of JavaScript, SQL, and Jupyter Notebook extensions, and conda plays well within Docker and other cloud-native environments. I encourage you to learn and master conda, which will enable you to side-step many complex questions that pip-based tools may never answer.

Keep it simple! I need one package that does 90% of what I need and guidance and workarounds for the 10% remaining edge cases.

Check out the articles linked herein to learn more about pip-based virtual environments.

I hope this is helpful to the original poster and gives pip and conda aficionados some things to think about.


Quote: "Pip is necessary because conda packages are not (yet) universal. Many Python programmers are either too lazy to create conda packages, or they only program in Python and don't need conda's language-agnostic / multi-lingual support." If so, then isn't this a strong hint not to use conda? Or, if conda wants to be universal, then it should reach that point soon enough. So despite the many pip/virtualenv flavors, maybe it's better to pick a winner and cancel all the rest than to pick conda... (is virtualenv[wrapper] already the winner?)
My answer is opinionated in favor of simplicity, i.e., using ONE tool for virtual environment, dependency, and package management for Python AND other languages. The conda system lacks just one function/module to make this entire cloudy confusion of alternatives disappear and become moot, a module to convert any pip-only format packages into conda packages reliably. conda is singly better supported than the fragmented cast of characters that include pipenv, virtualenv, venv, pyenv, poetry, and others. Someone will get around to writing a functional converter soon.
I just found a package last week called "pip2conda". When I get around to testing it, I will let you know if it fulfills the promise of its name.
The motivation for conda is to have a single, unified package AND environment manager. Reduce complexity, uncomplicate life for Pythonista who are also polyglots, "There should be one-- and preferably only one --obvious way to do it." The Zen of Python, by Tim Peters ... Simple is better than complex. ... There should be one-- and preferably only one --obvious way to do it. ... If the implementation is hard to explain, it's a bad idea. If the implementation is easy to explain, it may be a good idea. ... Conda is one honking great idea -- let's do more of those!