If you have been writing Python for any length of time, you have almost certainly run into the moment where installing a package for one project breaks another. Maybe you upgraded requests for Project A, and suddenly Project B throws import errors because it depends on an older version. Or worse, you installed something system-wide with sudo pip install and corrupted your operating system’s Python environment. These are not edge cases — they are inevitable consequences of working without virtual environments.
Virtual environments solve this problem by giving each project its own isolated Python installation with its own set of packages. Combined with pip, Python’s package manager, they form the foundation of every professional Python workflow. Whether you are building a Flask API, training a machine learning model, or writing automation scripts, understanding virtual environments and pip is non-negotiable. This tutorial covers everything from the basics to advanced tooling that senior engineers use daily in production.
To appreciate what virtual environments give you, consider what happens without them. Every Python installation has a single site-packages directory where third-party packages get installed. When you run pip install flask without a virtual environment, Flask and all its dependencies land in that global site-packages folder. Every Python script on your system now sees that version of Flask.
Here is where things go wrong:
Dependency conflicts. Project A requires SQLAlchemy==1.4 and Project B requires SQLAlchemy==2.0. Since there is only one site-packages, you cannot have both versions installed simultaneously. Installing one overwrites the other, and one of your projects breaks.
System Python pollution. On macOS and most Linux distributions, the operating system ships with a Python installation that system tools depend on. Installing packages into system Python with pip install (especially with sudo) can overwrite libraries that your OS needs. I have seen developers render their terminal unusable by upgrading six or urllib3 system-wide.
Reproducibility failures. Without an isolated environment, you have no reliable way to know which packages your project actually needs versus what happens to be installed on your machine. When your teammate clones the repo and runs it, it fails with mysterious import errors because they do not have the same random collection of packages you accumulated over months.
Version ambiguity. Running python on different machines might invoke Python 2.7, 3.8, or 3.12. Without explicit environment management, you are guessing which interpreter and which package versions your code will encounter in production.
```bash
# This is what chaos looks like
sudo pip install flask   # Installs into system Python
pip install django==3.2  # Might conflict with existing packages
pip install requests     # Which project needs this? All of them? Some?
pip list                 # 200+ packages, no idea which project uses what
```
Virtual environments eliminate every one of these problems.
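Before reaching for any tooling, you can ask the interpreter itself where it lives and where it installs packages. This short diagnostic uses only the standard library; run it on two machines, or inside and outside a venv, and the differing output is exactly the ambiguity described above:

```python
import sys
import sysconfig

# Which interpreter is actually running, and where would pip install to?
print("Interpreter:", sys.executable)
print("Version:", sys.version.split()[0])
print("site-packages:", sysconfig.get_path("purelib"))
```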
Python 3.3+ includes the venv module in the standard library, so you do not need to install anything extra. This is the recommended way to create virtual environments.
```bash
# Navigate to your project directory
cd ~/projects/my-flask-app

# Create a virtual environment
python3 -m venv venv
```
This creates a venv directory inside your project containing a copy of the Python interpreter, the pip package manager, and an empty site-packages directory. The directory structure looks like this:
```
venv/
├── bin/                 # Scripts (activate, pip, python) — Linux/macOS
│   ├── activate         # Bash/Zsh activation script
│   ├── activate.csh     # C shell activation
│   ├── activate.fish    # Fish shell activation
│   ├── pip
│   ├── pip3
│   ├── python -> python3
│   └── python3 -> /usr/bin/python3
├── include/             # C headers for compiling extensions
├── lib/                 # Installed packages go here
│   └── python3.12/
│       └── site-packages/
├── lib64 -> lib         # Symlink on some systems
└── pyvenv.cfg           # Configuration file
```
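The pyvenv.cfg file at the root is what marks the directory as a virtual environment; among other things it records where the base interpreter lives. Its key = value format is simple enough to parse by hand, as this sketch shows (the helper name is mine, not a standard API):

```python
from pathlib import Path

def read_pyvenv_cfg(venv_dir: str) -> dict[str, str]:
    """Parse a venv's pyvenv.cfg into a dict of its key = value pairs."""
    cfg = {}
    for line in Path(venv_dir, "pyvenv.cfg").read_text().splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            cfg[key.strip()] = value.strip()
    return cfg

# A freshly created venv always records its base interpreter's location,
# e.g. cfg["home"] might be "/usr/bin"
```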
The most common names for virtual environment directories are venv, .venv, and env. I recommend venv or .venv because they are immediately recognizable, and every .gitignore template for Python already includes them. The dot prefix in .venv hides it from normal directory listings, which some developers prefer.
```bash
# All of these are common and acceptable
python3 -m venv venv
python3 -m venv .venv
python3 -m venv env

# You can also name it after the project, though this is less common
python3 -m venv myproject-env
```
Always create the virtual environment inside your project’s root directory. This keeps everything self-contained and makes it obvious which environment belongs to which project. Some developers prefer to store all virtual environments in a central location like ~/.virtualenvs/, but this adds complexity without much benefit unless you are using virtualenvwrapper.
If you have multiple Python versions installed, you can specify which one to use:
```bash
# Use a specific Python version
python3.11 -m venv venv
python3.12 -m venv venv

# On Windows
py -3.11 -m venv venv
```
In rare cases, such as Docker containers where you want a minimal environment, you can create a virtual environment without pip:
```bash
# Create without pip (smaller, faster)
python3 -m venv --without-pip venv
```
Creating a virtual environment does not automatically use it. You must activate it first, which modifies your shell’s PATH so that python and pip commands point to the virtual environment’s binaries instead of the system ones.
```bash
# macOS / Linux (Bash or Zsh)
source venv/bin/activate

# macOS / Linux (Fish shell)
source venv/bin/activate.fish

# macOS / Linux (Csh / Tcsh)
source venv/bin/activate.csh

# Windows (Command Prompt)
venv\Scripts\activate.bat

# Windows (PowerShell)
venv\Scripts\Activate.ps1
```
When a virtual environment is active, your shell prompt changes to show the environment name in parentheses:
```
# Before activation
$ whoami
folau

# After activation
(venv) $ whoami
folau

# Verify Python is using the venv
(venv) $ which python
/home/folau/projects/my-flask-app/venv/bin/python
(venv) $ which pip
/home/folau/projects/my-flask-app/venv/bin/pip
```
Activation is simpler than it sounds. It prepends the virtual environment’s bin/ (or Scripts/ on Windows) directory to your PATH environment variable. That is it. When you type python, your shell finds the venv’s Python before the system Python because it appears earlier in PATH.
```
# Before activation
$ echo $PATH
/usr/local/bin:/usr/bin:/bin

# After activation
(venv) $ echo $PATH
/home/folau/projects/my-flask-app/venv/bin:/usr/local/bin:/usr/bin:/bin
```
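You can also confirm activation from inside Python itself. Since Python 3.3, a venv's interpreter reports two prefixes: sys.prefix points at the environment, while sys.base_prefix still points at the interpreter the venv was created from. A small check built on that fact:

```python
import sys

def in_virtualenv() -> bool:
    """Return True when running inside a virtual environment.

    Inside a venv, sys.prefix points at the environment directory while
    sys.base_prefix points at the base interpreter; outside a venv the
    two are identical.
    """
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```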
When you are done working on a project, deactivate the environment to return to your system Python:
```
# Works on all platforms
(venv) $ deactivate
$
```
You do not strictly need to activate a virtual environment. You can call the venv’s Python or pip directly by using the full path:
```bash
# Run Python from the venv without activating
./venv/bin/python my_script.py

# Install a package without activating
./venv/bin/pip install requests
```
This is particularly useful in shell scripts, cron jobs, and CI/CD pipelines where activating is unnecessary overhead.
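The same no-activation idea works from Python too. As a sketch, the helper below (my own function, not a standard API) creates a throwaway environment and runs code with its interpreter by full path, exactly the way a cron job or CI step would:

```python
import subprocess
import sys
import tempfile
import venv
from pathlib import Path

def run_in_venv(code: str) -> str:
    """Create a temporary venv and run `code` with its interpreter,
    calling the binary by path instead of activating the environment."""
    with tempfile.TemporaryDirectory() as tmp:
        env_dir = Path(tmp) / "venv"
        venv.create(env_dir, with_pip=False)  # skip pip: faster to create
        bin_dir = "Scripts" if sys.platform == "win32" else "bin"
        python = env_dir / bin_dir / "python"
        result = subprocess.run(
            [str(python), "-c", code],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

# The spawned interpreter really is the venv's own:
print(run_in_venv("import sys; print(sys.prefix != sys.base_prefix)"))  # True
```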
pip is the standard package manager for Python. It downloads and installs packages from the Python Package Index (PyPI), which hosts over 500,000 packages. When you work inside a virtual environment, pip installs packages only into that environment’s site-packages, keeping everything isolated.
```bash
# Install the latest version
pip install requests

# Install a specific version
pip install requests==2.31.0

# Install a minimum version
pip install "requests>=2.28.0"

# Install a version range
pip install "requests>=2.28.0,<3.0.0"

# Install multiple packages at once
pip install flask sqlalchemy redis

# Install with extras (optional dependencies)
pip install "fastapi[all]"
pip install "celery[redis]"
```
```bash
# Upgrade to the latest version
pip install --upgrade requests
pip install -U requests  # Short form

# Upgrade pip itself
pip install --upgrade pip
```
```bash
# Uninstall a package
pip uninstall requests

# Uninstall without confirmation prompt
pip uninstall -y requests

# Uninstall multiple packages
pip uninstall flask sqlalchemy redis
```
Note that pip uninstall only removes the specified package. It does not remove that package's dependencies, even if nothing else needs them. This can leave orphaned packages in your environment.
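You can inspect the dependency graph yourself with the standard library's importlib.metadata. The sketch below (my own helper, not a pip feature) lists the packages that nothing else in the environment requires, which are your likely direct installs; everything else arrived as a dependency and is a candidate orphan after uninstalls:

```python
import re
from importlib import metadata

def top_level_packages() -> set[str]:
    """Names of installed packages that no other installed package
    requires. Everything not in this set was pulled in as a dependency
    and may be left orphaned by `pip uninstall`."""
    installed = {
        dist.metadata["Name"].lower(): dist
        for dist in metadata.distributions()
    }
    required: set[str] = set()
    for dist in installed.values():
        for req in dist.requires or []:
            # Keep only the project name, dropping any version specifier,
            # extras, or environment marker (e.g. "idna<4,>=2.5 ; ...").
            name = re.split(r"[\s;<>=!~\[(]", req, maxsplit=1)[0]
            required.add(name.lower())
    return set(installed) - required
```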
```bash
# List all installed packages
pip list

# List outdated packages
pip list --outdated

# Show detailed info about a specific package
pip show requests
```
The output of pip show is useful for debugging dependency issues:
```
(venv) $ pip show requests
Name: requests
Version: 2.31.0
Summary: Python HTTP for Humans.
Home-page: https://requests.readthedocs.io
Author: Kenneth Reitz
License: Apache 2.0
Location: /home/folau/projects/my-app/venv/lib/python3.12/site-packages
Requires: certifi, charset-normalizer, idna, urllib3
Required-by: httpx, some-other-package
```
The pip freeze command outputs every installed package and its exact version in a format that can be fed back into pip. This is how you capture your project's dependencies:
```bash
# Output all installed packages with versions
pip freeze

# Save to a requirements file
pip freeze > requirements.txt
```
The output looks like this:
```
certifi==2024.2.2
charset-normalizer==3.3.2
flask==3.0.2
idna==3.6
jinja2==3.1.3
markupsafe==2.1.5
requests==2.31.0
urllib3==2.2.1
werkzeug==3.0.1
```
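Because the format is so regular, it is easy to work with programmatically. Here is a minimal parser for pinned lines, a sketch that deliberately ignores anything other than == pins:

```python
def parse_freeze(text: str) -> dict[str, str]:
    """Parse `pip freeze`-style output into {package: version}.

    Only exact `name==version` pins are kept; blank lines, comments,
    and other specifier styles are skipped.
    """
    pins: dict[str, str] = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if "==" in line:
            name, _, version = line.partition("==")
            pins[name.strip().lower()] = version.strip()
    return pins

freeze_output = "flask==3.0.2\nrequests==2.31.0\n# a comment\n"
print(parse_freeze(freeze_output))  # {'flask': '3.0.2', 'requests': '2.31.0'}
```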
```bash
# Install all packages from requirements.txt
pip install -r requirements.txt

# Install from multiple requirement files
pip install -r requirements.txt -r requirements-dev.txt
```
The requirements.txt file is the traditional way to declare Python project dependencies. It is a plain text file where each line specifies a package and optionally a version constraint.
```
# Pinned versions (recommended for applications)
flask==3.0.2
requests==2.31.0
sqlalchemy==2.0.27

# Minimum version
requests>=2.28.0

# Version range
requests>=2.28.0,<3.0.0

# Compatible release (>=2.31.0, <2.32.0)
requests~=2.31.0

# Any version (avoid this)
requests

# Comments
# This is a comment
flask==3.0.2  # Web framework

# Include another requirements file
-r requirements-base.txt
```
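The compatible-release operator trips people up, so here is its rule made concrete. This toy checker handles only simple three-part versions; real code should use the packaging library, which pip itself relies on:

```python
def satisfies_compatible(version: str, floor: str) -> bool:
    """Toy illustration of the `~=` (compatible release) rule for X.Y.Z
    floors: ~=X.Y.Z accepts anything >= X.Y.Z that keeps the same X.Y.
    Handles only plain dotted-integer versions."""
    v = tuple(int(part) for part in version.split("."))
    f = tuple(int(part) for part in floor.split("."))
    return v >= f and v[:2] == f[:2]

# requests~=2.31.0 behaves like >=2.31.0,<2.32.0:
print(satisfies_compatible("2.31.5", "2.31.0"))  # True
print(satisfies_compatible("2.32.0", "2.31.0"))  # False
```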
A common pattern is to maintain separate requirement files for production and development:
```
# requirements.txt (production)
flask==3.0.2
gunicorn==21.2.0
psycopg2-binary==2.9.9
redis==5.0.1
```

```
# requirements-dev.txt (development)
-r requirements.txt
pytest==8.0.2
pytest-cov==4.1.0
black==24.2.0
flake8==7.0.0
mypy==1.8.0
ipdb==0.13.13
```
Notice how requirements-dev.txt includes requirements.txt with the -r flag. This means installing dev dependencies automatically installs production dependencies as well, avoiding duplication.
For applications (web apps, APIs, services), always pin exact versions with ==. This guarantees that every environment — your laptop, your teammate's laptop, staging, production — runs identical code. Unpinned or loosely pinned dependencies are one of the most common sources of “works on my machine” bugs.
For libraries (packages you publish for others to install), use flexible version constraints like >= or ~=. If your library pins exact versions, it creates conflicts when users install it alongside other packages that need different versions of the same dependency.
Raw pip freeze has a significant limitation: it dumps every installed package, including transitive dependencies (dependencies of your dependencies). This makes it hard to tell which packages you actually chose to install versus which ones came along for the ride. pip-tools solves this elegantly.
```bash
pip install pip-tools
```
With pip-tools, you maintain a requirements.in file that lists only your direct dependencies. Then pip-compile resolves all transitive dependencies and writes a fully pinned requirements.txt.
```
# requirements.in (what YOU want)
flask
requests
sqlalchemy
celery[redis]
```
```bash
# Generate the pinned requirements.txt
pip-compile requirements.in
```
The generated requirements.txt includes comments showing where each dependency came from (add the --generate-hashes flag if you also want package hashes for verified installs):
```
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
#    pip-compile requirements.in
#
certifi==2024.2.2
    # via requests
charset-normalizer==3.3.2
    # via requests
flask==3.0.2
    # via -r requirements.in
idna==3.6
    # via requests
jinja2==3.1.3
    # via flask
requests==2.31.0
    # via -r requirements.in
sqlalchemy==2.0.27
    # via -r requirements.in
```
pip-sync goes a step further: it installs exactly the packages in requirements.txt and removes anything else. This ensures your environment matches the lock file precisely.
```bash
# Sync your environment to match requirements.txt exactly
pip-sync requirements.txt

# Sync with multiple requirement files
pip-sync requirements.txt requirements-dev.txt
```
```bash
# Upgrade all packages
pip-compile --upgrade requirements.in

# Upgrade a specific package
pip-compile --upgrade-package requests requirements.in

# Then sync your environment
pip-sync requirements.txt
```
The Python ecosystem has several tools beyond venv and pip for environment and dependency management. Here is when to reach for each one.
Pipenv combines virtual environment management and dependency resolution into a single tool. It uses a Pipfile instead of requirements.txt and generates a Pipfile.lock for deterministic builds.
```bash
# Install pipenv
pip install pipenv

# Create environment and install a package
pipenv install flask

# Install dev dependency
pipenv install --dev pytest

# Activate the shell
pipenv shell

# Run a command without activating
pipenv run python app.py
```
Pipenv was once the officially recommended tool, but its development stalled for years. It has since resumed active development, but many teams have moved to other tools. Use it if your team already uses it or if you want a simple all-in-one solution.
Poetry is the most popular modern alternative. It handles dependency management, virtual environments, building, and publishing — all through a pyproject.toml file.
```bash
# Install Poetry
curl -sSL https://install.python-poetry.org | python3 -

# Create a new project
poetry new my-project

# Add dependencies
poetry add flask
poetry add --group dev pytest

# Install dependencies
poetry install

# Run commands in the environment
poetry run python app.py
poetry shell
```
Poetry is excellent for projects that are both applications and libraries. Its dependency resolver is more sophisticated than pip's, and pyproject.toml is cleaner than requirements.txt. Use Poetry for greenfield projects where you want a modern, complete toolchain.
Conda is a cross-language package manager popular in data science. Unlike pip, it can install non-Python dependencies (C libraries, R packages, system tools), which is critical for scientific computing packages like NumPy, SciPy, and TensorFlow that depend on compiled native code.
```bash
# Create a conda environment
conda create -n myenv python=3.12

# Activate
conda activate myenv

# Install packages
conda install numpy pandas scikit-learn

# Export environment
conda env export > environment.yml

# Recreate from file
conda env create -f environment.yml
```
Use conda if you are doing data science or machine learning work, especially if you need packages with complex native dependencies. For web development and general-purpose Python, stick with venv + pip or Poetry.
pyproject.toml is the modern standard for Python project configuration, defined in PEP 518 and PEP 621. It replaces setup.py, setup.cfg, and even requirements.txt as the single source of truth for project metadata and dependencies.
```toml
# pyproject.toml

[build-system]
requires = ["setuptools>=68.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my-flask-app"
version = "1.0.0"
description = "A production Flask application"
requires-python = ">=3.10"
authors = [
    {name = "Folau Kaveinga", email = "folau@example.com"}
]
dependencies = [
    "flask>=3.0,<4.0",
    "sqlalchemy>=2.0",
    "requests>=2.28",
    "gunicorn>=21.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0",
    "black>=24.0",
    "mypy>=1.8",
    "ruff>=0.2",
]

[tool.black]
line-length = 88
target-version = ["py312"]

[tool.ruff]
line-length = 88

[tool.ruff.lint]
select = ["E", "F", "I"]

[tool.mypy]
python_version = "3.12"
strict = true

[tool.pytest.ini_options]
testpaths = ["tests"]
addopts = "-v --tb=short"
```
The advantage of pyproject.toml is consolidation. Your project metadata, dependencies, and tool configuration all live in one file instead of being scattered across setup.py, requirements.txt, mypy.ini, pytest.ini, .flake8, and more.
```bash
# Install the project in development mode
pip install -e .

# Install with dev dependencies
pip install -e ".[dev]"

# Build the project (requires: pip install build)
python -m build
```
Virtual environments isolate packages, but they do not solve the problem of needing different Python versions for different projects. pyenv fills that gap by letting you install and switch between multiple Python versions seamlessly.
```bash
# macOS (via Homebrew)
brew install pyenv

# Linux
curl https://pyenv.run | bash

# Add to your shell profile (~/.bashrc or ~/.zshrc)
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
```
```bash
# List available Python versions
pyenv install --list | grep "^  3"

# Install specific versions
pyenv install 3.11.8
pyenv install 3.12.2

# Set global default
pyenv global 3.12.2

# Set version for a specific project directory
cd ~/projects/legacy-app
pyenv local 3.11.8  # Creates .python-version file

# Now create a venv with the correct version
python -m venv venv  # Uses 3.11.8 because of .python-version
```
The combination of pyenv (for Python version management) and venv (for package isolation) gives you complete control over your Python environments.
Most modern IDEs detect and integrate with virtual environments automatically, providing code completion, linting, and debugging support based on the packages installed in your venv.
VS Code's Python extension automatically detects virtual environments in your workspace. To configure it:
Open the Command Palette (`Cmd+Shift+P` on macOS, `Ctrl+Shift+P` on Windows/Linux), run "Python: Select Interpreter", and choose the interpreter at `venv/bin/python`. You can also set it in `.vscode/settings.json`:
```json
{
  "python.defaultInterpreterPath": "${workspaceFolder}/venv/bin/python",
  "python.terminal.activateEnvironment": true
}
```
When python.terminal.activateEnvironment is true, VS Code automatically activates the virtual environment whenever you open a new terminal.
PyCharm has first-class virtual environment support:
In Settings → Project → Python Interpreter, add an interpreter from an existing environment and point it at `venv/bin/python`. PyCharm can also create virtual environments for you when starting a new project. It detects requirements.txt files and offers to install dependencies automatically.
A common question is whether you need virtual environments inside Docker containers. After all, each container is already an isolated environment. The answer is nuanced.
If your Docker container runs a single Python application and nothing else, a virtual environment adds no practical benefit. The container itself provides the isolation:
```dockerfile
# Dockerfile without venv (acceptable for simple apps)
FROM python:3.12-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["gunicorn", "app:app", "--bind", "0.0.0.0:8000"]
```
There are legitimate reasons to use virtual environments inside containers:
Multi-stage builds. Virtual environments make it easy to copy only the installed packages from a build stage to a slim runtime stage:
```dockerfile
# Dockerfile with venv (recommended for production)
FROM python:3.12-slim AS builder

WORKDIR /app
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

FROM python:3.12-slim AS runtime

COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
WORKDIR /app
COPY . .

CMD ["gunicorn", "app:app", "--bind", "0.0.0.0:8000"]
```
Avoiding system package conflicts. Some base images include Python packages that the OS depends on. Installing your dependencies into a venv prevents overwriting these system packages.
Cleaner separation. When your container runs multiple processes or includes system Python tools, a venv keeps your application packages cleanly separated.
Here is the complete workflow for starting a new Python project with proper environment management:
```bash
# 1. Create project directory
mkdir ~/projects/my-api && cd ~/projects/my-api

# 2. Initialize git
git init

# 3. Create virtual environment
python3 -m venv venv

# 4. Add venv to .gitignore
echo "venv/" >> .gitignore
echo "__pycache__/" >> .gitignore
echo "*.pyc" >> .gitignore
echo ".env" >> .gitignore

# 5. Activate the environment
source venv/bin/activate

# 6. Upgrade pip
pip install --upgrade pip

# 7. Install your dependencies
pip install flask sqlalchemy pytest

# 8. Freeze dependencies
pip freeze > requirements.txt

# 9. Make your initial commit
git add .
git commit -m "Initial project setup with Flask, SQLAlchemy"
```
When you clone a project that uses virtual environments, here is how to get up and running:
```bash
# 1. Clone the repository
git clone https://github.com/team/project.git
cd project

# 2. Create a fresh virtual environment
python3 -m venv venv

# 3. Activate it
source venv/bin/activate

# 4. Install exact dependencies from the lock file
pip install -r requirements.txt

# 5. Verify everything works
python -m pytest
```
If the project uses pyproject.toml instead:
```bash
# Install the project and its dependencies
pip install -e ".[dev]"
```
Upgrading dependencies in a production project requires discipline. Never blindly upgrade everything at once.
```bash
# 1. Check what is outdated
pip list --outdated

# 2. Upgrade one package at a time
pip install --upgrade requests

# 3. Run your test suite
python -m pytest

# 4. If tests pass, update requirements.txt
pip freeze > requirements.txt

# 5. Commit the change with a clear message
git add requirements.txt
git commit -m "Upgrade requests from 2.28.0 to 2.31.0"
```
For a safer approach using pip-tools:
```bash
# Upgrade a specific package and re-resolve all dependencies
pip-compile --upgrade-package requests requirements.in
pip-sync requirements.txt
python -m pytest
git add requirements.txt
git commit -m "Upgrade requests to 2.31.0"
```
Here is a typical GitHub Actions workflow that uses virtual environments:
```yaml
# .github/workflows/ci.yml
name: CI

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.11", "3.12"]

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Create virtual environment
        run: python -m venv venv

      - name: Install dependencies
        run: |
          source venv/bin/activate
          pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r requirements-dev.txt

      - name: Run linters
        run: |
          source venv/bin/activate
          ruff check .
          mypy .

      - name: Run tests
        run: |
          source venv/bin/activate
          pytest --cov=src --cov-report=xml

      - name: Upload coverage
        uses: codecov/codecov-action@v4
        with:
          file: coverage.xml
```
Virtual environments contain thousands of files, are platform-specific (a venv created on macOS will not work on Linux), and include hardcoded paths. Never commit them. Add this to your .gitignore:
```
# .gitignore
venv/
.venv/
env/
*.pyc
__pycache__/
```
Running pip install outside a virtual environment installs packages globally, which eventually leads to conflicts. On macOS and Linux, some people use sudo pip install, which is even worse because it modifies files owned by the operating system.
```bash
# NEVER do this
sudo pip install flask

# ALWAYS activate a venv first
source venv/bin/activate
pip install flask
```
If you install packages without activating your virtual environment, they go into the global Python. The most common symptom is: “I installed the package, but Python says it cannot find it.”
```bash
# Check which pip you are using
which pip
# Should show: /path/to/your/project/venv/bin/pip
# NOT: /usr/bin/pip or /usr/local/bin/pip
```
Installing a new package and forgetting to update requirements.txt means your teammates and CI/CD pipeline will not have that package. Make it a habit to freeze after every install:
```bash
# Install and freeze in one command
pip install requests && pip freeze > requirements.txt
```
The version of pip bundled with python -m venv is often outdated. Old pip versions have slower dependency resolution and may fail to install packages that require newer features. Always upgrade pip immediately after creating a new environment.
```bash
# First thing after activation
pip install --upgrade pip
```
If you are using conda, avoid installing packages with pip unless the package is not available through conda. Mixing the two can lead to dependency conflicts that are extremely difficult to debug. If you must use pip inside a conda environment, install conda packages first.
- Commit requirements.txt, not the environment itself.
- Pin exact versions with == in requirements.txt for deployable applications. Use flexible ranges only for libraries.
- Separate production and development dependencies into requirements.txt and requirements-dev.txt (or use pyproject.toml optional dependencies).
- Run pip install --upgrade pip right after creating a new virtual environment.
- pip freeze works for simple projects, but pip-compile gives you traceable, reproducible dependency resolution.
- Record the required Python version with a .python-version file, pyproject.toml's requires-python, or at minimum a note in your README.
- When an environment misbehaves, delete the venv directory and create a fresh one. They are disposable by design.
- Activation is optional: you can always call the interpreter directly with /path/to/venv/bin/python script.py.
- Use python -m venv venv to create environments and source venv/bin/activate to activate them. This is built into Python — no extra tools required.
- pip is the standard package manager. The core commands you will use daily are pip install, pip freeze, and pip install -r requirements.txt.
- Capture dependencies in requirements.txt for applications. Use pip-tools or Poetry for better dependency management on larger projects.
- pyproject.toml is the modern replacement for setup.py and requirements.txt. New projects should adopt it.
- Reach for pyenv when you need different Python versions for different projects.
- Never install with sudo pip. Never skip creating a venv because your project is "too small."