Anaconda

So how do you get Python? The most direct way is to download it from the Python.org website. We don’t recommend this.

Python Distributions

Python is a little like Linux: many people take the core of it and package it into useful distributions. If you want to use Linux, for example, you would not start by downloading the kernel; that would leave you with a great deal of work before you had a usable operating system. Instead, you would download a distribution like Debian, Ubuntu, Fedora, or openSUSE, which gets you up and running and doing what interests you in a flash.

Today the most common options are Anaconda, Miniconda, and Miniforge, all of which use the conda ecosystem. For scientific and engineering work we recommend Miniforge (lightweight, conda-forge by default) or Anaconda (larger, batteries-included). Both make it easy to install scientific packages and manage reproducible environments.
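As a rough sketch of what "reproducible environments" means in practice (the file name environment.yml is conventional, not required), conda can record the packages in an environment and recreate it elsewhere:

    conda env export > environment.yml     # record the packages in the active environment
    conda env create -f environment.yml    # recreate the same environment on another machine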

Another perfectly adequate option is uv, a fast, pip-compatible Python package and environment manager. It works well for creating virtual environments, pinning dependencies, and running projects.
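A minimal uv workflow might look roughly like this (the package name is only an example):

    uv venv                      # create a virtual environment in .venv
    source .venv/bin/activate    # activate it (on Windows: .venv\Scripts\activate)
    uv pip install numpy         # install packages into the environment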

If your workflow revolves around Jupyter Notebooks, though, conda-based distributions (Miniforge or Anaconda) typically integrate more smoothly, both because Jupyter kernels are easy to set up and because packages with native dependencies are readily available. uv still works with notebooks, but it may require extra steps, such as installing ipykernel in the environment and registering a kernel, and some scientific packages are easier to install from conda-forge.
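If you do go that route, the extra steps typically look something like the following (the kernel name here is arbitrary):

    uv pip install ipykernel                               # install the kernel package into the environment
    python -m ipykernel install --user --name my-project   # register the environment as a Jupyter kernel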

On Virtual Environments

Working productively with Python is easiest when each project lives in its own virtual environment. This keeps dependencies isolated, allows different projects to use different Python versions, prevents version conflicts, and makes your setups lightweight and reproducible. It also means you never have to modify the system Python: create an environment, work inside it, and delete it when you are done.

A virtual environment is simply an isolated directory containing a Python interpreter together with only the packages a given project needs. Once the environment is activated, any Python interpreter you run, package you install, or command-line tool you invoke operates inside that sandbox.

There are several tools that create and manage such environments. In this manual we show commands with conda because it is widely used in scientific and engineering workflows and integrates smoothly with distributions like Miniforge and Anaconda. conda excels at installing packages that depend on native libraries, thanks to the conda-forge ecosystem, and it makes Jupyter kernel management straightforward. The trade-off is that environments can be larger and dependency solving can be slower, and you need to be mindful when mixing conda with pip.
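A typical conda session, then, looks roughly like this (the environment name, Python version, and package are just placeholders):

    conda create -n myproject python=3.12    # create a new environment
    conda activate myproject                  # work inside it
    conda install -c conda-forge sympy        # install packages as needed
    conda deactivate                          # leave the environment
    conda env remove -n myproject             # delete it when you are done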

If you prefer a faster, lighter-weight approach, uv is a pip-compatible alternative that resolves and installs quickly, works directly with PyPI and standard pyproject.toml workflows, and can even create environments on the fly (for example, uv run python will create and reuse a project-specific environment automatically). For heavy scientific stacks with complex native dependencies you may occasionally need compilers or system libraries, and for notebooks you will typically install ipykernel in the environment and register the kernel. The ecosystem is newer, but the day-to-day workflow (create or activate an environment, install packages, and run code inside it) is the same.
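That project-centric workflow looks roughly like this (the project and package names are hypothetical):

    uv init myproject    # create a project skeleton with a pyproject.toml
    cd myproject
    uv add numpy         # declare and install a dependency; uv updates the environment and lock file
    uv run python        # run Python inside the project's environment, creating it if needed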

Where we demonstrate conda commands, you can generally substitute the equivalent uv commands and achieve the same result. IDEs like PyCharm and VSCode also support creating and managing virtual environments through their graphical interfaces, both for conda and uv workflows.