Publishing to PyPI is easy. Publishing in a way that stays installable, testable, and reproducible six months later is the hard part. This guide walks through a modern workflow using pyproject.toml (single source of truth) and CI (so releases don’t depend on your laptop).
Quickstart
If you want the highest-impact steps first, follow this mini-path. You can do it on an existing repo or a fresh one. The goal is simple: build locally, test in CI, publish from a tag.
Fast wins (do these today)
- Move package code under src/ (prevents accidental imports from the repo root)
- Define metadata + dependencies in pyproject.toml (name, version, requires-python)
- Build both a wheel and an sdist locally and run a clean install test
- Add a CI workflow that runs tests on every push and builds artifacts on releases
- Publish to TestPyPI once before your first real PyPI release
Release checklist (do this per version)
- Bump version (and update the changelog if you keep one)
- Run tests locally once (quick sanity check)
- Tag the release (e.g., v0.3.0) and push the tag
- CI builds + publishes (no manual uploads)
- Verify installation with `pip install your-package` and run a tiny import/CLI test
Always test installation from the built artifact (wheel) in a clean environment. If your package only “works” when imported from your repo folder, your users will have a bad time.
Overview
“Publish a Python package” sounds like a single action, but it’s really a chain of small contracts: the metadata contract (name/version), the build contract (how artifacts are created), the dependency contract (what you require), and the distribution contract (what ends up in the wheel/sdist). A modern packaging workflow makes those contracts explicit and verifiable.
What you’ll get from this post
| Goal | What it looks like | Why it matters |
|---|---|---|
| Single source of truth | All core metadata in pyproject.toml | No more “version drift” across files and tools |
| Reproducible builds | Build artifacts created by CI | Releases don’t depend on your laptop state |
| Safe publishing | Publish on tags + optional TestPyPI first | Fewer broken releases and faster recovery |
| User-friendly installs | Clean dependencies and correct packaging of files | Fewer “it works on my machine” support issues |
This post focuses on a practical default path for pure-Python packages and small libraries. If you ship compiled extensions (C/C++/Rust), you’ll add platform wheels and more build steps, but the core ideas remain the same: explicit configuration, artifact verification, and CI-driven releases.
“Package” can mean two things: the importable Python package (what you import) and the distribution
uploaded to PyPI (what you pip install). They often share a name, but they don’t have to.
Keeping that distinction in your head makes errors easier to debug.
Core concepts
pyproject.toml: one file to describe the build and the project
pyproject.toml is the modern center of Python packaging. It can define:
(1) how your project is built (the build backend), and (2) your project metadata (name, version, dependencies, entry points).
The win is not “new syntax” — it’s that tools stop guessing.
Build system (how artifacts are created)
- Build backend: the tool that produces wheel/sdist (e.g., setuptools, hatchling, flit)
- Build requirements: minimal dependencies needed to build
- Isolation: builds happen in a clean environment (reduces “works on my machine”)
Project metadata (what users install)
- Name: PyPI distribution name (unique on PyPI)
- Version: what users pin and what CI publishes
- Dependencies: what pip must install for your library to work
- Entry points: console scripts (CLI commands)
Wheel vs sdist: why you should build both
When you publish to PyPI, you typically upload one or more artifacts: a wheel (pre-built, fast install) and an sdist (source distribution). A common mistake is uploading “only a wheel” and accidentally omitting source files or package data. Another is uploading “only an sdist” and forcing end users to build during install.
Mental model
| Artifact | What it contains | Typical user experience |
|---|---|---|
| Wheel (.whl) | Ready-to-install package files | Fast, predictable installs |
| sdist (.tar.gz) | Source + build config | May build on install; used for transparency and rebuilds |
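A wheel is just a zip archive with a standardized layout, which is why clean-install checks are cheap. The sketch below fabricates a minimal wheel-shaped zip purely to illustrate that layout (real wheels come from `python -m build`, never from code like this):

```python
import tempfile
import zipfile
from pathlib import Path

# Fabricate a minimal wheel-shaped zip to illustrate the layout.
tmp = Path(tempfile.mkdtemp())
wheel = tmp / "your_pkg-0.1.0-py3-none-any.whl"

with zipfile.ZipFile(wheel, "w") as zf:
    zf.writestr("your_pkg/__init__.py", '__version__ = "0.1.0"\n')
    zf.writestr(
        "your_pkg-0.1.0.dist-info/METADATA",
        "Metadata-Version: 2.1\nName: your-package-name\nVersion: 0.1.0\n",
    )
    zf.writestr("your_pkg-0.1.0.dist-info/RECORD", "")

# Listing the archive is a quick way to spot missing files.
with zipfile.ZipFile(wheel) as zf:
    print(zf.namelist())
```

The same `namelist()` trick works on a real wheel from `dist/`: if a template or data file is not in that list, it will not be installed.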
CI as the release gate: you want one way to publish
If releases happen manually from a developer machine, sooner or later you’ll ship a broken build: missing files, wrong Python version, stale dependency lock, or an uncommitted local change. CI fixes this by making “publish” a predictable, logged process: tests run, artifacts are built, artifacts are checked, then uploaded.
Publishing should be tied to an explicit release signal: a Git tag or a GitHub Release. That small friction is what prevents accidental releases when you merge an unrelated PR.
Versioning and compatibility: be intentional
Users rely on your version numbers for safety. You don’t need a complicated policy, but you do need consistency: decide how you bump versions, document what “breaking” means for your API, and declare what Python versions you support.
Practical defaults
- Set requires-python to the oldest version you actively support
- Use semantic-ish versioning: major = breaking, minor = features, patch = fixes
- Add extras for optional features (`pip install pkg[dev]`)
Step-by-step
This is a complete, practical flow to publish a Python package the right way. You can adapt it to your tooling preference, but the structure matters: layout → metadata → build → verify → CI → publish.
Step 1 — Choose a clean project layout (use src/)
A src layout prevents a classic trap: Python importing your package from the repository folder instead of the installed artifact. That trap hides missing files and broken packaging until your users hit it.
Recommended structure
- `src/your_pkg/` — importable code
- `tests/` — test suite
- `pyproject.toml` — metadata + build config
- `README.md` + `LICENSE`
Sanity check
If you delete the repo folder and only keep the built wheel, the package should still work. That’s the entire point of packaging.
Step 2 — Write pyproject.toml (metadata + build backend)
Here’s a compact example using a modern build backend and a src layout. Treat it as a starting point: fill in your real name, URLs, classifiers, and dependencies.
[build-system]
requires = ["hatchling>=1.18"]
build-backend = "hatchling.build"
[project]
name = "your-package-name"
version = "0.1.0"
description = "Short, accurate one-liner for PyPI."
readme = "README.md"
requires-python = ">=3.10"
license = { file = "LICENSE" }
authors = [{ name = "Your Name" }]
keywords = ["python", "packaging"]
classifiers = [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
]
dependencies = [
"requests>=2.31",
]
[project.urls]
Homepage = "https://github.com/your-org/your-repo"
Issues = "https://github.com/your-org/your-repo/issues"
[project.optional-dependencies]
dev = [
"pytest>=8",
"ruff>=0.6",
]
[project.scripts]
yourcli = "your_pkg.cli:main"
[tool.hatch.build.targets.wheel]
packages = ["src/your_pkg"]
- name must be unique on PyPI (and can differ from `import your_pkg`)
- requires-python prevents pip from installing on unsupported Pythons
- dependencies should be real runtime deps only (dev tools go in extras)
- project.scripts is the clean way to expose a CLI
Step 3 — Build locally and verify the artifacts
Don’t wait for CI to tell you your distribution is broken. Build once locally, check the metadata, and run a clean install test. This catches missing files, wrong packaging configuration, and README rendering issues early.
# Create a clean build environment
python -m venv .venv
source .venv/bin/activate
python -m pip install -U pip build twine
# Run your tests (optional but recommended)
python -m pip install -e ".[dev]"
pytest
# Build wheel + sdist into ./dist
python -m build
# Check distribution metadata for common problems
twine check dist/*
# Clean install test: new venv, install only the built wheel
deactivate
python -m venv .venv_install
source .venv_install/bin/activate
python -m pip install -U pip
python -m pip install dist/*.whl
# Smoke test import and CLI (adjust to your package)
python -c "import your_pkg; print('ok')"
yourcli --help
Most packaging bugs are invisible until you install from the wheel: missing package data, wrong import paths, optional deps not declared, or a CLI entry point pointing at the wrong function.
Step 4 — Add CI: test on push, build + publish on tags
Your CI should do two things reliably:
(1) run tests for every change, and (2) publish only when you explicitly release.
Below is a GitHub Actions workflow that runs tests and publishes to PyPI when you push a tag like v1.2.3.
It uses a trusted publishing flow (OIDC) so you don’t need to store a long-lived PyPI API token in GitHub.
name: CI

on:
  push:
    branches: ["main"]
    tags: ["v*"]
  pull_request:

permissions:
  contents: read
  id-token: write # needed for trusted publishing (OIDC) to PyPI

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.11", "3.12"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install
        run: |
          python -m pip install -U pip
          python -m pip install -e ".[dev]"
      - name: Test
        run: pytest

  build-and-publish:
    needs: test
    if: startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Build
        run: |
          python -m pip install -U pip build twine
          python -m build
          twine check dist/*
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
- Publish only on tags: reduces accidental releases
- Build after tests: prevents publishing a version that fails your own test suite
If you don’t want trusted publishing, you can publish with a PyPI token via Twine instead — the structure stays the same: tests → build → check → upload.
Step 5 — Do a safe first release (TestPyPI → PyPI)
Your first release is where most “oops” moments happen: missing README, broken classifiers, wrong package name, or missing files in the wheel. TestPyPI is a low-risk rehearsal that helps you confirm the whole pipeline.
Before you publish
- Pick your distribution name (check it’s free on PyPI)
- Confirm import name matches your docs (`import your_pkg`)
- Build artifacts locally once and run `twine check`
- Decide your Python support range and declare it
After you publish
- Install in a fresh venv: `pip install your-package-name`
- Run a smoke test: import + minimal functionality
- If you have a CLI, test: `yourcli --help`
- Open the PyPI page and confirm the README renders
Step 6 — Maintain like a grown-up: yanks, hotfixes, and metadata hygiene
Even with a careful workflow, mistakes happen. The difference between “annoying” and “disaster” is whether you can recover cleanly. Keep releases small, automate publishing, and make it easy to ship a patch version when needed.
Small habits that prevent big problems
- Tag releases and keep tags immutable
- Don’t delete releases to “fix” them — publish a patch version instead
- Keep runtime dependencies minimal (move dev tools to extras)
- Document your public API (even a small README section helps)
Common mistakes
If you’ve ever hit “ModuleNotFoundError after pip install” or “my files aren’t in the wheel”, you’re not alone. These are the packaging pitfalls that show up in real projects — plus how to fix them fast.
Mistake 1 — Repo imports work, installed package doesn’t
This usually happens when you don’t use a src layout, or when your build config isn’t including the right package directory.
- Fix: use `src/` and ensure your build config points at it.
- Fix: run a clean install test from the wheel before releasing.
Mistake 2 — Forgetting requires-python (users install on unsupported versions)
Without requires-python, pip may install your package on Python versions you don’t support, causing confusing runtime errors.
- Fix: set `requires-python` in `pyproject.toml`.
- Fix: run CI tests on your supported Python versions.
Mistake 3 — Shipping dev dependencies to users
If your runtime dependencies include tools like linters, test frameworks, or formatters, installs become slower and conflict-prone.
- Fix: keep `dependencies` runtime-only; put tooling in `optional-dependencies.dev`.
- Fix: document: “For contributors: `pip install -e .[dev]`”.
Mistake 4 — Publishing from a local machine (non-reproducible)
Local builds can include uncommitted changes, different Python versions, or missing build dependencies.
- Fix: publish only from CI on a tag.
- Fix: make CI run `twine check` before upload.
Mistake 5 — Confusing distribution name vs import name
PyPI name (what you install) can differ from import name (what you import). Mixing them breaks docs and user expectations.
- Fix: decide early and be consistent in README and examples.
- Fix: add a tiny “Install / Import” section to your README.
Mistake 6 — Missing non-code files in the built artifact
Templates, static data, and type marker files can silently disappear if you don’t include them correctly.
- Fix: verify the wheel contents (clean install test + runtime checks).
- Fix: keep package data rules explicit in your build tool configuration.
If something is missing after install, inspect the installed site-packages directory in the venv. Packaging bugs are often just “file didn’t end up in the wheel”.
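A runtime check for shipped data files can live in your test suite, using `importlib.resources` (Python 3.9+ for `files()`). The `your_pkg` names below are placeholders; the demo call uses a stdlib package so the snippet runs as-is:

```python
from importlib import resources

def has_data_file(package: str, relpath: str) -> bool:
    """True if `relpath` actually shipped inside the installed package."""
    return resources.files(package).joinpath(relpath).is_file()

# For your own package (hypothetical names):
# assert has_data_file("your_pkg", "templates/base.html")
print(has_data_file("email", "__init__.py"))  # demo against the stdlib
```

Run this against the clean-venv install from Step 3, not against your repo checkout, or it will pass for the wrong reason.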
FAQ
Do I still need setup.py?
In most modern workflows, no. For many projects, pyproject.toml is enough, and your build backend reads everything from there.
Some legacy or highly customized setups still use setup.py, but it’s increasingly a compatibility layer rather than the primary configuration.
Which build backend should I choose: setuptools, hatchling, flit, poetry?
Pick the one that matches your complexity. If you want a lightweight, standards-first workflow, a simple backend (like hatchling or flit) is great. If you need maximum ecosystem compatibility or complex build customization, setuptools remains common. If you want an opinionated tool that also manages environments, Poetry can be convenient — just ensure you can build wheel/sdist reliably in CI.
How do I include package data (templates, JSON files, etc.)?
Prefer explicit inclusion rules via your build tool configuration, then validate by installing from the wheel and verifying the file exists at runtime. Avoid relying on “it’s in the repo so it must be included” — wheel contents are defined by your build configuration, not your Git tree.
Should I publish both wheel and sdist?
Yes, by default. Wheels provide fast installs, while sdists provide a source fallback and transparency. Building both also helps catch misconfigurations early (because sdists often expose missing files or metadata issues).
How do I publish pre-releases (alpha/beta/rc)?
Use a pre-release version scheme (for example: 1.2.0rc1 or 1.2.0b1) and tag it explicitly.
Many users won’t install pre-releases unless they opt in, which is exactly what you want for “try it early” builds.
What’s the safest way to publish from CI?
Publish only on tags/releases, run tests first, build artifacts in CI, run a metadata check, and then upload. If your hosting supports it, trusted publishing (OIDC) avoids storing long-lived tokens and reduces secret leakage risk.
I published a broken release — what now?
Don’t panic-delete and re-upload the same version. The clean recovery path is: fix the issue, bump a patch version, and publish again. If needed, you can discourage installs of a specific version (for example by yanking) while keeping history intact.
Cheatsheet
Keep this as your “shipping checklist” for every Python release. If you follow it, packaging becomes boring — in the best way.
Project setup
- Use src/ layout
- Include `README.md` and `LICENSE`
- Have a minimal test suite (even 5–10 tests is fine)
- Decide distribution name vs import name
pyproject.toml
- Set name, version, description, readme
- Set requires-python
- Declare runtime dependencies only
- Put dev tools in optional-dependencies
- Add project.urls (Homepage/Issues)
Build + verify
- Build wheel + sdist
- Run twine check
- Install from the wheel in a fresh venv
- Smoke test: import + minimal function + CLI help
CI + publish
- Run tests on every PR/push
- Publish only on a tag (e.g., `vX.Y.Z`)
- Build artifacts in CI (not locally)
- Keep publishing credentials scoped and secure
The three most valuable checks are: (1) clean install from wheel, (2) CI publish on tags only, (3) requires-python set correctly. Do those and your users will feel the difference.
Wrap-up
The “right way” to publish a Python package is the way that stays stable under pressure: a single source of truth (pyproject.toml), a build you can reproduce (CI artifacts), and a release mechanism that’s explicit (tags).
If you implement only one thing from this post, make it the clean install test from the built wheel. It’s the fastest way to align your repo reality with your users’ reality.
Next actions
- Copy the cheatsheet and use it for your next release
- Move your project to src layout if you haven’t
- Switch publishing to CI on tags (and delete the “manual upload” ritual)