The Ultimate Guide to Cross-Platform Python Libraries
End-to-end advanced guide to design, build, test, publish, and maintain production-ready Python libraries with native extensions and CI/CD.
Built with ❤️ for the Python Community
1. Project Goals & Design Principles
Foundation of every successful Python library
Before writing a single line of code, establish crystal-clear goals and design principles. This foundational step guides every architectural decision from API design to performance optimization strategies.
Clarify Goals
Who Uses It?
- Application Developers: Need intuitive APIs, extensive examples, clear error messages. They want to get productive quickly without diving into internals.
- Data Scientists: Require vectorized operations, NumPy/Pandas compatibility, efficient memory usage, and performance. They work with large datasets.
- Systems Engineers: Demand fine-grained control, minimal dependencies, predictable behavior, and low overhead. They work in resource-constrained environments.
Performance vs Portability vs Simplicity
Every library faces trade-offs. You cannot maximize all three; choose your priorities:
Performance
Native extensions (C/C++/Rust), vectorization, algorithmic optimization. Accept build complexity.
Portability
Pure Python, minimal dependencies, cross-platform testing. Accept potential performance limits.
Simplicity
Clear APIs, sensible defaults, minimal config. May sacrifice advanced features.
Stable Public API Surface
Your public API is a contract with users. Breaking changes cause significant pain. Minimize your public API to only what's truly necessary.
- Use semantic versioning: MAJOR.MINOR.PATCH
- Define __all__ explicitly in modules (see the sketch below)
- Mark internal APIs with a leading underscore (_internal)
- Maintain a clear deprecation policy (warn 2+ versions before removal)
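A minimal sketch of what that looks like in a package's __init__.py; the yourlib module and names here are illustrative, not a prescribed API:
# src/yourlib/__init__.py -- illustrative sketch
"""Public API for yourlib."""
from yourlib.core import process_data  # re-export the stable surface

__version__ = "0.1.0"
__all__ = ["process_data", "__version__"]  # only these names are public

def _normalize_key(key: str) -> str:
    """Internal helper; the leading underscore keeps it out of the public contract."""
    return key.strip().lower()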
Design Principles
Small, Orthogonal Modules
Embrace composition over inheritance. Build small, focused modules that do one thing well. Avoid deep inheritance hierarchies; they're fragile and hard to test.
Clear Sync/Async Boundaries
Never mix blocking and async code without clear separation. Provide separate modules or clear naming (process() vs async_process()). Never call blocking I/O from async code.
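A minimal sketch of that boundary, using a hypothetical fetch helper; the async variant delegates the blocking call to a worker thread via asyncio.to_thread rather than blocking the event loop:
# Hypothetical illustration of separate sync/async entry points
import asyncio
import urllib.request

def fetch(url: str) -> bytes:
    """Blocking implementation."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

async def async_fetch(url: str) -> bytes:
    """Async variant: run the blocking call in a worker thread."""
    return await asyncio.to_thread(fetch, url)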
Strong Typing Everywhere
Type hints are no longer optional. Add type annotations to every public function. Enable strict type checking (mypy --strict) in CI. Types catch bugs before runtime.
Explicit Errors > Silent Failures
Fail fast and fail loudly. Raise clear, actionable exceptions with context: what failed, why, and how to fix it. Use custom exception classes.
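A short sketch of an actionable error, assuming a hypothetical ConfigError and timeout setting:
# Hypothetical custom exception carrying actionable context
class ConfigError(ValueError):
    """Raised when a configuration value is invalid."""

def load_timeout(raw: str) -> float:
    try:
        timeout = float(raw)
    except ValueError as exc:
        raise ConfigError(
            f"timeout must be a number, got {raw!r}; set YOURLIB_TIMEOUT to e.g. '2.5'"
        ) from exc
    if timeout <= 0:
        raise ConfigError(f"timeout must be positive, got {timeout}")
    return timeout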
Deterministic Behavior & Reproducible Builds
Ensure your library behaves consistently across runs, platforms, and Python versions. Pin build dependencies precisely using lock files.
API Shape
- Avoid "god" objects; prefer functions for stateless ops, thin classes for stateful resources
- Expose minimal public symbols via __all__
- Keep I/O (network/fs) pluggable using interfaces/protocols, as in the sketch below
- Opt into the stable limited C API for C extensions when possible for ABI stability
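A minimal sketch of pluggable I/O using typing.Protocol; the Storage interface and save_report function are hypothetical illustrations, not part of any real API:
# Hypothetical Protocol that keeps storage I/O pluggable
from typing import Protocol

class Storage(Protocol):
    def read(self, key: str) -> bytes: ...
    def write(self, key: str, data: bytes) -> None: ...

def save_report(storage: Storage, name: str, body: str) -> None:
    """Works with any object that satisfies the Storage protocol."""
    storage.write(f"reports/{name}.txt", body.encode("utf-8"))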
2. Choose Your Toolchain
Select the right tools for your project type
Modern Python has rich tooling options. Establish one "happy path": a coherent set of tools that work together seamlessly.
Build Backend Options
Pure Python
- Hatchling: Fast, modern, plugin system
- PDM-Backend: Blazing fast, dynamic metadata
- Flit-core: Minimalist, zero config
C/C++ Extensions
- scikit-build-core: Modern CMake integration (recommended)
- setuptools + pybind11: Battle-tested, more boilerplate
Rust Extensions
- Maturin + PyO3: Zero-config, handles cross-compilation automatically
- Best DX: Memory safety + performance + great tooling
Cython
- scikit-build-core + Cython: Modern approach
- setuptools + Cython: Traditional, widely used
Essential Development Tools
Package Management
- UV: Lightning-fast pip replacement (Rust-based)
- PDM: Modern dependency management with lock files
- Hatch: All-in-one project manager
Code Quality
- Ruff: Ultra-fast linter + formatter (replaces Black, Flake8, isort)
- Mypy/Pyright: Static type checking
- Pre-commit: Git hooks for quality checks
Testing
- Pytest: The standard (assert-based, fixtures)
- Hypothesis: Property-based testing
- Nox: Multi-environment test automation
Ruff: The Game Changer
Ruff has revolutionized Python tooling by combining multiple tools into one blazingly fast executable. It's 10-100x faster than traditional tools.
# Install and use Ruff
uv pip install ruff
# Lint and autofix issues
ruff check . --fix
# Format code (Black-compatible)
ruff format .
Documentation & Release Tools
Documentation
- mkdocs-material: Fast, beautiful (recommended)
- Sphinx: Heavy but comprehensive for complex API docs
Release & Publishing
- towncrier: Changelog generation
- bump-my-version: Version bumping
- cibuildwheel: Cross-platform wheel building
- Trusted Publishing (OIDC): Secure PyPI publishing
3. Repository Layout
Organize your project for success
A clean, consistent repository structure makes your project maintainable and contributor-friendly. The src-layout is recommended to prevent accidental imports.
Recommended Src-Layout
yourlib/
├── pyproject.toml           # Modern Python configuration
├── src/yourlib/             # src-layout (recommended)
│   ├── __init__.py
│   ├── core.py
│   ├── utils.py
│   ├── py.typed             # PEP 561 type marker
│   └── _version.py          # Optional, managed by Hatch
├── tests/                   # Tests mirror src structure
│   ├── test_core.py
│   └── test_utils.py
├── bench/                   # Micro-benchmarks (pytest-benchmark)
├── examples/                # Usage examples
│   └── quickstart.py
├── include/                 # Headers for C/C++
├── yourlib_cpp/             # Or yourlib_rust / yourlib_c
├── docs/                    # Documentation (mkdocs or sphinx)
│   ├── index.md
│   └── api.md
├── .github/workflows/       # CI/CD pipelines
│   ├── test.yml
│   ├── build.yml
│   └── publish.yml
├── README.md
├── LICENSE
├── CHANGELOG.md             # towncrier builds it
├── CITATION.cff             # Optional, for academic use
├── .pre-commit-config.yaml
└── .gitignore
Benefits of Src-Layout
- Prevents accidental imports from the local directory instead of the installed package
- Enforces proper package installation during development
- Clearer separation between source code and tests
- Standard practice in modern Python projects
Key Files Explained
pyproject.toml
Modern Python configuration file. Replaces setup.py and setup.cfg. Contains metadata, dependencies, build config, and tool settings all in one place.
src/yourlib/__init__.py
Package entry point. Import and expose your public API here. Define __version__ and __all__ to control what's exported.
py.typed
Empty marker file indicating your package supports type checking (PEP 561). Required for mypy/pyright to recognize your type hints.
.github/workflows/
GitHub Actions CI/CD configuration. Automated testing, building wheels, and publishing across platforms and Python versions.
tests/
Test directory that mirrors your source structure. Use pytest for running tests with coverage reporting.
4. Minimal pyproject.toml
Modern Python project configuration
pyproject.toml is the modern way to configure Python projects following PEP 621 standards. It replaces setup.py and consolidates all configuration in one file.
Complete Example (Pure Python + Hatchling)
# pyproject.toml for pure Python library with Hatchling
[build-system]
requires = ["hatchling>=1.25"]
build-backend = "hatchling.build"
[project]
name = "yourlib"
version = "0.1.0"
description = "Awesome cross-platform Python library"
readme = "README.md"
license = { file = "LICENSE" }
authors = [
{ name = "You", email = "you@example.com" }
]
requires-python = ">=3.9"
dependencies = []
[project.optional-dependencies]
dev = [
"pytest>=7.4.0",
"pytest-cov>=4.1.0",
"mypy>=1.5.0",
"ruff>=0.1.0",
"nox>=2024.0.0",
"hypothesis>=6.92.0",
"pytest-benchmark>=4.0.0",
]
docs = [
"mkdocs-material>=9.0.0",
"mkdocstrings[python]>=0.24.0",
]
[project.urls]
Homepage = "https://github.com/yourusername/yourlib"
Documentation = "https://yourlib.readthedocs.io"
Repository = "https://github.com/yourusername/yourlib"
Changelog = "https://github.com/yourusername/yourlib/blob/main/CHANGELOG.md"
# Hatchling configuration
[tool.hatch.version]
path = "src/yourlib/_version.py"
[tool.hatch.build.targets.wheel]
packages = ["src/yourlib"]
# Ruff configuration
[tool.ruff]
line-length = 100
target-version = "py39"
[tool.ruff.lint]
select = ["E", "F", "I", "N", "UP", "B", "C4", "SIM"]
ignore = []
# Mypy configuration
[tool.mypy]
python_version = "3.9"
strict = true
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
# Pytest configuration
[tool.pytest.ini_options]
testpaths = ["tests"]
addopts = "-v --cov=yourlib --cov-report=term --cov-report=html"
python_files = ["test_*.py"]
Key Sections Explained
[build-system]
Declares the build backend (Hatchling, setuptools, maturin, etc.) and its requirements. This tells pip/build how to build your package.
[project]
Package metadata following PEP 621: name, version, dependencies, authors, license, and more. This is the heart of your configuration.
[project.optional-dependencies]
Extra dependency groups for different purposes: dev tools, documentation, testing. Install with pip install -e ".[dev]"
[tool.*]
Configuration for development tools: linters, formatters, type checkers, test runners. All in one file instead of separate config files.
Pro Tips
- Use requires-python to specify the minimum Python version your library supports
- Keep dependencies minimal in the main dependencies list
- Use optional dependencies to separate dev, docs, and test requirements
- Enable strict mode for mypy and comprehensive rule sets for ruff
5. Cross-Platform Native Builds
Building native extensions for performance
When pure Python isn't fast enough, native extensions provide C-level performance. This section covers C/C++, Rust, Cython, and Zig options for building cross-platform native code.
Option A: C/C++ via pybind11 (Recommended)
Setup with scikit-build-core
Modern CMake integration provides clean builds and better dependency management.
# pyproject.toml snippet
[build-system]
requires = ["scikit-build-core>=0.9", "pybind11>=2.12"]
build-backend = "scikit_build_core.build"
[project]
name = "yourlib"
version = "0.1.0"
[tool.scikit-build]
wheel.packages = ["src/yourlib"]
cmake.minimum-version = "3.18"
build.verbose = true
CMakeLists.txt
cmake_minimum_required(VERSION 3.18)
project(yourlib_native LANGUAGES CXX)
find_package(pybind11 CONFIG REQUIRED)
pybind11_add_module(_native src/yourlib/_native.cpp)
# Enable manylinux-friendly builds
target_compile_features(_native PRIVATE cxx_std_17)
_native.cpp (pybind11 example)
#include <pybind11/pybind11.h>
namespace py = pybind11;
int add(int a, int b) {
return a + b;
}
PYBIND11_MODULE(_native, m) {
m.doc() = "Native extension module";
m.def("add", &add, "Add two integers");
}
Option B: Rust via PyO3 + Maturin (Cleanest DX)
Rust provides memory safety and excellent performance
Cargo.toml
[package]
name = "yourlib_rust"
version = "0.1.0"
edition = "2021"
[lib]
name = "yourlib_rust"
crate-type = ["cdylib"]
[dependencies]
pyo3 = { version = "0.22", features = ["extension-module"] }
# Optional: for numpy interop
# numpy = "0.21"
src/lib.rs
use pyo3::prelude::*;
#[pyfunction]
fn add(a: i64, b: i64) -> PyResult<i64> {
Ok(a + b)
}
#[pymodule]
fn yourlib_rust(m: &Bound<'_, PyModule>) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(add, m)?)?;
    Ok(())
}
pyproject.toml (Rust build)
[build-system]
requires = ["maturin>=1.6"]
build-backend = "maturin"
[project]
name = "yourlib-rust"
requires-python = ">=3.9"
# Call from Python: from yourlib_rust import add
Option C: C via cffi/ctypes
Build a shared library (C ABI) separately and load it at runtime. cffi is better than ctypes for ergonomics and speed.
Python side (cffi)
from cffi import FFI
ffi = FFI()
ffi.cdef("int add(int a, int b);")
# .dll on Windows, .dylib on macOS, .so on Linux
lib = ffi.dlopen("libyourlib.so")
def add(a: int, b: int) -> int:
    return lib.add(a, b)
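Because the shared-library name differs per platform, a small loader sketch (assuming the library is built as yourlib) can pick the right file before calling dlopen:
# Sketch: choose the platform-specific shared-library name
import sys
from cffi import FFI

ffi = FFI()
ffi.cdef("int add(int a, int b);")

if sys.platform.startswith("win"):
    _libname = "yourlib.dll"
elif sys.platform == "darwin":
    _libname = "libyourlib.dylib"
else:
    _libname = "libyourlib.so"

lib = ffi.dlopen(_libname)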
Option D: Cython (Python-ish Syntax + Speed)
Cython provides Python-like syntax with C-level performance. Great for gradually optimizing existing Python code.
src/yourlib/_fast.pyx
cpdef int add(int a, int b):
    """Add two integers with C performance."""
    return a + b
Use scikit-build-core or setuptools with Cython. For ABI stability, prefer limited API when writing CPython C extensions.
Option E: Zig (via C ABI)
Compile Zig to a C-ABI shared library and load from Python.
add.zig
// add.zig
export fn add(a: c_int, b: c_int) c_int {
return a + b;
}
Build command
zig build-lib -dynamic -O ReleaseFast add.zig
Load from Python with cffi or ctypes exactly like C.
Choosing Your Native Build Strategy
- C/C++ + pybind11: Best for integrating existing C++ codebases, mature ecosystem
- Rust + PyO3: Best DX, memory safety, modern tooling; recommended for new projects
- Cython: Gradual optimization of Python code, easy learning curve
- CFFI: When you already have C libraries and don't want compilation complexity
- Zig: Simpler than C, comptime features, but smaller ecosystem
6. Testing Strategy
Comprehensive testing for reliability
A comprehensive testing strategy ensures your library works reliably across platforms, Python versions, and use cases. Cover unit tests, integration tests, performance benchmarks, type checking, and ABI compatibility.
Unit Tests with Pytest
Pytest is the de facto standard for Python testing. Use assert-based syntax, powerful fixtures, and extensive plugin ecosystem.
# tests/test_core.py
import pytest
from yourlib import add, process_data
def test_add_basic():
    assert add(2, 3) == 5
    assert add(-1, 1) == 0
    assert add(0, 0) == 0

def test_add_type_errors():
    with pytest.raises(TypeError):
        add("2", 3)

# Use fixtures for setup
@pytest.fixture
def sample_data():
    return {"key": "value", "count": 42}

def test_process_data(sample_data):
    result = process_data(sample_data)
    assert result["count"] == 42
Key Testing Principles
- Test the public API only: internal tests are allowed but keep them separate
- Mock I/O: don't make real network calls or filesystem writes in unit tests (see the sketch below)
- Fast tests: unit tests should run in milliseconds
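As referenced above, a sketch of keeping I/O out of unit tests with pytest's built-in tmp_path and monkeypatch fixtures; fetch_config and the YOURLIB_CONFIG variable are hypothetical:
# Sketch: isolate the filesystem and environment in a unit test
from yourlib import fetch_config  # hypothetical function under test

def test_fetch_config_reads_file(tmp_path, monkeypatch):
    # tmp_path provides an isolated directory instead of the real filesystem
    cfg = tmp_path / "config.toml"
    cfg.write_text("timeout = 5", encoding="utf-8")

    # monkeypatch redirects the lookup without touching global state
    monkeypatch.setenv("YOURLIB_CONFIG", str(cfg))
    assert fetch_config()["timeout"] == 5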
Property-Based Testing with Hypothesis
Hypothesis automatically generates test cases to find edge cases you wouldn't think of manually. It's invaluable for testing complex functions with many inputs.
from hypothesis import given, strategies as st

# Test with thousands of random integers
@given(st.integers(), st.integers())
def test_add_commutative(a, b):
    """Addition is commutative."""
    assert add(a, b) == add(b, a)

@given(st.integers())
def test_add_doubles(a):
    """Adding a number to itself equals doubling it."""
    assert add(a, a) == 2 * a
Performance Testing
Use pytest-benchmark for micro-benchmarks to track performance regressions and compare pure Python vs native implementations.
def test_benchmark_add(benchmark):
    # Benchmark runs the function many times
    result = benchmark(add, 1000, 2000)
    assert result == 3000

def test_compare_implementations(benchmark):
    # Compare pure Python vs native
    data = list(range(10000))
    result = benchmark(sum_vec, data)
    assert result == sum(data)
Type Checking & Coverage
Type Checking
Run mypy --strict or pyright on your source code to catch type errors.
# In CI
mypy --strict src/
pyright src/
Code Coverage
Track test coverage with pytest-cov and enforce minimum thresholds.
# Track coverage
pytest --cov=yourlib \
--cov-report=html \
--cov-report=term
Native Extension Testing
For libraries with native extensions, test on all target platforms (Linux/macOS/Windows) and Python versions.
- ABI checks: import tests on manylinux/musllinux, Windows, macOS
- Use abi3 (limited API) for Python >= 3.8 to reduce the wheel matrix
- Test import time: python -X importtime -c "import yourlib"
Security & Supply Chain
- Run pip-audit or uv pip audit to check for vulnerabilities
- Pin or constrain build dependencies for reproducible builds
- Avoid vendoring large shared libraries unless necessary
7. Documentation & Examples
Make your library accessible and discoverable
Great documentation is critical for adoption. Use mkdocs-material for fast, beautiful docs with auto-generated API references, usage examples, and a cookbook of common patterns.
MkDocs Material (Recommended)
MkDocs Material provides a modern, fast documentation site with excellent DX. It includes instant search, dark mode, and beautiful responsive design.
mkdocs.yml Configuration
site_name: YourLib
site_description: Cross-platform Python library
site_url: https://yourlib.readthedocs.io

theme:
  name: material
  palette:
    - scheme: default
      primary: indigo
      accent: indigo
  features:
    - navigation.tabs
    - navigation.sections
    - toc.integrate
    - search.suggest

markdown_extensions:
  - admonition
  - codehilite
  - toc:
      permalink: true
  - pymdownx.highlight
  - pymdownx.superfences

plugins:
  - search
  - mkdocstrings:
      handlers:
        python:
          options:
            docstring_style: google
            show_source: true
            show_root_heading: true

nav:
  - Home: index.md
  - Getting Started: getting-started.md
  - User Guide: guide.md
  - API Reference: api.md
  - Cookbook: cookbook.md
  - FAQ: faq.md
Auto-Generated API Docs
# docs/api.md

# API Reference

::: yourlib
    options:
      show_root_heading: true
      show_source: true
Documentation Structure
Getting Started
- Installation instructions
- Quick start guide (5 minutes)
- Basic usage examples
- Common patterns
User Guide
- Detailed concepts
- Advanced features
- Best practices
- Performance tips
API Reference
- Auto-generated from docstrings
- Type signatures
- Parameter descriptions
- Return values & exceptions
Cookbook & FAQ
- Copy-paste recipes
- Common use cases
- Troubleshooting
- Migration guides
Usage Examples
Maintain a /examples directory with working, tested code samples that demonstrate real-world usage patterns.
# examples/quickstart.py
"""
Quickstart example for YourLib.

This example demonstrates basic usage patterns.
"""
from yourlib import process_data

def main() -> None:
    # Load some data
    data = {"key": "value", "count": 42}
    # Process it
    result = process_data(data)
    print(f"Result: {result}")

if __name__ == "__main__":
    # Run the example
    main()
Documentation Best Practices
- Keep examples tested: use doctest or run examples in CI (a sketch follows below)
- Use diagrams: Mermaid or the Diagrams library for architecture
- Version your docs: mike (MkDocs versioning) for API evolution
- Document exceptions: what can go wrong and how to handle it
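One hedged way to keep examples tested, as mentioned above: a small pytest module that executes every script under examples/ (the file name and layout here are assumptions):
# tests/test_examples.py -- sketch that runs every script in examples/
import pathlib
import runpy

import pytest

EXAMPLES = sorted(pathlib.Path("examples").glob("*.py"))

@pytest.mark.parametrize("script", EXAMPLES, ids=lambda p: p.name)
def test_example_runs(script):
    # Executes the example as __main__; any exception fails the test
    runpy.run_path(str(script), run_name="__main__")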
8. Continuous Integration (CI/CD)
Automate testing, building, and publishing
GitHub Actions provides powerful CI/CD automation for testing across platforms and Python versions, building wheels, and publishing to PyPI with Trusted Publishing (OIDC), so no API tokens are needed.
Pure Python CI Workflow
.github/workflows/ci.yml
name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          pip install -U pip uv
          uv pip install -e ".[dev]"
      - name: Lint with ruff
        run: |
          ruff check .
          ruff format --check .
      - name: Type check
        run: mypy src
      - name: Run tests
        run: pytest --maxfail=1 --cov=yourlib --cov-report=xml
      - name: Upload coverage
        uses: codecov/codecov-action@v4
        with:
          file: ./coverage.xml
Publishing with Trusted Publishing
Trusted Publishing (OIDC) eliminates the need for PyPI API tokens. Configure your PyPI project to trust GitHub Actions, then publish directly.
.github/workflows/release.yml
name: Release

on:
  push:
    tags: ["v*.*.*"]

permissions:
  id-token: write  # Required for OIDC
  contents: read

jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Build package
        run: |
          pip install -U pip build
          python -m build  # Creates sdist + wheel
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          skip-existing: true
Native Extension Wheels (cibuildwheel)
For C/C++/Cython extensions, use cibuildwheel to build wheels for multiple platforms and Python versions automatically.
.github/workflows/wheels.yml
name: Build Wheels

on:
  push:
    tags: ["v*.*.*"]

jobs:
  build_wheels:
    name: Wheels on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-22.04, windows-2022, macos-13]
    steps:
      - uses: actions/checkout@v4
      - uses: pypa/cibuildwheel@v2.21.3
        env:
          CIBW_SKIP: "pp* *-win32 cp36-*"
          CIBW_TEST_COMMAND: "pytest -q"
          CIBW_TEST_EXTRAS: "dev"
          CIBW_BUILD_VERBOSITY: 1
      - uses: actions/upload-artifact@v4
        with:
          name: wheels-${{ matrix.os }}
          path: wheelhouse/*.whl

  publish:
    needs: [build_wheels]
    runs-on: ubuntu-latest
    permissions:
      id-token: write
    steps:
      - uses: actions/download-artifact@v4
        with:
          pattern: wheels-*
          merge-multiple: true
          path: dist
      - uses: pypa/gh-action-pypi-publish@release/v1
Rust Wheels with Maturin
Maturin makes building Rust extension wheels incredibly simple with automatic manylinux support.
name: Rust Wheels

on:
  push:
    tags: ["v*.*.*"]

jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-22.04, windows-2022, macos-13]
        python: ["3.9", "3.10", "3.11", "3.12", "3.13"]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}
      - uses: dtolnay/rust-toolchain@stable
      - name: Build wheel
        run: |
          pip install maturin pytest
          maturin build --release --strip --manylinux 2014
      - name: Test wheel
        shell: bash
        run: |
          pip install target/wheels/*.whl
          pytest -q
      - uses: actions/upload-artifact@v4
        with:
          name: wheels-${{ matrix.os }}-${{ matrix.python }}
          path: target/wheels/*.whl
9. Cross-Platform Nuances
Platform-specific considerations
Building truly cross-platform libraries requires understanding platform-specific quirks, build tools, and dependency management for Linux, macOS, and Windows.
Linux (manylinux)
- Prefer manylinux2014+ or musllinux: these wheels work across many Linux distributions. auditwheel repair is run by cibuildwheel/maturin automatically.
- Avoid newer glibc symbols: don't link against system glibc symbols newer than the manylinux baseline. Use manylinux Docker images.
- Bundle C/C++ dependencies: bundle small dependencies; document GPL/LGPL obligations if any.
macOS
- Ship universal2 wheels: combine x86_64 and arm64 (Apple Silicon) in one wheel for maximum compatibility.
- Use delocate: bundle .dylib dependencies into your wheel.
- SDK paths: ensure the SDK path is correctly set in CI (GitHub Actions Xcode images usually work fine).
Windows
- MSVC build tools: use Microsoft Visual C++ Build Tools. CRT runtime redistribution is handled by wheels.
- Use delvewheel: bundle .dll dependencies automatically.
- Path length limits: be aware of the 260-character path limit (relaxed on newer Windows 10+, but use short paths in CI).
Python & ABI Compatibility
- Choose abi3 (Stable ABI): for broad cp3x compatibility with one wheel per platform
- NumPy C-API: pin the build against the oldest supported NumPy for that major version
- Test import on all platforms: catch ABI incompatibilities early in CI
10. FFI Recipes (Copy-Paste)
Ready-to-use foreign function interface examples
Here are battle-tested FFI (Foreign Function Interface) recipes for integrating C, C++, Rust, Cython, and Zig with Python. Copy, paste, and adapt for your use case.
pybind11 (C++)
vector_sum.cpp
#include <pybind11/pybind11.h>
#include <pybind11/stl.h>
long long sum_vec(const std::vector<int>& v) {
long long s = 0;
for (int x : v) s += x;
return s;
}
PYBIND11_MODULE(_vecops, m) {
m.doc() = "Vector operations module";
m.def("sum_vec", &sum_vec, "Sum all integers in vector");
}
Python usage
from yourlib._vecops import sum_vec
result = sum_vec([1, 2, 3, 4, 5])
print(result) # 15
PyO3 (Rust)
src/lib.rs
use pyo3::prelude::*;
#[pyfunction]
fn sum_vec(xs: Vec<i64>) -> PyResult<i64> {
    Ok(xs.iter().sum())
}

#[pymodule]
fn yourlib_rust(m: &Bound<'_, PyModule>) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(sum_vec, m)?)?;
    Ok(())
}
Python usage
from yourlib_rust import sum_vec
result = sum_vec([1, 2, 3, 4, 5])
print(result) # 15
CFFI (C ABI)
libsum.c
// Compile: gcc -shared -fPIC -o libsum.so libsum.c
int sum(const int *a, int n) {
long s = 0;
for (int i = 0; i < n; i++) s += a[i];
return (int)s;
}
Python wrapper (cffi)
from cffi import FFI
ffi = FFI()
ffi.cdef("int sum(const int *a, int n);")
lib = ffi.dlopen("./libsum.so")
def sum_py(a: list[int]) -> int:
    buf = ffi.new("int[]", a)
    return lib.sum(buf, len(a))
Cython
_sum.pyx
cpdef long long sum_cython(list xs):
    """Sum integers with C-level performance."""
    cdef Py_ssize_t i, n = len(xs)
    cdef long long s = 0
    for i in range(n):
        s += <long long> xs[i]
    return s
Zig (via C ABI + CFFI)
mul.zig
export fn mul(a: c_int, b: c_int) c_int {
return a * b;
}
Build and use
# Build shared library
zig build-lib -dynamic -O ReleaseFast mul.zig
# Python usage with cffi
from cffi import FFI
ffi = FFI()
ffi.cdef("int mul(int a, int b);")
lib = ffi.dlopen("./libmul.so") # or .dylib/.dll
result = lib.mul(6, 7)
print(result) # 42
11. Quality Gates & Pre-commit
Automated quality checks before commits
Quality gates ensure code quality before it reaches your repository. Use pre-commit hooks to automatically run linters, formatters, and checks on every commit, catching issues early and maintaining consistent code quality across all contributors.
Pre-commit Configuration
Pre-commit runs checks automatically before each commit. Install once, then checks run on every commit across all contributors.
.pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
        args: ['--maxkb=1000']
      - id: check-merge-conflict
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.1.9
    hooks:
      - id: ruff
        args: [--fix, --exit-non-zero-on-fix]
      - id: ruff-format
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.8.0
    hooks:
      - id: mypy
        additional_dependencies: [types-requests]
        args: [--strict]
Installation
# Install pre-commit
pip install pre-commit
# Install hooks in your repo
pre-commit install
# Run manually on all files
pre-commit run --all-files
Essential Quality Checks
Import Time Testing
Test cold import speed to catch slow imports:
python -X importtime -c "import yourlib"
Goal: Keep import time under 100ms
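A sketch of enforcing that budget in CI by timing a cold import in a subprocess and subtracting bare interpreter startup (the 100 ms threshold is the assumption from above):
# Sketch: fail CI when a cold import exceeds the budget
import subprocess
import sys
import time

def timed(code: str) -> float:
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", code], check=True)
    return time.perf_counter() - start

baseline = timed("pass")               # bare interpreter startup
with_import = timed("import yourlib")  # startup + import
import_cost = max(0.0, with_import - baseline)

print(f"cold import: {import_cost * 1000:.1f} ms")
if import_cost > 0.100:
    sys.exit("import budget exceeded (> 100 ms)")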
Wheel Auditing
Ensure wheels have no unvendored dependencies:
# Linux
auditwheel show dist/*.whl
# macOS
delocate-listdeps dist/*.whl
Size Budget
Fail CI if wheel exceeds threshold:
# In CI (GNU stat on Linux; use stat -f%z on macOS)
MAX_SIZE=10485760  # 10 MB
wheel_size=$(stat -c%s dist/*.whl)
[ "$wheel_size" -gt "$MAX_SIZE" ] && exit 1
Backward Compatibility
Run examples from prior versions:
# Test against v0.9 examples
pytest tests/compat/v0_9_examples.py
Security Checks
- Dependency vulnerabilities: pip-audit or uv pip audit
- Secrets scanning: use detect-secrets or GitHub Secret Scanning
- License compliance: check all dependencies with pip-licenses
12. Versioning & Changelog
Semantic versioning and automated changelog
Follow semantic versioning (semver) religiously: MAJOR.MINOR.PATCH. Use automated tools like towncrier for changelog generation and maintain a clear deprecation policy that respects your users' time.
Semantic Versioning (MAJOR.MINOR.PATCH)
MAJOR (X.0.0)
Incompatible API changes that break backward compatibility.
- Remove deprecated features
- Change function signatures
- Rename public APIs
- Change default behavior
MINOR (0.X.0)
Add functionality in a backward-compatible manner.
- Add new functions/classes
- Add new parameters with defaults
- Deprecate features (with warnings)
- Internal refactors
PATCH (0.0.X)
Backward-compatible bug fixes only.
- Fix bugs
- Performance improvements
- Documentation updates
- Security patches
Automated Changelog with Towncrier
Towncrier generates changelogs from individual news fragments. Each PR adds a small fragment file, then towncrier builds the changelog on release.
pyproject.toml Configuration
[tool.towncrier]
directory = "changelog.d"
filename = "CHANGELOG.md"
start_string = "<!-- towncrier release notes start -->\n"
template = "changelog.d/template.md"
title_format = "## [{version}] - {project_date}"
issue_format = "[#{issue}](https://github.com/you/yourlib/issues/{issue})"
[[tool.towncrier.type]]
directory = "feature"
name = "Features"
showcontent = true
[[tool.towncrier.type]]
directory = "bugfix"
name = "Bug Fixes"
showcontent = true
[[tool.towncrier.type]]
directory = "deprecation"
name = "Deprecations"
showcontent = true
[[tool.towncrier.type]]
directory = "breaking"
name = "Breaking Changes"
showcontent = true
Creating News Fragments
# Create fragment for issue #123
towncrier create 123.feature.md
# Edit changelog.d/123.feature.md
echo "Add amazing new feature for data processing" > changelog.d/123.feature.md
Building Changelog on Release
# Build and update CHANGELOG.md
towncrier build --version 1.2.0
# Preview without modifying files
towncrier build --draft
Deprecation Policy
Give users at least 2 minor versions of warning before removing deprecated features. Use Python's warnings module with proper stack levels.
Deprecation Example
import warnings

def old_function(x: int) -> int:
    """Deprecated function (will be removed in v2.0)."""
    warnings.warn(
        "old_function is deprecated and will be removed in v2.0. "
        "Use new_function instead.",
        DeprecationWarning,
        stacklevel=2,  # Shows the caller's location
    )
    return new_function(x)

def new_function(x: int) -> int:
    """Improved implementation."""
    return x * 2
Deprecation Timeline
- v1.5.0: Deprecate old_function, add DeprecationWarning
- v1.6.0: Still deprecated, warning continues
- v2.0.0: Remove old_function (MAJOR version bump)
Version Bumping Tools
Hatch
hatch version minor
hatch version patch
hatch version major
bump-my-version
bump-my-version bump minor
bump-my-version bump patch
13. Documentation Configuration
Complete MkDocs setup with API docs
A comprehensive documentation setup using MkDocs Material with automatic API reference generation, beautiful theming, and live preview during development.
Complete mkdocs.yml Configuration
site_name: YourLib Documentation
site_description: Cross-platform Python library for awesome things
site_url: https://yourlib.readthedocs.io
repo_url: https://github.com/yourusername/yourlib
repo_name: yourusername/yourlib

theme:
  name: material
  palette:
    # Light mode
    - scheme: default
      primary: indigo
      accent: indigo
      toggle:
        icon: material/brightness-7
        name: Switch to dark mode
    # Dark mode
    - scheme: slate
      primary: indigo
      accent: indigo
      toggle:
        icon: material/brightness-4
        name: Switch to light mode
  features:
    - navigation.tabs
    - navigation.tabs.sticky
    - navigation.sections
    - navigation.expand
    - navigation.top
    - toc.integrate
    - search.suggest
    - search.highlight
    - content.code.copy
    - content.code.annotate

markdown_extensions:
  - admonition
  - pymdownx.details
  - pymdownx.superfences:
      custom_fences:
        - name: mermaid
          class: mermaid
          format: !!python/name:pymdownx.superfences.fence_code_format
  - pymdownx.highlight:
      anchor_linenums: true
      line_spans: __span
      pygments_lang_class: true
  - pymdownx.inlinehilite
  - pymdownx.snippets
  - pymdownx.tabbed:
      alternate_style: true
  - toc:
      permalink: true
  - attr_list
  - md_in_html

plugins:
  - search
  - mkdocstrings:
      handlers:
        python:
          options:
            docstring_style: google
            show_source: true
            show_root_heading: true
            show_root_full_path: false
            show_signature_annotations: true
            separate_signature: true
            merge_init_into_class: true
            docstring_section_style: table

nav:
  - Home: index.md
  - Getting Started:
      - Installation: getting-started/installation.md
      - Quick Start: getting-started/quickstart.md
      - Basic Usage: getting-started/usage.md
  - User Guide:
      - Core Concepts: guide/concepts.md
      - Advanced Usage: guide/advanced.md
      - Best Practices: guide/best-practices.md
  - API Reference: api.md
  - Cookbook:
      - Common Tasks: cookbook/common.md
      - Performance Tips: cookbook/performance.md
  - FAQ: faq.md
  - Changelog: changelog.md
API Reference Setup
docs/api.md
# API Reference

Complete API documentation for YourLib.

## Core Module

::: yourlib.core
    options:
      show_root_heading: true
      show_source: true
      members_order: source

## Utilities

::: yourlib.utils
    options:
      show_root_heading: true
      show_source: false
Docstring Example (Google Style)
def process_data(data: dict, validate: bool = True) -> dict:
    """Process input data with optional validation.

    This function processes the input dictionary and returns
    a transformed version. Optionally validates input before
    processing.

    Args:
        data: Input dictionary to process
        validate: Whether to validate input (default: True)

    Returns:
        Processed dictionary with transformed values

    Raises:
        ValueError: If validation fails
        TypeError: If data is not a dictionary

    Example:
        >>> result = process_data({"key": "value"})
        >>> print(result)
        {'key': 'VALUE'}
    """
    if validate and not isinstance(data, dict):
        raise TypeError("data must be a dictionary")
    return {k: v.upper() for k, v in data.items()}
Development & Deployment
Local Development
# Install docs dependencies
pip install -e ".[docs]"
# Live preview (auto-reload)
mkdocs serve
# Build static site
mkdocs build
GitHub Pages Deploy
# Deploy to gh-pages branch
mkdocs gh-deploy
# Or in CI
- run: mkdocs gh-deploy --force
14. Publishing to PyPI
Trusted Publishing with GitHub Actions
Use Trusted Publishing (OIDC) with PyPI; no API tokens needed! Configure your PyPI project to trust GitHub Actions, then publish directly from CI with cryptographic proof of identity.
Setup Trusted Publishing
Step 1: Configure PyPI Project
- 1. Go to pypi.org (or test.pypi.org) and create your project
- 2. Navigate to Publishing section in project settings
- 3. Add a new Trusted Publisher with:
- Owner: your-github-username
- Repository: yourlib
- Workflow: release.yml
- Environment: (leave empty or specify)
No API tokens needed! GitHub Actions will authenticate using OIDC (OpenID Connect).
GitHub Actions Workflow
.github/workflows/release.yml
name: Release to PyPI

on:
  push:
    tags:
      - "v*.*.*"

permissions:
  id-token: write  # REQUIRED for OIDC
  contents: read

jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install build tools
        run: |
          pip install -U pip build twine
      - name: Build package
        run: python -m build  # Creates sdist + wheel
      - name: Check package
        run: twine check dist/*
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          skip-existing: true
          verbose: true
Release Checklist
1. Update version and changelog: hatch version minor && towncrier build --version 1.2.0
2. Commit changes: git commit -am "Release v1.2.0"
3. Create and push tag: git tag v1.2.0 && git push origin main --tags
4. GitHub Actions publishes automatically: watch the workflow run in the GitHub Actions tab
5. Verify installation: pip install yourlib==1.2.0 in a fresh environment
6. Create a GitHub Release: add release notes from CHANGELOG.md
Pro Tips
- Always ship both sdist and wheel for maximum compatibility
- Test on test.pypi.org first before production PyPI
- Use skip-existing: true to avoid errors on re-runs
- Include README.md and LICENSE in your package
15. Maintenance & Operations
Long-term project sustainability
Sustainable open-source projects need clear policies, community guidelines, and operational procedures. Establish these early to create a welcoming, productive environment for contributors and users.
Essential Project Files
Issue Templates
Create structured templates for bug reports and feature requests:
.github/ISSUE_TEMPLATE/
├── bug_report.yml
├── feature_request.yml
└── config.yml
Pull Request Template
Checklist for contributors:
.github/pull_request_template.md
- [ ] Tests pass
- [ ] Docs updated
- [ ] Changelog entry added
- [ ] Type hints included
Code of Conduct
Use Contributor Covenant to establish community standards and behavior expectations.
SECURITY.md
Document security vulnerability reporting process and supported versions.
Support Policy
Python Version Support
| Python Version | Status | EOL Date |
|---|---|---|
| 3.13 | Supported | Oct 2029 |
| 3.12 | Supported | Oct 2028 |
| 3.11 | Supported | Oct 2027 |
| 3.10 | Supported | Oct 2026 |
| 3.9 | Security fixes only | Oct 2025 |
Policy: Support all non-EOL Python versions. Drop support 6 months after official EOL.
Security & Privacy
Security Policy
- Private disclosure: security@yourlib.org email for vulnerabilities
- Response time: acknowledge within 48 hours, patch within 7 days for critical issues
- GitHub Security Advisories: use for coordinated disclosure
- Telemetry policy: no telemetry by default. If added, make it opt-in and transparent
Roadmap & Governance
Project Governance
- Maintainers: Core team with commit access
- Release managers: Authorized to cut releases
- Review SLA: First response within 3 days for PRs
- Decision process: Consensus-seeking; maintainer vote for deadlocks
Public Roadmap
Maintain ROADMAP.md with planned features and timeline. Use GitHub Projects for tracking progress.
16. Performance & Profiling
Optimize what matters with data-driven insights
Profile before optimizing. Measure actual performance bottlenecks with profiling tools, then optimize with data-driven decisions. Use micro-benchmarks for hot paths and macro profilers for system-level analysis.
Micro-Benchmarking
pytest-benchmark
Track performance of hot paths with statistical analysis:
def test_benchmark_sum_pure(benchmark):
    data = list(range(10000))
    result = benchmark(sum, data)
    assert result == 49995000

def test_benchmark_sum_native(benchmark):
    data = list(range(10000))
    result = benchmark(sum_vec, data)  # Native impl
    assert result == 49995000
Compare: Run both benchmarks to measure speedup from native code
Macro Profiling Tools
py-spy
Sampling profiler that doesn't require code changes. Works on running processes.
py-spy record -o profile.svg -- python app.py
py-spy top --pid 12345
scalene
CPU + GPU + memory profiler with per-line attribution. Shows what's slow and why.
scalene yourscript.py
# Opens browser with results
line_profiler
Line-by-line profiling for specific functions. Add @profile decorator.
kernprof -l -v yourscript.py
cProfile + snakeviz
Built-in profiler with visualization. Good for call graphs.
python -m cProfile -o output.prof script.py
snakeviz output.prof
Memory Profiling
tracemalloc (Built-in)
import tracemalloc

tracemalloc.start()
# Your code here
snapshot = tracemalloc.take_snapshot()
top_stats = snapshot.statistics('lineno')
for stat in top_stats[:10]:
    print(stat)
memray
memray run yourscript.py
memray flamegraph output.bin
Memory profiler with native stack traces. Excellent for finding leaks.
Performance Best Practices
- Choose data structures carefully: dict/set O(1) vs list O(n) lookups matter at scale
- Avoid unnecessary object churn: reuse buffers, use generators where possible
- Use NumPy for numerical work: vectorized ops are 10-100x faster than pure Python loops
- Profile before writing native code: Python is often "fast enough" with proper algorithms
- For data buffers: use the buffer protocol, memoryview, or NumPy arrays (see the sketch below)
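As referenced in the last bullet, a small sketch of the buffer-protocol idea: memoryview slices give zero-copy windows over a large bytes object (the checksum function is purely illustrative):
# Sketch: memoryview slices avoid copying large byte buffers
def checksum(data: bytes, chunk: int = 1 << 20) -> int:
    total = 0
    view = memoryview(data)  # zero-copy window over the buffer
    for start in range(0, len(view), chunk):
        # Each slice is another view, not a copy of the bytes
        total = (total + sum(view[start:start + chunk])) % (1 << 32)
    return total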
17. Packaging Pitfalls & Solutions
Avoid common traps in cross-platform packaging
Learn from others' mistakes. These are the most common packaging issues that plague Python libraries, especially those with native extensions.
Locale/Encoding Issues
Problem: Files opened without explicit encoding fail on non-UTF-8 systems (Windows, some Linux).
Bad:
with open("file.txt") as f:
    data = f.read()
Good:
with open("file.txt", encoding="utf-8") as f:
    data = f.read()
Tests Not in Sdist
Problem: Downstream packagers can't run tests from source distribution.
Solution: Ensure tests are included via MANIFEST.in or pyproject.toml include rules.
RPATH/DYLIB Hell
Problem: Native extensions can't find shared libraries at runtime.
- Linux: use auditwheel repair (automatic with cibuildwheel)
- macOS: use delocate-wheel to bundle dylibs
- Windows: use delvewheel to bundle DLLs
Incompatible NumPy ABI
Problem: Wheels built against new NumPy don't work with older versions.
Solution: Pin build dependency to oldest supported NumPy for that Python version. Use oldest-supported-numpy package.
glibc Too New
Problem: Linux wheels built on recent systems don't work on older distributions.
Solution: Use manylinux Docker images for building. cibuildwheel handles this automatically.
MSVC Runtime Issues
Problem: Windows builds fail due to missing Visual C++ runtime.
Solution: Let wheels pull the right CRT. Don't statically link unless absolutely necessary. Use official Python.org builds for consistency.
Vendored Dependencies
Problem: Vendored OpenSSL/zlib/etc. become security liabilities and licensing nightmares.
Solution: Avoid vendoring if possible. If required, keep updated, document licenses clearly, and consider static analysis for vulnerabilities.
18. Hybrid Projects (Pure Python + Optional Native)
Best of both worlds: portability + performance
Hybrid projects provide pure Python fallbacks with optional native acceleration. Users get a working library everywhere, with automatic speedups when native wheels are available.
Repository Structure
yourlib/
├── src/yourlib/
│   ├── __init__.py
│   ├── core.py          # Pure Python implementation
│   └── _fast.py         # Wrapper that tries native import
└── yourlib-rust/        # Optional Rust extension (colocated)
    ├── Cargo.toml
    └── src/lib.rs
Implementation Pattern
Python Wrapper (_fast.py)
# Try to import native implementation
try:
    from yourlib_rust import add as _fast_add
    from yourlib_rust import sum_vec as _fast_sum
    HAS_NATIVE = True
except ImportError:  # pragma: no cover
    _fast_add = None
    _fast_sum = None
    HAS_NATIVE = False

def add(a: int, b: int) -> int:
    """Add two integers (uses native if available)."""
    if _fast_add is not None:
        return _fast_add(a, b)
    return a + b  # Pure Python fallback

def sum_vec(items: list[int]) -> int:
    """Sum a list of integers."""
    if _fast_sum is not None:
        return _fast_sum(items)
    return sum(items)  # Pure Python fallback
Public API (__init__.py)
from yourlib._fast import add, sum_vec, HAS_NATIVE
__all__ = ["add", "sum_vec", "HAS_NATIVE"]
Distribution Strategies
Strategy 1: Two Packages
Ship separate distributions:
- yourlib (pure Python, always works)
- yourlib-rust (native wheels for speedup)
Users: pip install yourlib-rust for acceleration
Strategy 2: Single Package
One package with optional native:
- Ship both sdist and native wheels
- Falls back to pure Python if no wheel
- Automatic acceleration
Requires managing a larger wheel matrix
19. Local Development Workflow
Fast iteration and testing locally
A smooth local development workflow is critical for productivity. Use modern tools for fast environment setup, instant feedback, and efficient testing.
Quick Setup
# Clone and setup (30 seconds)
git clone https://github.com/you/yourlib
cd yourlib
# Create virtual environment with uv (blazing fast)
uv venv
source .venv/bin/activate # or .venv\Scripts\activate on Windows
# Install in editable mode with dev dependencies
uv pip install -e ".[dev,docs]"
# Install pre-commit hooks
pre-commit install
Fast Iteration Loop
Code Quality Checks
# Format code (auto-fix)
ruff format .
# Lint and fix issues
ruff check . --fix
# Type check
mypy src
Testing
# Run tests quickly
pytest -q
# With coverage
pytest --cov=yourlib
# Watch mode (pytest-watch)
ptw
Documentation
# Live reload (auto-refresh browser)
mkdocs serve
# Build static site
mkdocs build
All-in-One
# Run all checks with nox
nox
# Specific session
nox -s tests
nox -s lint
Native Extension Development
Rust (Maturin)
# Develop mode (fast rebuilds)
maturin develop
# Release mode (optimized)
maturin develop --release
C/C++ (scikit-build-core)
# Editable install rebuilds on import
pip install -e . --no-build-isolation
# Force rebuild
pip install -e . --no-build-isolation --force-reinstall
20. Release Runbook
Step-by-step release process
Follow this checklist for every release to ensure consistent, reliable deployments to PyPI with proper versioning and documentation.
Complete Release Checklist
Update News Fragments
Create changelog entries for all changes since last release:
towncrier create 123.feature.md
towncrier create 124.bugfix.md
Bump Version
Update version following semver:
hatch version minor # or patch/major
# Or manually edit pyproject.toml
Build Changelog
Generate CHANGELOG.md from fragments:
towncrier build --version 1.2.0
Update Documentation
Review and update docs, examples, and README if needed
Commit Changes
git add .
git commit -m "Release v1.2.0"
git push origin main
Create and Push Tag
git tag v1.2.0
git push origin v1.2.0
This triggers GitHub Actions to build and publish
Monitor CI/CD
Watch GitHub Actions workflow complete:
- Wheels built for all platforms
- Tests pass on all Python versions
- Package published to PyPI
Verify Installation
Test on fresh environments:
pip install yourlib==1.2.0
python -c "import yourlib; print(yourlib.__version__)"
Create GitHub Release
Add release notes from CHANGELOG.md to GitHub Releases page
Announce Release
Share on social media, mailing lists, or relevant communities
21. Best Practices & Final Tips
Essential recommendations for production libraries
These battle-tested recommendations will save you countless hours and prevent common pitfalls. Apply them from day one for a maintainable, professional Python library.
Architecture
- Default to src-layout to avoid import issues
- Keep the public API tiny; re-export from a single __init__.py
- Avoid global state; make thread-safe where relevant
- Prefer functions or thin classes over deep hierarchies
Type Safety
- Provide typing stubs for compiled modules (a minimal stub sketch follows below)
- Run mypy/pyright in strict mode in CI
- Include the py.typed marker for PEP 561 compliance
- Export protocols for duck-typed interfaces
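A minimal stub sketch for a compiled module, as mentioned above; the _native module and signatures are hypothetical, and the .pyi file ships alongside the binary so type checkers see real signatures:
# src/yourlib/_native.pyi -- hypothetical stub for the compiled module
def add(a: int, b: int) -> int: ...
def sum_vec(xs: list[int]) -> int: ...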
Performance
- Add a feature flag to toggle native vs pure Python
- Bench before writing native code; Python is often fast enough
- For data buffers, prefer the buffer protocol/memoryview
- Use NumPy interop for numerical operations
Testing
- Snapshot tests for text/markdown/json outputs (see the sketch below)
- Continuous fuzzing with OSS-Fuzz/atheris
- Test across all target platforms in CI
- Include backwards-compatibility tests
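A sketch of the snapshot-test idea from the first bullet; render_report and the snapshot path are hypothetical:
# Sketch of a snapshot test: compare output against a checked-in file
import pathlib
from yourlib import render_report  # hypothetical text-producing function

SNAPSHOT = pathlib.Path(__file__).parent / "snapshots" / "report.md"

def test_report_matches_snapshot():
    actual = render_report({"name": "demo", "count": 3})
    expected = SNAPSHOT.read_text(encoding="utf-8")
    # Regenerate the snapshot deliberately when the output is meant to change
    assert actual == expected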
Distribution
- Ship a CLI via project.scripts for quick demos
- Always include both sdist and wheels
- Test installation from PyPI before announcing
- Provide clear migration guides for breaking changes
Documentation
- Maintain a cookbook with copy-paste examples
- Document failure modes and error handling
- Keep examples in sync (run them in CI)
- Provide architecture diagrams for complex projects
Golden Rules
- 1. Start simple, optimize later: Pure Python first, native code only for proven bottlenecks
- 2. Test everything: If it's not tested, it's broken, especially across platforms
- 3. Document relentlessly: Future you (and your users) will thank you
- 4. Respect semver: Breaking changes need MAJOR version bumps, no exceptions
- 5. Automate everything: CI/CD, releases, changelog, quality checks, no manual steps
Resources & Next Steps
Documentation
- Python Packaging Guide - packaging.python.org
- PyPA Specifications - PEPs 517, 518, 621, 660
- Trusted Publishing - docs.pypi.org/trusted-publishers
Essential Tools
- UV - github.com/astral-sh/uv
- Ruff - github.com/astral-sh/ruff
- cibuildwheel - cibuildwheel.pypa.io
Native Extensions
- PyO3 - pyo3.rs
- pybind11 - pybind11.readthedocs.io
- Maturin - maturin.rs
Ready to Build?
You now have everything you need to create production-ready, cross-platform Python libraries.
Start with pure Python, test thoroughly, document extensively, and ship with confidence!
Copy what you need. Ship it. Iterate.
The Python community is waiting for your library!