🧢 EnvKnit


In-process multi-version isolation for Python: use conflicting package versions in the same process, without subprocesses or virtual environments.

EnvKnit solves a problem that venv, uv, and pip cannot: loading multiple versions of the same package simultaneously inside a single Python process. Instead of spinning up a subprocess or maintaining separate virtual environments, you declare which version you need at the call site and EnvKnit routes imports accordingly.

import envknit

with envknit.use("requests", "2.28.2"):
    import requests
    legacy_response = requests.get(url)  # uses 2.28.2

with envknit.use("requests", "2.31.0"):
    import requests
    new_response = requests.get(url)    # uses 2.31.0

⚠️ Experimental: EnvKnit intentionally bypasses Python's "one module per process" singleton rule. This breaks isinstance checks across version boundaries and is unsuitable for production use without understanding the constraints. See caveats.


✨ What EnvKnit Does

The core problem

Tools like uv and venv manage environments: each environment gets one version of a package. If script A needs numpy==1.26 and script B needs numpy==2.0, you maintain two separate environments and run them as separate processes.

EnvKnit takes a different approach: install all versions to a global store, route imports at runtime.

What this enables

  • Multiple conflicting versions in one process: load numpy==1.26 and numpy==2.0 in the same Python session, controlled by ContextVar-scoped import routing.
  • No virtual-environment overhead: packages live in ~/.envknit/packages/, shared across all projects. No gigabytes of duplicated .venv folders.
  • Version-pinned environments in one config: define multiple environments with conflicting dependencies in a single envknit.yaml, which is impossible with uv dependency groups.
  • Hard isolation via sub-interpreters: on Python 3.12+, spawn a true C-API sub-interpreter (PEP 684) with its own sys.modules for packages that can't share global state.

What this is NOT

EnvKnit is not a replacement for uv as a general-purpose package manager. If you need:

  • Standard virtual environments → use uv venv
  • Fast dependency resolution → use uv pip compile
  • Simple per-project isolation → use uv or pip

EnvKnit's CLI (lock, install, run) is a thin wrapper that prepares packages for the Python library. The Python library is the product.


🚀 Quick Start

1. Install

# CLI binary (Linux)
curl -L https://github.com/wgsim/EnvKnit/releases/latest/download/envknit-linux-amd64 -o envknit
chmod +x envknit && sudo mv envknit /usr/local/bin/

# Python library
pip install envknit  # Requires Python 3.10+

2. Declare and install versions

# Initialize project config
envknit init

# Add package versions to environments
envknit add "requests==2.28.2"
envknit add "requests==2.31.0" --env new

# Resolve and install all versions to global store
envknit lock
envknit install
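
After these commands, the project config might look roughly like the sketch below. The field names are an illustrative guess, not the actual schema; see the Config Schema reference for the real fields.

```yaml
# Hypothetical envknit.yaml (field names are illustrative)
environments:
  default:
    dependencies:
      - requests==2.28.2
  new:
    dependencies:
      - requests==2.31.0
```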

3. Use conflicting versions in-process

import envknit

# Route imports to specific installed versions
with envknit.use("requests", "2.28.2"):
    import requests
    print(requests.__version__)  # 2.28.2

with envknit.use("requests", "2.31.0"):
    import requests
    print(requests.__version__)  # 2.31.0

4. Run scripts with environment injection

# Inject environment packages into PYTHONPATH
envknit run -- python app.py
envknit run --env ml -- python train.py
envknit run --no-dev -- python -m pytest
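
Conceptually, envknit run amounts to prepending store paths to PYTHONPATH before launching the command. A toy reproduction of that idea, using a throwaway directory in place of the real ~/.envknit store (whose exact layout is an assumption here):

```shell
# Simulate PYTHONPATH injection with a throwaway "store" directory.
store=$(mktemp -d)
mkdir -p "$store/mypkg/1.0"
echo '__version__ = "1.0"' > "$store/mypkg/1.0/mypkg.py"

# Equivalent in spirit to: envknit run -- python -c '...'
PYTHONPATH="$store/mypkg/1.0" python3 -c 'import mypkg; print(mypkg.__version__)'
# prints 1.0
```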

🔬 Isolation Strategies

Which strategy to use?

| Your situation | Recommended API | Notes |
| --- | --- | --- |
| Pure-Python package, multi-version in one process | use() | Fastest, zero overhead |
| C-extension package (numpy, torch, pandas) | use(auto_worker=True) or worker() | Subprocess IPC via proxy |
| Strict global-state isolation (Python 3.12+) | SubInterpreterEnv | Hard isolation; C-ext requires worker() fallback |
| Threading with versioned imports | ContextThread / ContextExecutor | Propagate ContextVar state to threads |

Gen 1: Soft Isolation (use())

Routes imports via ContextVar. Zero subprocess overhead. Works for pure-Python packages.

import envknit
envknit.enable()

with envknit.use("requests", "2.28.2"):
    import requests
    print(requests.__version__)  # 2.28.2
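
EnvKnit's real import hook is internal to the library, but the ContextVar-routing idea can be sketched with only the standard library: a meta-path finder consults a ContextVar to decide which directory satisfies an import. Everything below (VersionRouter, _active_dir, mypkg) is illustrative, not EnvKnit's API:

```python
import contextvars
import importlib
import importlib.machinery
import pathlib
import sys
import tempfile

# ContextVar holding the directory that should satisfy imports right now.
_active_dir = contextvars.ContextVar("active_dir", default=None)

class VersionRouter:
    """Meta-path finder: resolve imports from the active version directory."""
    def find_spec(self, name, path=None, target=None):
        root = _active_dir.get()
        if root is None:
            return None  # defer to the normal import machinery
        return importlib.machinery.PathFinder.find_spec(name, [root])

sys.meta_path.insert(0, VersionRouter())

# Build two fake "installed versions" of the same module in a temp store.
store = pathlib.Path(tempfile.mkdtemp())
for ver in ("1.0", "2.0"):
    (store / ver).mkdir()
    (store / ver / "mypkg.py").write_text(f"__version__ = {ver!r}\n")

token = _active_dir.set(str(store / "1.0"))
v1 = importlib.import_module("mypkg").__version__
_active_dir.reset(token)
del sys.modules["mypkg"]  # force re-resolution under new routing

token = _active_dir.set(str(store / "2.0"))
v2 = importlib.import_module("mypkg").__version__
print(v1, v2)  # 1.0 2.0
```

Note the explicit del sys.modules["mypkg"]: a real router also has to keep per-version module caches so each context sees its own copy.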

C-extension packages (numpy, torch, pandas) share C-level global state and cannot be loaded as two versions simultaneously. Use auto_worker=True to fall back transparently to a subprocess:

# auto_worker=True: tries in-process first, falls back to worker() for C-ext
with envknit.use("numpy", "1.26.4", auto_worker=True) as np:
    print(np.__version__)  # 1.26.4 (via subprocess proxy if C-ext)

with envknit.use("numpy", "2.0.0", auto_worker=True) as np:
    print(np.__version__)  # 2.0.0 (separate worker process)

isinstance caveat: objects returned across version boundaries are not the same type. Use primitive types (dict, list, str, int) or DTOs to pass data between versions. Avoid passing version-specific objects directly.
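
This failure mode is easy to reproduce without EnvKnit: the same class definition loaded twice yields two distinct type objects, which is exactly what happens across routed version boundaries. The module and class names below are illustrative:

```python
import importlib.util
import pathlib
import tempfile

# One source file defining a Response class, loaded twice as two modules;
# this mirrors how two routed versions of a package coexist in memory.
src = pathlib.Path(tempfile.mkdtemp()) / "resp.py"
src.write_text("class Response:\n    pass\n")

def load(mod_name):
    spec = importlib.util.spec_from_file_location(mod_name, src)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

old, new = load("resp_old"), load("resp_new")
obj = old.Response()
print(isinstance(obj, new.Response))  # False: two distinct classes
print(isinstance(obj, old.Response))  # True

# Safe exchange format: primitives only, valid in any version context.
payload = {"status": 200, "body": "ok"}
```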

Subprocess Isolation (worker())

Direct subprocess worker pool. Use when you always want subprocess isolation regardless of package type.

with envknit.worker("numpy", "1.26.4") as np:
    arr = np.array([1, 2, 3]).tolist()  # serialize primitives back to host
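
The worker mechanism can be approximated with a plain subprocess: execute the code in a fresh interpreter whose sys.path starts with the version's directory, and ship only JSON-serializable values back. worker_eval below is a hypothetical helper sketching the idea, not EnvKnit's API:

```python
import json
import subprocess
import sys

def worker_eval(version_dir, code):
    """Run `code` (which must assign `result`) in a fresh interpreter with
    `version_dir` prepended to sys.path; return `result` via JSON."""
    prog = (
        "import sys, json\n"
        f"sys.path.insert(0, {version_dir!r})\n"
        "ns = {}\n"
        f"exec({code!r}, ns)\n"
        "print(json.dumps(ns['result']))\n"
    )
    proc = subprocess.run(
        [sys.executable, "-c", prog],
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)

print(worker_eval("", "result = sum([1, 2, 3])"))  # 6
```

The JSON round-trip is the whole point: only primitives cross the process boundary, which is why worker() hands back serializable values rather than live objects.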

Thread Context Propagation (ContextThread, ContextExecutor)

threading.Thread does not inherit ContextVar state by default. EnvKnit provides opt-in wrappers:

from envknit import ContextThread, ContextExecutor

with envknit.use("requests", "2.28.2"):
    # Snapshots context at __init__ time
    t = ContextThread(target=worker_fn)
    t.start()

    # Snapshots context at submit() time
    with ContextExecutor(max_workers=4) as pool:
        future = pool.submit(worker_fn)
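
Both wrappers can be approximated with the stdlib primitive contextvars.copy_context(). SnapshotThread below is a hypothetical stand-in for ContextThread, snapshotting at __init__ time as described above:

```python
import contextvars
import threading

version = contextvars.ContextVar("version", default="unset")

class SnapshotThread(threading.Thread):
    """Thread that runs its target inside the context captured at __init__."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._ctx = contextvars.copy_context()  # snapshot now, not at start()
    def run(self):
        self._ctx.run(super().run)

seen = []
version.set("2.28.2")

plain = threading.Thread(target=lambda: seen.append(version.get()))
plain.start(); plain.join()  # new threads start with a fresh context

snap = SnapshotThread(target=lambda: seen.append(version.get()))
snap.start(); snap.join()    # sees the snapshotted value
print(seen)  # ['unset', '2.28.2']
```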

Gen 2: Hard Isolation (SubInterpreterEnv, Python 3.12+)

Spawns a true C-API sub-interpreter (PEP 684) with its own independent sys.modules, sys.path, and GIL. Host site-packages are never visible inside the sub-interpreter.

from envknit import SubInterpreterEnv, CExtIncompatibleError

with SubInterpreterEnv("ml") as interp:
    interp.configure_from_lock("envknit.lock.yaml", env_name="ml")

    # Pure-Python packages work directly
    result = interp.eval_json("""
import some_lib
result = {"version": some_lib.__version__}
""")

    # C-extension packages need explicit fallback to worker()
    try:
        interp.try_import("numpy", raise_on_cext=True)
        result = interp.eval_json("import numpy; result = numpy.__version__")
    except CExtIncompatibleError:
        import envknit
        with envknit.worker("numpy", "1.26.4") as np:
            result = np.__version__

See the Gen 2 Hard Isolation Guide for DTO patterns and serialization constraints.


⚠️ Caveats

EnvKnit intentionally breaks Python's "one module per process" assumption:

  • isinstance checks fail across version boundaries: an object from requests==2.28.2 is not an instance of requests.Response from 2.31.0. Workaround: exchange only primitives (dict, list, int, str) or version-neutral DTOs between version contexts.
  • C-extension singletons are not isolated: packages like numpy, pandas, and torch share C-level global state. Use use(auto_worker=True) or worker() for subprocess isolation.
  • Sub-interpreters require CPython 3.12+: SubInterpreterEnv raises UnsupportedPlatformError on 3.10/3.11 or non-CPython runtimes.
  • C-extensions cannot load inside SubInterpreterEnv: use try_import(raise_on_cext=True) and catch CExtIncompatibleError to detect and fall back to worker().
  • Not for production use without full understanding of these constraints.

🐍 Python Version Compatibility

| Feature | Python 3.10 | Python 3.11 | Python 3.12 | Python 3.13+ |
| --- | --- | --- | --- | --- |
| envknit.use() (pure Python) | ✅ | ✅ | ✅ | ✅ |
| envknit.use(auto_worker=True) | ✅ | ✅ | ✅ | ✅ |
| envknit.worker() (subprocess) | ✅ | ✅ | ✅ | ✅ |
| ContextThread / ContextExecutor | ✅ | ✅ | ✅ | ✅ |
| SubInterpreterEnv (hard isolation) | ❌ | ❌ | ✅ CPython only | ✅ CPython only |
| C-ext inside SubInterpreterEnv | ❌ | ❌ | ❌ use worker() | ❌ use worker() |

SubInterpreterEnv requires CPython: it will not work on PyPy, GraalPy, or CPython builds configured with --disable-gil/--without-threads.


📦 CLI Reference

The CLI prepares packages for the Python library.

envknit init          # Create envknit.yaml
envknit add <pkg>     # Add a package requirement
envknit lock          # Resolve and write envknit.lock.yaml (via uv)
envknit install       # Install locked packages to global store (via uv)
envknit run -- <cmd>  # Run command with environment injected into PYTHONPATH
envknit verify        # Verify installed packages match lock file hashes
envknit doctor        # Check installation health
envknit store         # Inspect global package store

Requires uv (v0.2.0+).


📚 Documentation

Guides

| Document | Description |
| --- | --- |
| 🚀 Getting Started | Installation, first run, and a 20-minute tutorial. |
| 🔄 Migration Guide | How to move from requirements.txt, Poetry, or venv to EnvKnit. |
| 🧠 Architecture & Concepts | How the global store, PYTHONPATH, and import hook work. |
| 💻 CLI Scripts | How to run pytest, black, mypy, etc., with envknit run. |
| 🐍 Python Version | Using python_version with mise/pyenv. |
| 🟢 Node Version | Using node_version with fnm/nvm/mise. |
| 🔌 Python API | Deep dive into use(), worker(), and configure_from_lock(). |
| 🛡️ Gen 2 Hard Isolation | Using Python 3.12+ sub-interpreters for strict global-state isolation. |
| 🌍 Environments | Managing multiple environments (default, ml, dev). |
| ⚙️ CI Integration | Setting up EnvKnit in GitHub Actions. |
| 🛠️ Troubleshooting & FAQ | Solutions for common errors, C-extensions, and CLI path issues. |

Reference

| Document | Description |
| --- | --- |
| ⌨️ CLI Reference | Complete CLI command reference. |
| 📝 Config Schema | envknit.yaml and global config fields. |
| 🔒 Lock Schema | envknit.lock.yaml structure. |

🤝 Contributing

EnvKnit is built with Rust and Python.

git clone https://github.com/wgsim/EnvKnit.git
cd EnvKnit

# Test the Rust CLI
cargo test

# Test the Python runtime library
pip install -e ".[dev]"
python -m pytest

📄 License

EnvKnit is distributed under the MIT License.
