This repository holds common scripts for use in Python repositories. The main script of note is `setup_python_app.sh`, which performs all of the actions needed to set up a developer workspace for a Python project.
The root setup file in this repository is only a local helper for working on utility-repo-scripts itself. It is not the supported downstream entrypoint and should not be treated as part of the consumer-facing interface for repositories that add this project as a submodule.
In order to use setup_python_app.sh, you will need the following:
Optional but recommended:
- Node.js if you want the `prettier`/`sort-json` formatting steps and related `pre-commit` hooks to work locally when those tools are not already installed
  - Mac/Linux Node.js version management: nvm
This repo should be added to another repo as a submodule:

```shell
git submodule add -b main https://github.com/mrlonis/utility-repo-scripts.git
git commit -m "Adding utility-repo-scripts"
```

This will add the utility-repo-scripts repository as a submodule in the utility-repo-scripts folder within your project. Submodules are not cloned by default, so you should add a step to your project's setup script to initialize it if it wasn't cloned already.
In the examples below, setup means a script in your downstream project, not this repository's own root setup file. Downstream repositories should source setup_python_app.sh; that is the supported entrypoint intended for reuse.
This downstream setup script should work for most projects:
```shell
#!/bin/bash
git submodule update --init --remote --force
source utility-repo-scripts/setup_python_app.sh --package_manager="pip"
```

To override the default Python version, pass `--python_version` with any version string supported by `pyenv install`, for example `--python_version="3.12.9"`. The setup script uses that Python version to create a project-local virtual environment at `.venv/`.
The setup_python_app.sh script accepts a few flags to customize the setup process:
Note: For options that take 0/1, 0 is False and 1 is True. --debug/-d is a presence-only flag, so including it enables debug output.
| Flag | Description | Default | Valid Values |
| --- | --- | --- | --- |
| `-d` or `--debug` | Enables debug echo statements when the flag is present | False | Presence-only flag; omit to leave debug disabled |
| `-r` or `--rebuild_venv` | Controls whether the virtual environment should be deleted and re-created | 0 | 0, 1 |
| `--python_version` | Specifies which Python version pyenv should install and use when creating the project-local `.venv` | 3.14.3 | Any non-empty `pyenv install` version string |
| `--package_manager` | Specifies which package manager to use | poetry | pip, pip-tools, poetry, uv-pip, uv |
| `--is_package` | Specifies whether or not the project is a package | False | |
| `--include_jumanji_house` | Specifies whether or not to include the jumanjihouse pre-commit hooks | True | |
| `--include_prettier` | Specifies whether or not to include the prettier pre-commit hooks | True | |
| `--include_isort` | Specifies whether or not to include the isort pre-commit hooks | True | |
| `--isort_profile` | Specifies which isort profile to use | black | Any valid isort profile |
| `--python_formatter` | Specifies which Python formatter to use | black | "", autopep8, black |
| `--pylint_enabled` | Specifies whether or not to enable pylint | True | |
| `--flake8_enabled` | Specifies whether or not to enable flake8 | True | |
| `--mypy_enabled` | Specifies whether or not to enable mypy | True | |
| `--pytest_enabled` | Specifies whether or not to enable pytest | True | |
| `--unittest_enabled` | Specifies whether or not to enable unittest | False | |
| `--pre_commit_autoupdate` | Runs `pre-commit autoupdate` after installing hooks | False | |
| `--overwrite_vscode_launch` | Overwrites an existing `.vscode/launch.json`; a missing file is created automatically from `.vscode/launch.sample.json` | False | |
| `--line_length` | Specifies the line length to use for various settings | 120 | Any positive integer |
Example semantics: use `--debug` to turn debug output on, and use `--rebuild_venv=1` to force a rebuild or `--rebuild_venv=0` to leave rebuild behavior off.
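The flag semantics above can be pictured with a simplified parsing loop. This is an illustration of the conventions, not the real parser inside `setup_python_app.sh`:

```shell
#!/bin/bash
# Simplified illustration of the flag conventions described above; this is
# not the actual parser in setup_python_app.sh.
parse_flags() {
    debug=0
    rebuild_venv=0
    for arg in "$@"; do
        case "$arg" in
            -d | --debug) debug=1 ;;                      # presence-only: including it enables debug
            --rebuild_venv=*) rebuild_venv="${arg#*=}" ;; # 0 is False, 1 is True
        esac
    done
    echo "debug=$debug rebuild_venv=$rebuild_venv"
}

parse_flags -d --rebuild_venv=1 # prints: debug=1 rebuild_venv=1
```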
`setup_python_app.sh` always runs `pre-commit install` when a `.pre-commit-config.yaml` file exists. `pre-commit autoupdate` is opt-in and only runs when `--pre_commit_autoupdate` is enabled.

`setup_python_app.sh` defaults to Python 3.14.3. This repository's local-only setup wrapper also defaults to 3.14.3 and forwards additional CLI arguments to `setup_python_app.sh`, so commands like `./setup 1 --python_version=3.12.9` rebuild the virtual environment with that Python version when you need an override while working on this repository itself.
Formatting quirk: whenever the script calls `prettier_format` or `json_sort` (for example when formatting `.prettierrc`, `.pre-commit-config.yaml`, or `.vscode/settings.json`), missing `prettier` or `sort-json` binaries are installed globally with `npm install -g` if npm is available. If Node.js/npm is unavailable, the setup still completes and simply skips those formatting steps. `--include_prettier` only controls the optional Prettier-specific pre-commit hook fix later in the script. This is intentional for this personal workflow.
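The npm fallback described above can be sketched as follows. `prettier_format` here is a simplified stand-in for the script's helper, and the install-on-demand logic shown is an assumption, not the script's exact code:

```shell
#!/bin/bash
# Hedged sketch of the npm fallback described above; the real helper in
# setup_python_app.sh may differ.
prettier_format() {
    if ! command -v npm >/dev/null 2>&1; then
        # No Node.js/npm: skip formatting rather than failing the setup.
        echo "npm not found; skipping prettier formatting of $1"
        return 0
    fi
    # Install prettier globally only when it is missing, then format the file.
    command -v prettier >/dev/null 2>&1 || npm install -g prettier
    prettier --write "$1"
}
```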
Version 1 of the setup script uses pip to manage the Python dependencies. Note that `--package_manager` defaults to poetry, so pip must be requested explicitly. To use version 1 of the setup script, add the following to your setup script:
```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="pip" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

We can separate our requirements into different files to make them easier to manage. For example, a requirements-dev.txt file can contain the dependencies needed for development and testing, a requirements-test.txt file the dependencies needed for testing, and a requirements.txt file the dependencies needed for production.
```text
# requirements-dev.txt
-r requirements-test.txt
pre-commit
```

```text
# requirements-test.txt
-r requirements.txt
pylint
```

```text
# requirements.txt
pydantic[dotenv]
```
Version 1 of the setup script looks for the above files in order: requirements-dev.txt first, then requirements-test.txt, then requirements.txt.
If you wish to have dev dependencies but not test dependencies, you can create a requirements-dev.txt file that points the -r flag to requirements.txt instead of requirements-test.txt.
If you wish to only have test dependencies, you can create a requirements-test.txt file that points the -r flag to requirements.txt and remove the requirements-dev.txt file from the repo.
As shown in the example files above, if you run `./setup` in your project, the resulting command the setup script runs is `pip install -r requirements-dev.txt`, which installs all the dependencies in requirements-dev.txt, requirements-test.txt, and requirements.txt.
This is because the `-r requirements-test.txt` line in requirements-dev.txt and the `-r requirements.txt` line in requirements-test.txt ensure that the dependencies in those files are also installed when `pip install -r requirements-dev.txt` is called.
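The `-r` chaining can be illustrated with a small helper that recursively expands nested requirements files. This helper is purely illustrative and is not part of the setup script:

```shell
#!/bin/bash
# Illustrative helper (not part of the setup script): recursively expand the
# -r references in a requirements file to list every package pip would install.
expand_requirements() {
    while IFS= read -r line; do
        case "$line" in
            -r\ *) expand_requirements "${line#-r }" ;; # follow the nested requirements file
            '' | \#*) ;;                                # skip blank lines and comments
            *) echo "$line" ;;                          # a concrete dependency
        esac
    done <"$1"
}
```

Running `expand_requirements requirements-dev.txt` against the example files above prints `pydantic[dotenv]`, `pylint`, and `pre-commit`, since each nested file is expanded as soon as its `-r` line is read.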
Version 2 of the setup script uses pip-tools to manage the python dependencies. To use version 2 of the setup script, modify your setup script to the following:
```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="pip-tools" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

To ensure your project will work with version 2 of the setup script, first set up at least a requirements.in file. This file contains the direct dependencies of your project and, for now, should at least contain pip-tools.
requirements.in:

```text
pip-tools
```

Since the newly created virtual environment might not have pip-tools:

- Run `pip install pip-tools`
- Then run `pip-compile -U` to build the requirements.txt file
You can now run `./setup` to set up your project.
We can separate our requirements into different files to make them easier to manage. For example, a requirements-dev.in file can contain the dependencies needed for development and testing, a requirements-test.in file the dependencies needed for testing, and a requirements.in file the dependencies needed for production.
```text
# requirements-dev.in
-c requirements.txt
-c requirements-test.txt
pip-tools
pre-commit
```

```text
# requirements-test.in
-c requirements.txt
pylint
```

```text
# requirements.in
pydantic[dotenv]
```
We can then compile these files into their respective .txt files:
```shell
pip-compile -U --resolver=backtracking --strip-extras requirements.in
pip-compile -U --resolver=backtracking --strip-extras requirements-test.in
pip-compile -U --resolver=backtracking requirements-dev.in
```

Version 2 of the setup script looks for the compiled output of the above commands (i.e., requirements-dev.txt, requirements-test.txt, and requirements.txt).
- If dev, test, and prod requirements are found, the resulting command the setup script runs is `pip-sync requirements-dev.txt requirements-test.txt requirements.txt`.
- If only dev and prod requirements are found, it runs `pip-sync requirements-dev.txt requirements.txt`.
- If only test and prod requirements are found, it runs `pip-sync requirements-test.txt requirements.txt`.
- If only prod requirements are found, it runs `pip-sync requirements.txt`.
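The file selection amounts to passing every compiled requirements file that exists, which can be sketched with a small helper. This is illustrative only, not the script's actual code:

```shell
#!/bin/bash
# Illustrative sketch of the file-selection behavior described above; the
# real script's implementation may differ.
build_sync_command() {
    local files=""
    # Append each compiled requirements file that actually exists, in order.
    for f in requirements-dev.txt requirements-test.txt requirements.txt; do
        [ -f "$f" ] && files="$files $f"
    done
    echo "pip-sync$files"
}
```

In a repository that only has requirements.txt, this prints `pip-sync requirements.txt`.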
The goal of Version 2 of the setup script is to make it faster to set up and install dependencies. A core part of this is not rebuilding the virtual environment every time the setup script is run. This is done by checking if the virtual environment destination directory already exists. If it does, we assume the environment is in a healthy state (i.e. the python and pip executables and other site-packages are intact) and speed up setup by running `pip-sync` against the requirements files in the repository. If the virtual environment does not exist, we create one prior to performing `pip-sync`.
Since the virtual environment is built and linked to the system executables at creation, we must rebuild the virtual environment whenever our system changes. One example that would prompt a rebuild is python being updated by brew: the previously linked python executable would no longer exist, leaving the virtual environment in an unhealthy state. To rebuild the virtual environment, run the setup script with the `--rebuild_venv` flag set to 1.
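The reuse-versus-rebuild decision above can be sketched as follows. The helper prints the chosen action instead of performing it, and the real script's health checks may be richer:

```shell
#!/bin/bash
# Hedged sketch of the decision described above; prints the chosen action
# instead of running the real venv/pip-sync commands.
decide_venv_action() {
    local rebuild="${1:-0}"
    if [ "$rebuild" = "1" ]; then
        echo "rebuild: delete .venv and create a fresh one"
    elif [ ! -d .venv ]; then
        echo "create: .venv is missing"
    else
        echo "reuse: keep .venv and run pip-sync against the requirements files"
    fi
}
```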
Alternatively, you can set the setup script to accept input for easy ad-hoc re-building. To do this, set your setup script to the following:
```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="pip-tools" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

To then easily rebuild the virtual environment on the fly without modifying the setup script, run:

```shell
./setup 1
```

To rebuild with a different Python version at the same time, run:

```shell
./setup 1 --python_version=3.12.9
```

The uv-pip mode uses uv in pip-compatible mode to manage Python dependencies from requirements files. To use this mode, modify your setup script to the following:
```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="uv-pip" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

To set up a project for the uv-pip mode, first install uv. To install uv, view the uv installation instructions.
This mode works best when your repository already has fully compiled requirements-dev.txt, requirements-test.txt, and requirements.txt files, similar to the pip-tools workflow.
The uv-pip mode looks for requirements-dev.txt, requirements-test.txt, and requirements.txt and synchronizes the environment using uv pip sync.
- If dev, test, and prod requirements are found, the resulting command the setup script runs is `uv pip sync requirements-dev.txt requirements-test.txt requirements.txt`.
- If only dev and prod requirements are found, it runs `uv pip sync requirements-dev.txt requirements.txt`.
- If only test and prod requirements are found, it runs `uv pip sync requirements-test.txt requirements.txt`.
- If only prod requirements are found, it runs `uv pip sync requirements.txt`.
Because this mode uses uv pip sync, the requirements files should fully describe the environment you want inside .venv. As with the other sync-oriented package manager modes, .venv is reused until you pass --rebuild_venv=1 or the environment is missing.
The uv mode uses uv in its native project mode to manage Python dependencies from pyproject.toml. To use this mode, modify your setup script to the following:

```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="uv" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

To set up a project for the uv mode, first install uv. To install uv, view the uv installation instructions.
This mode expects a pyproject.toml file at the root of the repository. You can create one with uv init, and then add dependencies with commands such as uv add <dependency>.
When --package_manager="uv" is selected, the setup script uses the shared project .venv and runs uv sync to install dependencies from pyproject.toml. The .venv folder is reused until you pass --rebuild_venv=1 or the environment is missing.
Version 3 of the setup script uses Poetry to manage the Python dependencies. To use version 3 of the setup script, modify your setup script to the following:

```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="poetry" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

To set up a project for version 3 of the setup script, we must first install Poetry. To install Poetry, view the Poetry installation instructions.
Once Poetry is installed, we can then run the following command to setup the project:
```shell
poetry init
```

This will create a pyproject.toml file in the root of the project. This file will contain the project's dependencies. To add a dependency, run the following command:

```shell
poetry add <dependency>
```

For more information about Poetry, view the Poetry documentation.
The goal of Version 3 of the setup script is to make it faster to set up and install dependencies. A core part of this is not rebuilding the virtual environment every time the setup script is run. This is done by checking if the virtual environment destination directory already exists. If it does, we assume the environment is in a healthy state (i.e. the python and pip executables and other site-packages are intact) and speed up setup by running `poetry install --sync` in the repository. If the virtual environment does not exist, we create one prior to performing `poetry install --sync`.
Since the virtual environment is built and linked to the system executables at creation, we must rebuild the virtual environment whenever our system changes. One example that would prompt a rebuild is python being updated by brew: the previously linked python executable would no longer exist, leaving the virtual environment in an unhealthy state. To rebuild the virtual environment, run the setup script with the `--rebuild_venv` flag set to 1.
Alternatively, you can set the setup script to accept input for easy ad-hoc re-building. To do this, set your setup script to the following:
```shell
#!/bin/bash
git submodule update --init --remote --force
rebuild_venv=$1
rebuild_venv="${rebuild_venv:-0}"
source utility-repo-scripts/setup_python_app.sh \
    --package_manager="poetry" \
    --rebuild_venv="$rebuild_venv" \
    --python_formatter="" \
    --pylint_enabled \
    --pytest_enabled \
    --overwrite_vscode_launch \
    --line_length=125
```

To then easily rebuild the virtual environment on the fly without modifying the setup script, run:

```shell
./setup 1
```

To rebuild with a different Python version at the same time, run:

```shell
./setup 1 --python_version=3.12.9
```

The `ensure_venv.sh` script is used by the generated local pylint pre-commit hook. Its job is to make sure a virtual environment is active before running a command, which is especially helpful when hooks are triggered from the VS Code Git UI or another shell that does not already have `VIRTUAL_ENV` set.
When no virtual environment is active, the script tries to activate a virtual environment in the following order:
1. `$PWD/.venv`
2. `$WORKON_HOME/<project_folder_name>`
3. `${PYENV_ROOT:-$HOME/.pyenv}/versions/<project_folder_name>`
This matches the environment layout created by setup_python_app.sh, which creates project environments inside the repository at .venv by default.
If you keep your environments somewhere else, set WORKON_HOME to the parent directory that contains your project environments. The script will try that location after .venv and then fall back to the pyenv location if the project environment is missing there or cannot be activated:
```shell
export WORKON_HOME="$HOME/.venvs"
```

For example, if your project folder is named my-python-app, ensure_venv.sh will look for `$PWD/.venv`, then `$WORKON_HOME/my-python-app`, and finally `${PYENV_ROOT:-$HOME/.pyenv}/versions/my-python-app`.
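The lookup order can be sketched like this; `find_venv_activate` is a hypothetical helper name, and the real `ensure_venv.sh` also verifies that activation actually succeeds:

```shell
#!/bin/bash
# Hypothetical sketch of the search order described above; prints the first
# activate script found among the candidate locations.
find_venv_activate() {
    local project candidate
    project="$(basename "$PWD")"
    for candidate in \
        "$PWD/.venv" \
        ${WORKON_HOME:+"$WORKON_HOME/$project"} \
        "${PYENV_ROOT:-$HOME/.pyenv}/versions/$project"; do
        if [ -f "$candidate/bin/activate" ]; then
            echo "$candidate/bin/activate"
            return 0
        fi
    done
    return 1 # no environment found in any candidate location
}
```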
To test this repo, run the following command:
```shell
poetry run pytest --cov -n auto
```

To lint this repo, run the following commands:

```shell
poetry run pylint src tests setup_flake8.py setup_pre_commit_config.py setup_pylintrc.py setup_pyproject_toml.py setup_vscode.py
poetry run flake8 src tests setup_flake8.py setup_pre_commit_config.py setup_pylintrc.py setup_pyproject_toml.py setup_vscode.py
```

The following section details recommended brew packages to install. Note: These packages are required for pre-commit to function.
To format shell files, install shfmt with brew:
```shell
brew install shfmt
```

and then run the following command to format all shell files in the repo:

```shell
shfmt -l -w setup_python_app.sh
```

To lint shell files, install shellcheck with brew:

```shell
brew install shellcheck
```

and then run the following command to lint a shell file, changing the file name/path as needed:

```shell
shellcheck setup_python_app.sh
```

To get integrated shellcheck linting in VS Code, install the shellcheck extension.
To install mdl for Markdown linting, first set up a Ruby toolchain with rbenv:

```shell
brew install openssl@3 readline libyaml gmp
brew install rust
brew install rbenv ruby-build
echo 'export PATH="$HOME/.rbenv/bin:$PATH"' >> ~/.zprofile
echo 'eval "$(rbenv init -)"' >> ~/.zprofile
rbenv install 3.2.1
rbenv global 3.2.1
rbenv local 3.2.1
rbenv rehash
gem update --system
gem install mdl
```

To install pyenv, run the following command:
```shell
brew install pyenv
```