FlowWatch is a tiny ergonomic layer on top of watchfiles
that makes it easy to build file-driven workflows using simple decorators and a pretty
Rich + Typer powered CLI.
Instead of wiring `watchfiles.watch()` manually in every project, you declare:
- what folder(s) you want to watch
- which patterns you care about (e.g. `*.mxf`, `*.json`)
- which function should run for a given event (created / modified / deleted)
FlowWatch takes care of:
- subscribing to all roots in a single watcher loop
- debouncing and recursive watching
- dispatching events to handlers with a small thread pool
- optional processing of existing files on startup
- nicely formatted logs and a CLI overview of registered handlers
FlowWatch is published as a normal Python package.
Using uv:

```bash
uv add flowwatch
```

Using pip:

```bash
pip install flowwatch
```

Handlers receive a `FileEvent` object describing what happened:

- `event.change` – a `watchfiles.Change` (`added`, `modified`, `deleted`)
- `event.path` – a `pathlib.Path` pointing to the file
- `event.root` – the root folder you registered
- `event.pattern` – the pattern that matched (if any)
It also has convenience properties:
- `event.is_created`
- `event.is_modified`
- `event.is_deleted`
You register handlers using decorators from flowwatch:
```python
@on_created(root, pattern="*.txt", process_existing=True)
@on_modified(root, pattern="*.json")
@on_deleted(root, pattern="*.bak")
@on_any(root, pattern="*.*")
```
Behind the scenes these attach to a global FlowWatchApp instance, which you can run
using flowwatch.run() or via the CLI.
The decorator + runner pattern is the simplest:

```python
from pathlib import Path

from flowwatch import FileEvent, on_created, run

WATCH_DIR = Path("inbox")
WATCH_DIR.mkdir(exist_ok=True)


@on_created(str(WATCH_DIR), pattern="*.txt", process_existing=True)
def handle_new_text(event: FileEvent) -> None:
    print(f"New text file: {event.path}")
    print("Was it created?", event.is_created)


if __name__ == "__main__":
    run()  # blocks until Ctrl+C
```

Run it:

```bash
python my_script.py
```

Then drop `*.txt` files into `inbox/` and watch the handler fire.
FlowWatch also ships with a small CLI, exposed as the flowwatch command.
You typically:

- Create a watchers module that only defines handlers.
- Call `flowwatch run your_module.path`.
For example, `myproject/watchers.py`:

```python
from pathlib import Path

from flowwatch import FileEvent, on_created

BASE = Path("/media/incoming")


@on_created(str(BASE), pattern="*.mxf", process_existing=True)
def handle_mxf(event: FileEvent) -> None:
    print(f"[handler] New MXF at {event.path}")
```

Then run:

```bash
flowwatch run myproject.watchers
```

The CLI will:

- import `myproject.watchers`
- discover all handlers registered via decorators
- show a Rich table with handlers, roots, events, patterns, and priorities
- start the watcher loop and stream pretty logs to your terminal
You can customize the run:

```bash
flowwatch run myproject.watchers \
    --debounce 8 \
    --max-workers 8 \
    --no-recursive \
    --log-level DEBUG
```

A common pattern is to run FlowWatch as its own worker container:
```yaml
services:
  backend:
    build: ./backend
    volumes:
      - media:/media

  flowwatch:
    build: ./backend
    command: flowwatch run myproject.watchers
    depends_on:
      - backend
    volumes:
      - media:/media
    restart: unless-stopped

volumes:
  media:
```

Here `myproject/watchers.py` inside the image contains your handlers and watches
paths under `/media` (a volume shared with the backend).
If you need more control than the global decorators/CLI, you can instantiate your
own FlowWatchApp:
```python
from pathlib import Path

from watchfiles import Change

from flowwatch import FileEvent, FlowWatchApp

app = FlowWatchApp(name="my-custom-app", debounce=0.7, max_workers=8)


def handle_any(event: FileEvent) -> None:
    print(event.change, event.path)


app.add_handler(
    handle_any,
    root=Path("data"),
    events=[Change.added, Change.modified, Change.deleted],
    pattern="*.*",
    process_existing=True,
)

app.run()
```

This is the same engine used under the hood by the decorators and CLI.
FlowWatch is a good fit when you want:
- simple file pipelines like:
  - "When a new MXF appears here, run this ingester."
  - "When a JSON config changes, reload some state."
  - "When a sidecar file is deleted, clean up something else."
- readable, declarative code:
  - your intent is obvious from the decorators
- a pretty terminal UX when running workers in Docker, k8s, or bare metal
It is not trying to be a full-blown workflow engine. Think of it as a thin,
Pythonic glue layer over watchfiles.
Potential future additions:

- async mode using `watchfiles.awatch`
- optional structured JSON logs for production
- pattern-based routing helpers (e.g. per-extension multiplexing)
- more first-class Docker/Kubernetes examples
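As a taste of what a pattern-based routing helper could look like, here is a per-extension multiplexer sketch. Note that `route_by_extension` is a hypothetical name for illustration, not a FlowWatch API:

```python
# Hypothetical per-extension router: one handler that dispatches on suffix.
from pathlib import Path
from typing import Callable


def route_by_extension(routes: dict[str, Callable[[Path], None]]) -> Callable[[Path], None]:
    """Return a single handler that dispatches on the file's suffix."""
    def dispatch(path: Path) -> None:
        handler = routes.get(path.suffix.lower())
        if handler is not None:
            handler(path)
    return dispatch
```

A router like this would let one registered handler fan out to per-format logic without a chain of `if event.path.suffix == ...` branches.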
If you end up using FlowWatch in your own projects, feel free to open issues or PRs with real-world improvements.