## Goal
Add one short note to `examples/README.md` so example readers know the compare result is also available through the local API while the BrowserTrace UI is running.
Context: the examples guide already shows `browsertrace compare <failed_run_id> <success_run_id>` and `--json`. It should also mention that local scripts or dashboards can call:

```
GET /api/compare/<failed_run_id>/<success_run_id>
```
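For context, a minimal sketch of how a local script might consume this endpoint. The port, base address, and returned field names are assumptions for illustration, not confirmed BrowserTrace defaults:

```python
# Hypothetical sketch of calling the local compare API.
# The base address below is an assumption -- use whatever host/port
# the local BrowserTrace UI actually serves on.
import json
from urllib.request import urlopen
from urllib.error import URLError

BASE = "http://127.0.0.1:8000"  # assumed local UI address


def compare_url(failed_run_id: str, success_run_id: str) -> str:
    """Build the local compare endpoint URL for two run IDs."""
    return f"{BASE}/api/compare/{failed_run_id}/{success_run_id}"


def first_divergence(failed_run_id: str, success_run_id: str):
    """Fetch the first-divergence JSON payload.

    Returns None when the local UI is not running, so automation
    checks can skip gracefully instead of crashing.
    """
    try:
        with urlopen(compare_url(failed_run_id, success_run_id), timeout=5) as resp:
            return json.load(resp)
    except URLError:
        return None  # the BrowserTrace UI must be running locally
```

The payload is the same JSON that `browsertrace compare --json` prints, so a dashboard can reuse whatever parsing it already does for the CLI output.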
## Scope
Please keep this small:
- Update `examples/README.md` near the existing compare output / `--json` explanation.
- Add one sentence saying the local compare API is useful for scripts, dashboards, or automation checks that need the same first-divergence JSON payload while the local UI is running.
- Update `tests/test_metadata.py` so the examples README metadata test covers the new sentence.
## Out of scope
- Do not change server behavior.
- Do not add new examples or generated output files.
- Do not edit unrelated docs.
## Verification
Please run:

```shell
uv run --python 3.11 --extra dev pytest tests/test_metadata.py -q
git diff --check
```
Comment here before starting so I can mark the issue as claimed.