Motivation

Right now, if you explore an event in Phoenix (rotate the camera, select tracks, tag particles, filter collections), there's no way to save that exploration. The only workaround is a screen recording, which gives you a flat video: students can't pause and rotate the scene themselves, can't hover over a track the instructor didn't click, can't toggle a layer off, and a 5-minute recording weighs in at roughly 100 MB.
Proposal
A session recorder that listens on the event bus (event-display.ts) and samples camera state, then serializes the whole stream into a small compressed JSON blob (same deflate + base64 trick we already use for ?state= in the share-link dialog). A matching player reads the blob and re-emits the events on the same bus, so the replay stays live 3D, not pre-rendered pixels. The viewer can pause at any point, orbit the camera, click new objects, toggle detector layers, then resume.
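As a rough sketch of the recorder half, assuming a bus API along the lines of `emit(type, payload)` (the real bus lives in `event-display.ts`; every name below is hypothetical). It timestamps bus events and camera samples relative to the start of recording, then serializes with the same deflate + base64 approach as the `?state=` share link (shown here with Node's `zlib` for illustration; in the browser this would be pako or `CompressionStream`):

```typescript
import { deflateRawSync, inflateRawSync } from 'zlib';

// Hypothetical shapes -- the real event types come from event-display.ts.
interface BusEvent {
  t: number;       // ms since recording started
  type: string;    // e.g. 'selectTrack', 'toggleLayer'
  payload: unknown;
}
interface CameraSample {
  t: number;
  position: [number, number, number];
  target: [number, number, number];
}

class SessionRecorder {
  private events: BusEvent[] = [];
  private cameras: CameraSample[] = [];
  private t0 = Date.now();

  // Called from a subscription on the event bus.
  onBusEvent(type: string, payload: unknown): void {
    this.events.push({ t: Date.now() - this.t0, type, payload });
  }

  // Called on a fixed interval (or on camera-change callbacks).
  sampleCamera(
    position: [number, number, number],
    target: [number, number, number],
  ): void {
    this.cameras.push({ t: Date.now() - this.t0, position, target });
  }

  // Same deflate + base64 trick as the ?state= share link.
  serialize(): string {
    const json = JSON.stringify({ v: 1, events: this.events, cameras: this.cameras });
    return deflateRawSync(Buffer.from(json)).toString('base64');
  }

  static deserialize(blob: string): { events: BusEvent[]; cameras: CameraSample[] } {
    return JSON.parse(inflateRawSync(Buffer.from(blob, 'base64')).toString());
  }
}
```

The `v: 1` version field is there so the blob format can evolve without breaking old replay URLs.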
Why this matters
- Masterclass: instructor shares a URL (a few KB) instead of uploading a 100 MB video
- Bug reports: "Phoenix broke when I did X" becomes a replayable URL for maintainers
- Homework checkpoints: instructor records halfway, student finishes in the same live scene
Non-goals

This is not a screen recorder. Replays capture bus events + camera samples, not pixels. That's the point (tiny files, fully interactive replay), but it means anything drawn outside the bus won't show up in replays until we also emit bus events for it.
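The player half can be equally small. A minimal sketch, assuming the recorded event shape above and an `emit` callback standing in for the real bus (all names hypothetical): playback is just re-emitting every event whose timestamp has been reached, so pausing is simply not advancing the clock, and the viewer stays free to orbit and click because the scene is still live.

```typescript
// Stand-in for the real event bus in event-display.ts.
type Emit = (type: string, payload: unknown) => void;

interface RecordedEvent { t: number; type: string; payload: unknown }

class SessionPlayer {
  private cursor = 0;

  constructor(private events: RecordedEvent[], private emit: Emit) {}

  // Advance playback to time `t` (ms), re-emitting every event in between.
  // Driven from requestAnimationFrame during playback; not calling this
  // is a pause, and the live scene stays fully interactive either way.
  advanceTo(t: number): void {
    while (this.cursor < this.events.length && this.events[this.cursor].t <= t) {
      const e = this.events[this.cursor++];
      this.emit(e.type, e.payload);
    }
  }
}
```

Keeping `advanceTo` deterministic (no internal timers) also makes replays easy to unit-test and to scrub backwards by resetting and re-playing from zero.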