
Session recording and replay for interactive event exploration #883

@rx18-eng

Motivation

Right now, if you explore an event in Phoenix (rotate the camera, select tracks, tag particles, filter collections), there is no way to save that exploration. The only workaround is a screen recording, which gives you a flat video: students can't pause and rotate the scene themselves, can't hover over a track the instructor didn't click, can't toggle a layer off, and you're stuck with roughly 100 MB for 5 minutes.

Proposal

A session recorder that listens on the event bus (event-display.ts) and samples camera state, then serializes the whole stream into a small compressed JSON blob (the same deflate + base64 trick we already use for ?state= in the share-link dialog). A matching player reads the blob and re-emits the events on the same bus, so the replay stays live 3D rather than pre-rendered pixels. The viewer can pause at any point, orbit the camera, click new objects, toggle detector layers, then resume.
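To make the shape of this concrete, here is a minimal sketch of the recorder/player pair. All names here (`EventBus`, `BusEvent`, `SessionRecorder`, `SessionPlayer`) are illustrative, not the actual Phoenix API; in Phoenix the recorder would hook the existing emit() points in event-display.ts rather than this toy bus. The serialization uses Node's zlib deflate plus base64, mirroring the ?state= share-link encoding.

```typescript
// Illustrative sketch only; class names are assumptions, not Phoenix APIs.
import { deflateSync, inflateSync } from 'zlib';

interface BusEvent {
  t: number;        // ms offset from recording start
  type: string;     // e.g. 'camera', 'select', 'toggle-layer'
  payload: unknown;
}

type Listener = (type: string, payload: unknown) => void;

class EventBus {
  private listeners: Listener[] = [];
  on(fn: Listener) { this.listeners.push(fn); }
  emit(type: string, payload: unknown) {
    for (const fn of this.listeners) fn(type, payload);
  }
}

class SessionRecorder {
  private events: BusEvent[] = [];
  private t0 = Date.now();
  constructor(bus: EventBus) {
    // Capture every bus event with a timestamp relative to t0.
    bus.on((type, payload) =>
      this.events.push({ t: Date.now() - this.t0, type, payload }));
  }
  // deflate + base64, matching the ?state= share-link trick.
  serialize(): string {
    return deflateSync(JSON.stringify(this.events)).toString('base64');
  }
}

class SessionPlayer {
  constructor(private bus: EventBus) {}
  // Decode the blob and re-emit the stream on the live bus.
  // (Timing omitted here; a real player would schedule each event
  // at its recorded offset t, and support pause/resume.)
  load(blob: string) {
    const events: BusEvent[] =
      JSON.parse(inflateSync(Buffer.from(blob, 'base64')).toString());
    for (const e of events) this.bus.emit(e.type, e.payload);
  }
}
```

Because the player drives the same bus the UI already listens on, the replayed scene is fully live: the viewer can pause, orbit, and interact between re-emitted events.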

Why this matters

  • Masterclass: instructor shares a URL (a few KB) instead of uploading a 100 MB video
  • Bug reports: "Phoenix broke when I did X" becomes a replayable URL for maintainers
  • Homework checkpoints: instructor records halfway, student finishes in the same live scene
  • Integration API (Allow easier integration of phoenix in other applications #826): every emit() point becomes introspectable, which pushes us toward what Sebastien asked about

Non-goals

This is not a screen recorder. Replays capture bus events + camera samples, not pixels. That's the point (tiny files, fully interactive replay), but it means anything drawn outside the bus won't show up in replays until we also emit for it.
