Add local video publisher & subscriber examples #830
base: main
Conversation
Pull request overview
This PR adds two comprehensive examples demonstrating local video capture and streaming with the LiveKit Rust SDK: a publisher that captures frames from a local camera and publishes them to a LiveKit room, and a subscriber that connects to a room and renders received video in a window with GPU acceleration.
Key changes:
- Publisher example with camera capture, format conversion (YUYV/MJPEG/RGB24 to I420), and H.264/H.265 codec support
- Subscriber example with GPU-accelerated YUV rendering using WGPU/egui and simulcast layer controls
- Enhanced yuv-sys build script to detect and enable libjpeg/libjpeg-turbo for MJPEG fast-path decoding
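For context, a minimal sketch of the kind of pkg-config probe such a build script performs (not the PR's actual code; the probed package names, the source path, and the HAVE_JPEG define are assumptions):

```rust
// build.rs sketch: detect a system libjpeg/libjpeg-turbo via pkg-config and,
// if found, enable libyuv's MJPEG fast path by defining HAVE_JPEG for the
// C/C++ build and linking the library. Names here are illustrative.
fn main() {
    let jpeg = pkg_config::Config::new()
        .probe("libjpeg")
        .or_else(|_| pkg_config::Config::new().probe("libturbojpeg"));

    let mut build = cc::Build::new();
    build.cpp(true).file("libyuv/source/convert.cc"); // placeholder source list

    if let Ok(lib) = jpeg {
        // Gate the MJPEG decode path on the detected library.
        build.define("HAVE_JPEG", None);
        for path in lib.include_paths {
            build.include(path);
        }
        // pkg-config already emits the cargo link-lib/link-search directives.
    }

    build.compile("yuv");
}
```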
Reviewed changes
Copilot reviewed 8 out of 10 changed files in this pull request and generated 9 comments.
Show a summary per file
| File | Description |
|---|---|
| yuv-sys/build.rs | Adds pkg-config detection for system libjpeg to enable MJPEG fast-path conversion in libyuv |
| yuv-sys/Cargo.toml | Adds pkg-config dependency for build-time library detection |
| examples/local_video/Cargo.toml | New example package configuration with required dependencies for camera capture, video processing, and GPU rendering |
| examples/local_video/README.md | Documentation for both publisher and subscriber examples with usage instructions |
| examples/local_video/src/publisher.rs | Complete publisher implementation with camera capture, format detection/conversion, and LiveKit video track publishing |
| examples/local_video/src/subscriber.rs | Complete subscriber implementation with video stream reception, GPU rendering, and simulcast controls |
| examples/local_video/src/yuv_shader.wgsl | WGSL shader for YUV to RGB conversion and rendering in the subscriber |
| Cargo.toml | Adds local_video example to workspace members |
| Cargo.lock | Lock file updates for new dependencies |
| .gitignore | Adds .cursor IDE directory to ignore list |
examples/local_video/README.md (Outdated)
```
    --api-secret YOUR_SECRET

# subscribe to a specific participant's video only
cargo run -p local_video --bin subscriber -- \
   --room-name demo \
   --identity viewer-1 \
   --participant alice
```
Copilot (AI), Jan 10, 2026
Inconsistent indentation: these lines use 3 spaces while lines 40-45 above use 4 spaces (with the exception of line 45). The indentation should be consistent throughout the command blocks.
Suggested change:

```
    --api-secret YOUR_SECRET

# subscribe to a specific participant's video only
cargo run -p local_video --bin subscriber -- \
    --room-name demo \
    --identity viewer-1 \
    --participant alice
```
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
This reverts commit 2a6aae3.
```rust
env_logger::init();
let args = Args::parse();

let ctrl_c_received = Arc::new(AtomicBool::new(false));
```
suggestion: This can be handled more idiomatically by defining a run function that accepts the CLI arguments and using tokio::select!:
```rust
tokio::select! {
    _ = run(args) => {},
    _ = signal::ctrl_c() => {}
}
```

```rust
env_logger::init();
let args = Args::parse();

let ctrl_c_received = Arc::new(AtomicBool::new(false));
```
Same here.
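To make the suggestion concrete, here is a minimal sketch of that shape, assuming `anyhow`, `clap`, `env_logger`, and `tokio` as in the example; the `run` body and the `Args` fields shown are illustrative, not taken from the PR:

```rust
use clap::Parser;
use tokio::signal;

#[derive(Parser)]
struct Args {
    /// LiveKit server URL (illustrative flag)
    #[arg(long, default_value = "ws://localhost:7880")]
    url: String,
}

// All of the publish/subscribe work lives here; cancelling the future on
// Ctrl-C tears everything down through normal Drop impls.
async fn run(args: Args) -> anyhow::Result<()> {
    log::info!("connecting to {}", args.url);
    // ... connect, publish or subscribe, loop over frames ...
    Ok(())
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    env_logger::init();
    let args = Args::parse();

    tokio::select! {
        res = run(args) => res?,   // normal completion or error
        _ = signal::ctrl_c() => {} // user interrupt cancels run()
    }
    Ok(())
}
```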
```rust
while let Some(evt) = events.recv().await {
    debug!("Room event: {:?}", evt);
    match evt {
        RoomEvent::TrackSubscribed { track, publication, participant } => {
```
nitpick: Handling each event type in a separate function would improve readability.
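A rough sketch of that refactor, using the `RoomEvent` shape from the snippet above (the handler names, the prelude import path, and the `TrackUnsubscribed` fields are assumptions):

```rust
use livekit::prelude::*;
use tokio::sync::mpsc::UnboundedReceiver;

// Keep the event loop thin by forwarding each variant to a named handler.
async fn event_loop(mut events: UnboundedReceiver<RoomEvent>) {
    while let Some(evt) = events.recv().await {
        log::debug!("Room event: {:?}", evt);
        match evt {
            RoomEvent::TrackSubscribed { track, publication, participant } => {
                on_track_subscribed(track, publication, participant);
            }
            RoomEvent::TrackUnsubscribed { publication, .. } => {
                on_track_unsubscribed(publication);
            }
            _ => {}
        }
    }
}

fn on_track_subscribed(
    track: RemoteTrack,
    publication: RemoteTrackPublication,
    participant: RemoteParticipant,
) {
    // e.g. spawn the frame sink / renderer for this track here
    log::info!("track subscribed");
    let _ = (track, publication, participant);
}

fn on_track_unsubscribed(publication: RemoteTrackPublication) {
    log::info!("track unsubscribed");
    let _ = publication;
}
```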
Note: Other AI code review bot(s) detected. CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

📝 Walkthrough

This pull request introduces a complete new example project for local video capture, publishing, and subscription via LiveKit. It adds three binaries (list_devices, publisher, subscriber) with comprehensive camera enumeration, video publishing capabilities with codec negotiation, and a GUI-based subscriber with YUV rendering and simulcast quality control.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Publisher as Publisher Binary
    participant Camera as Camera (Nokhwa)
    participant LiveKit as LiveKit Room
    participant Network as Network
    User->>Publisher: Run with camera index & room info
    Publisher->>Camera: Open camera, negotiate format
    Camera-->>Publisher: Return camera handle + format (YUYV/MJPEG)
    Publisher->>LiveKit: Create access token & connect
    LiveKit-->>Publisher: Connection established
    Publisher->>LiveKit: Create LocalVideoTrack (H.265/H.264)
    loop Frame Capture & Publish Loop
        Publisher->>Camera: Read frame
        Camera-->>Publisher: Frame data
        Publisher->>Publisher: Convert to I420 (via yuv_sys or MJPEG decode)
        Publisher->>Publisher: Maintain RTP timestamp, pace at target FPS
        Publisher->>LiveKit: Push I420 frame to track
        LiveKit->>Network: Transmit encoded video
    end
    User->>Publisher: Ctrl-C signal
    Publisher->>LiveKit: Unpublish & disconnect
```
```mermaid
sequenceDiagram
    participant User
    participant Subscriber as Subscriber Binary
    participant LiveKit as LiveKit Room
    participant Network as Network
    participant GPU as GPU (WGPU)
    participant UI as egui/eframe
    User->>Subscriber: Run with room credentials
    Subscriber->>LiveKit: Create token & connect with auto-subscribe
    LiveKit-->>Subscriber: Connected to room
    loop Room Event Handling
        LiveKit->>Subscriber: TrackSubscribed event
        Subscriber->>LiveKit: Subscribe to video track
        LiveKit-->>Subscriber: NativeVideoStream created
        Subscriber->>Subscriber: Spawn frame sink thread
        par Frame Reception
            LiveKit->>Subscriber: Transmit video frames
            Network->>Subscriber: Receive encoded frames
        and GUI Rendering
            UI->>Subscriber: Request frame for display
            Subscriber->>Subscriber: Lock shared YUV buffer
            Subscriber->>GPU: Upload Y/U/V textures (WGPU)
            GPU->>GPU: Execute yuv_shader (YUV→RGB conversion)
            GPU-->>UI: Render textured quad
            UI->>User: Display video + HUD + simulcast controls
        end
        Subscriber->>Subscriber: Update FPS stats, HUD overlay
        alt User adjusts simulcast quality
            User->>UI: Select quality level
            UI->>Subscriber: Update SimulcastState
            Subscriber->>LiveKit: Publish desired quality
        end
    end
    User->>Subscriber: Ctrl-C signal
    Subscriber->>LiveKit: Disconnect & cleanup
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@examples/local_video/src/publisher.rs`:
- Line 1: Validate pace_fps before computing Duration::from_secs_f64 to avoid a
divide-by-zero panic: in the code paths that call Duration::from_secs_f64(1.0 /
pace_fps) (look for uses in publisher.rs around the publisher setup and the loop
where pace_fps is applied), check that pace_fps > 0 and return an Err or print a
clear error and exit when it is zero or negative; update both occurrences (the
initial interval computation and the repeated use at lines ~184-186) to perform
this guard and handle the invalid value gracefully instead of letting
Duration::from_secs_f64 panic.
♻️ Duplicate comments (3)
examples/local_video/README.md (1)
55-60: Inconsistent indentation in command examples. Lines 56-59 use 4-space indentation while the earlier block (lines 48-53) uses 3-space indentation. Consider aligning for consistency throughout the document.
examples/local_video/src/publisher.rs (2)
165-175: Verify the nokhwa API call name (set_camera_request). The method name appears misspelled as `set_camera_requset`, which would fail to compile if the API is actually `set_camera_request`. Please confirm against nokhwa's current API and fix both call sites if needed.

🔧 Possible fix (if the API is `set_camera_request`)

```diff
-    if let Err(_) = camera
-        .set_camera_requset(RequestedFormat::new::<RgbFormat>(RequestedFormatType::Exact(wanted)))
+    if let Err(_) = camera
+        .set_camera_request(RequestedFormat::new::<RgbFormat>(RequestedFormatType::Exact(wanted)))
     {
@@
-        let _ = camera
-            .set_camera_requset(RequestedFormat::new::<RgbFormat>(RequestedFormatType::Exact(alt)));
+        let _ = camera
+            .set_camera_request(RequestedFormat::new::<RgbFormat>(RequestedFormatType::Exact(alt)));
     }
```
283-317: Handle libyuv conversion failures instead of ignoring return codes. `rs_YUY2ToI420`/`rs_RGB24ToI420` return non-zero on failure; ignoring them can publish corrupted frames. Consider logging and skipping the frame, and apply the same handling to the other conversion paths.

🔍 Example handling for the YUYV path

```diff
-    unsafe {
-        // returns 0 on success
-        let _ = yuv_sys::rs_YUY2ToI420(
-            src_bytes.as_ptr(),
-            src_stride,
-            data_y.as_mut_ptr(),
-            stride_y as i32,
-            data_u.as_mut_ptr(),
-            stride_u as i32,
-            data_v.as_mut_ptr(),
-            stride_v as i32,
-            width as i32,
-            height as i32,
-        );
-    }
+    let ret = unsafe {
+        // returns 0 on success
+        yuv_sys::rs_YUY2ToI420(
+            src_bytes.as_ptr(),
+            src_stride,
+            data_y.as_mut_ptr(),
+            stride_y as i32,
+            data_u.as_mut_ptr(),
+            stride_u as i32,
+            data_v.as_mut_ptr(),
+            stride_v as i32,
+            width as i32,
+            height as i32,
+        )
+    };
+    if ret != 0 {
+        log::warn!("YUYV->I420 conversion failed: {}", ret);
+        continue;
+    }
```

Also applies to: 362-374
🧹 Nitpick comments (4)
examples/local_video/README.md (1)
15-30: Add language specifier to fenced code blocks. The code blocks are missing language specifiers, which affects syntax highlighting and linting compliance.
📝 Suggested fix
````diff
-```
+```bash
 cargo run -p local_video --bin publisher -- --list-cameras
````

Apply similar changes to the code blocks starting at lines 33 and 43.
examples/local_video/src/subscriber.rs (2)
437-440: Minor redundancy in repaint requests. `request_repaint()` on line 439 triggers an immediate repaint, while `request_repaint_after(16ms)` on line 520 schedules a delayed one. For video playback at ~60fps, the delayed repaint alone should suffice. The immediate request causes extra repaints but isn't harmful. Consider removing the immediate `request_repaint()` call if the 16ms delayed repaint provides sufficient smoothness.
992-998: Per-frame allocations for texture packing. The `packed`, `packed_u`, and `packed_v` vectors are allocated on every dirty frame. For 30-60fps video, this creates significant allocation pressure. Consider caching these buffers in `YuvGpuState` and reusing them across frames.

♻️ Optimization suggestion

Add cached buffers to `YuvGpuState`:

```rust
struct YuvGpuState {
    // ... existing fields ...
    packed_y: Vec<u8>,
    packed_u: Vec<u8>,
    packed_v: Vec<u8>,
}
```

Then resize and reuse instead of allocating new vectors each frame.
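A rough sketch of that reuse pattern (struct and method names here are illustrative, not the PR's):

```rust
// Reuse persistent buffers instead of allocating fresh Vecs on every dirty frame.
struct PackedPlanes {
    packed_y: Vec<u8>,
    packed_u: Vec<u8>,
    packed_v: Vec<u8>,
}

impl PackedPlanes {
    /// Copy `rows` rows of `row_bytes` each out of a strided source plane,
    /// reusing the destination's existing capacity.
    fn pack_into(dst: &mut Vec<u8>, src: &[u8], src_stride: usize, row_bytes: usize, rows: usize) {
        dst.clear();
        dst.reserve(row_bytes * rows);
        for r in 0..rows {
            let start = r * src_stride;
            dst.extend_from_slice(&src[start..start + row_bytes]);
        }
    }

    /// Repack all three I420 planes; chroma planes are half-width/half-height, rounded up.
    fn pack_frame(&mut self, y: &[u8], u: &[u8], v: &[u8], sy: usize, su: usize, sv: usize, w: usize, h: usize) {
        let (cw, ch) = ((w + 1) / 2, (h + 1) / 2);
        Self::pack_into(&mut self.packed_y, y, sy, w, h);
        Self::pack_into(&mut self.packed_u, u, su, cw, ch);
        Self::pack_into(&mut self.packed_v, v, sv, cw, ch);
    }
}
```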
examples/local_video/src/list_devices.rs (1)
49-59: Consider graceful handling of per-format failures. If `compatible_list_by_resolution` fails for one fourcc (line 52), the entire function returns an error. For better robustness, consider skipping formats that fail to enumerate rather than aborting completely.

♻️ Suggested improvement

```diff
 for fourcc in fourccs {
-    let mut res_map = camera.compatible_list_by_resolution(fourcc)?;
+    let Ok(mut res_map) = camera.compatible_list_by_resolution(fourcc) else {
+        continue; // Skip formats that fail to enumerate
+    };
     let mut res_sorted = BTreeMap::new();
```
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`Cargo.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (10)
- .gitignore
- Cargo.toml
- examples/local_video/Cargo.toml
- examples/local_video/README.md
- examples/local_video/src/list_devices.rs
- examples/local_video/src/publisher.rs
- examples/local_video/src/subscriber.rs
- examples/local_video/src/yuv_shader.wgsl
- yuv-sys/Cargo.toml
- yuv-sys/build.rs
🧰 Additional context used
🧬 Code graph analysis (1)
examples/local_video/src/publisher.rs (2)
yuv-sys/build.rs (1)
main (108-174)

livekit/src/room/options.rs (1)

as_str (30-38)
🪛 markdownlint-cli2 (0.18.1)
examples/local_video/README.md
15-15: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
33-33: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
43-43: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🔇 Additional comments (8)
Cargo.toml (1)
22-22: LGTM! The new workspace member is correctly added and maintains the alphabetical ordering of the examples list.
.gitignore (1)
6-7: LGTM! Sensible additions: `.env` for local environment variables (used by the new example for LiveKit credentials) and `.cursor` for Cursor IDE artifacts.

examples/local_video/src/yuv_shader.wgsl (1)
1-61: LGTM! The shader correctly implements:
- Fullscreen triangle rendering via vertex index
- BT.601 limited-range YUV to RGB conversion with proper coefficients
- Padding-aware UV coordinate scaling to avoid sampling padded columns
- Vertical flip for correct orientation
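For reference, the BT.601 limited-range conversion the shader performs, restated as a plain Rust function (a sketch of the math only, not code from the PR):

```rust
// BT.601 limited-range ("video range") YUV -> RGB for one pixel.
// Y is expanded from [16, 235] and chroma from [16, 240] before applying
// the standard BT.601 coefficients; results are clamped to [0, 1].
fn yuv_to_rgb_bt601_limited(y: u8, u: u8, v: u8) -> (f32, f32, f32) {
    let yf = (y as f32 - 16.0) / 219.0;
    let uf = (u as f32 - 128.0) / 224.0;
    let vf = (v as f32 - 128.0) / 224.0;

    let r = yf + 1.402 * vf;
    let g = yf - 0.344136 * uf - 0.714136 * vf;
    let b = yf + 1.772 * uf;

    (r.clamp(0.0, 1.0), g.clamp(0.0, 1.0), b.clamp(0.0, 1.0))
}
```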
examples/local_video/src/subscriber.rs (3)
23-27: LGTM! Simple and effective shutdown polling with appropriate memory ordering.

656-659: LGTM! Properly signals shutdown to background threads after the UI window closes, ensuring clean termination of the frame sink loop.
284-298: The review comment is incorrect. The I420 buffer slices returned by `data()` are guaranteed to match the calculated sizes because they use identical formulas.

Looking at the implementation of `I420Buffer::data()` in `libwebrtc/src/native/video_frame.rs` (lines 304-314), the slice lengths are constructed using:

- Y plane: `stride_y() * height()`
- U plane: `stride_u() * chroma_height`
- V plane: `stride_v() * chroma_height`

The code in subscriber.rs calculates the exact same values:

- `y_size = sy * h` (where `sy` is from `strides()` and `h` is from `height()`)
- `u_size = su * ch` (where `su` is from `strides()` and `ch = (h + 1) / 2`)
- `v_size = sv * ch`

Since both use the same strides, height, and chroma height calculations from the same underlying I420 buffer object, the sizes are guaranteed to match. The `copy_from_slice` calls cannot panic.

Likely an incorrect or invalid review comment.
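The size arithmetic both sides rely on, restated as a standalone helper (illustrative, not code from the PR):

```rust
/// Per-plane byte sizes of an I420 frame, given the plane strides and the
/// frame height. The chroma planes are half-height, rounded up.
fn i420_plane_sizes(stride_y: usize, stride_u: usize, stride_v: usize, height: usize) -> (usize, usize, usize) {
    let chroma_height = (height + 1) / 2;
    (
        stride_y * height,        // Y plane
        stride_u * chroma_height, // U plane
        stride_v * chroma_height, // V plane
    )
}
```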
examples/local_video/src/list_devices.rs (2)
68-86: LGTM! Clean implementation that properly aggregates formats, deduplicates FPS values, and maintains sorted output via BTreeMap.

13-31: LGTM! The main function provides clear output with proper handling of the no-cameras case and graceful error reporting for capability enumeration failures.
```diff
@@ -0,0 +1,447 @@
+use anyhow::Result;
```
Guard against --fps 0 causing a panic.
Duration::from_secs_f64(1.0 / pace_fps) will panic on zero; validate before computing the interval.
✅ Suggested fix

```diff
-use anyhow::Result;
+use anyhow::{bail, Result};
@@
-    // Pace publishing at the requested FPS (not the camera-reported FPS) to hit desired cadence
-    let pace_fps = args.fps as f64;
+    if args.fps == 0 {
+        bail!("--fps must be > 0");
+    }
+    // Pace publishing at the requested FPS (not the camera-reported FPS) to hit desired cadence
+    let pace_fps = args.fps as f64;
```

Also applies to: 184-186
🤖 Prompt for AI Agents
In `@examples/local_video/src/publisher.rs` at line 1, Validate pace_fps before
computing Duration::from_secs_f64 to avoid a divide-by-zero panic: in the code
paths that call Duration::from_secs_f64(1.0 / pace_fps) (look for uses in
publisher.rs around the publisher setup and the loop where pace_fps is applied),
check that pace_fps > 0 and return an Err or print a clear error and exit when
it is zero or negative; update both occurrences (the initial interval
computation and the repeated use at lines ~184-186) to perform this guard and
handle the invalid value gracefully instead of letting Duration::from_secs_f64
panic.
Adding `examples/local_video` to demonstrate publishing & subscribing to video tracks using the Rust SDK.

Summary by CodeRabbit
Release Notes
New Features
Documentation
Chores