Real-time multi-camera 360-degree obstacle awareness system for cyclists, built on NVIDIA Jetson Orin Nano.
Urban cycling is dangerous because riders have almost no visibility behind or beside them. Cyclops solves this by mounting four cameras on a bicycle and running all processing on-device at the edge. The system detects vehicles and pedestrians in every direction, estimates their distance and closing speed, determines vehicle orientation, and streams the results over WebSocket to a web app on a phone mounted to the handlebars so the rider has continuous spatial awareness of their surroundings.
- 4-camera surround vision
- YOLOv11s object detection with TensorRT FP16 inference
- Kalman-filtered multi-object tracking across all camera feeds
- Monocular depth estimation via per-camera calibrated look-up tables
- Vehicle orientation detection (front/rear-facing vs side-on)
- Radial speed and time-to-collision estimation
- Real-time WebSocket streaming to a 3D visualization frontend
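The radial speed and time-to-collision feature can be illustrated with a minimal sketch: given two successive per-frame distance estimates, a positive closing speed yields a finite time-to-collision. This is a simplified stand-in for the Kalman-filtered C++ implementation, not the production code.

```python
# Illustrative sketch (not the production C++ tracker): derive radial speed
# and time-to-collision from two successive distance estimates.

def radial_speed(prev_dist_m, curr_dist_m, dt_s):
    """Positive speed = object approaching (distance shrinking)."""
    return (prev_dist_m - curr_dist_m) / dt_s

def time_to_collision(dist_m, speed_mps):
    """Seconds until contact at constant closing speed; None if receding."""
    if speed_mps <= 0:
        return None
    return dist_m / speed_mps

v = radial_speed(10.0, 9.5, 0.2)   # object closed 0.5 m in 0.2 s -> 2.5 m/s
ttc = time_to_collision(9.5, v)    # about 3.8 s to react
```

In practice the per-frame distance deltas are noisy, which is why the real pipeline smooths them through the Kalman filter before computing speed.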
Rear node -- Jetson Orin Nano
- 2x USB cameras (left and right)
- 1x CSI camera (rear, wide FOV)
- Handles all processing and streams results to the phone
Front node -- Raspberry Pi 4
- 1x CSI camera (forward-facing)
- Streams H.264 video over WiFi (UDP RTP) to the Jetson for processing
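On the receiving side, a stream like this can be consumed through OpenCV's GStreamer backend (listed in the dependencies). The sketch below builds a plausible receive pipeline; the port number, payload type, and decoder choice are assumptions, not settings taken from the project.

```python
# Sketch of a GStreamer receive pipeline for the Pi's H.264-over-RTP stream.
# Port 5000 and payload=96 are assumed values, not project configuration.

def rtp_h264_receive_pipeline(port=5000):
    """Build a GStreamer pipeline string that depayloads and decodes RTP/H.264,
    delivering BGR frames to an appsink (consumable by cv2.VideoCapture)."""
    return (
        f"udpsrc port={port} caps=\"application/x-rtp,media=video,"
        f"encoding-name=H264,payload=96\" "
        "! rtph264depay ! h264parse ! avdec_h264 "
        "! videoconvert ! video/x-raw,format=BGR ! appsink drop=true"
    )

# Usage (requires OpenCV built with GStreamer support):
# import cv2
# cap = cv2.VideoCapture(rtp_h264_receive_pipeline(5000), cv2.CAP_GSTREAMER)
```

On the Jetson, the software decoder `avdec_h264` could likely be swapped for the hardware-accelerated `nvv4l2decoder` element.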
Capstone/
software/
core/ Main C++ application
include/ Header files
src/ Implementation files
camera_calibration/ Intrinsic calibration YAMLs and depth LUTs
CMakeLists.txt
build.sh Build script
main.cpp Entry point
models/ TensorRT engine files (not tracked in git)
scripts/
service/ systemd user service for auto-start at boot
object_detection.py Python detection scripts (prototyping)
calibration_tools/ Camera intrinsic calibration utilities
tools/ Debugging utilities
config/ Runtime config (auto-detected camera devices)
enclosures/ CAD models for both enclosures
docs/images/ Project images and diagrams
- NVIDIA Jetson Orin Nano with JetPack 6
- CUDA 12
- TensorRT 10+
- OpenCV 4.x (with GStreamer support)
- jetson-utils
- IXWebSocket (optional, for WebSocket streaming)
cd software/core
./build.sh

This cleans any previous build, runs CMake, compiles with all available cores, and places the `cyclops` binary in `software/`.
To enable the debug bounding-box overlay during development, edit build.sh and uncomment the DEBUG_DRAW CMake flag.
The service auto-detects connected cameras at startup and launches Cyclops with the correct device paths.
cd software/scripts/service
./install_service.sh
systemctl --user start cyclops # start now
systemctl --user stop cyclops # stop
systemctl --user restart cyclops # restart
systemctl --user status cyclops # check status
journalctl --user -u cyclops -f    # live logs

cd software
./cyclops --usb-left /dev/video1 --usb-right /dev/video3

| Flag | Description |
|---|---|
| `--usb-left <dev>` | USB left camera device (default: `/dev/video1`) |
| `--usb-right <dev>` | USB right camera device (default: `/dev/video3`) |
| `--ws-port <port>` | WebSocket server port (default: 8765) |
| `--calib` | Enter depth calibration mode |
| `--cam <id>` | Camera ID for calibration (0-3) |
| `--class <name>` | Object class for calibration (pedestrian, car, truck) |
| `--dist <meters>` | Known distance for calibration |
| `--samples <count>` | Number of calibration samples (default: 30) |
Each camera needs a distance look-up table (LUT) that maps bounding box height in pixels to real-world distance in meters. To calibrate:
- Place a known object (car, pedestrian, or truck) at a measured distance from the camera.
- Run calibration mode:
  cd software
  ./cyclops --calib --cam 0 --class car --dist 5.0 --samples 30

- The system collects bounding box samples, fits a curve, and saves the LUT to `core/camera_calibration/`.
- Repeat for each camera and object class.
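The idea behind the LUT can be sketched with the pinhole camera model: for an object class of roughly constant real-world height, distance is inversely proportional to its bounding-box height in pixels (Z ≈ k / h). The simple per-sample averaging below is an illustration under that assumption; the actual curve-fitting code lives in the C++ core.

```python
# Sketch of the inverse-proportional fit behind a depth LUT. Assumes the
# pinhole model Z ≈ k / h for a fixed object class; not the project's fitter.

def fit_inverse_model(samples):
    """samples: list of (bbox_height_px, known_distance_m) pairs.
    Returns k such that distance ≈ k / bbox_height_px, by averaging
    per-sample k = h * d estimates."""
    ks = [h * d for h, d in samples]
    return sum(ks) / len(ks)

def build_lut(k, heights_px):
    """Map each bounding-box height to an estimated distance in meters."""
    return {h: k / h for h in heights_px}

k = fit_inverse_model([(200, 5.0), (100, 10.0), (50, 20.0)])  # k = 1000
lut = build_lut(k, [250, 125])  # {250: 4.0, 125: 8.0}
```

Fitting per camera and per class absorbs both lens differences and the different real-world heights of cars, trucks, and pedestrians.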
When built with WebSocket support, Cyclops streams detection data as JSON arrays on each inference frame:
[
{
"id": "cam0_obj12",
"type": "car",
"speed": 2.35,
"direction": "left",
"distance": 8.40,
"x": -1.20,
"y": 0,
"isFrontFacing": true
}
]

| Field | Description |
|---|---|
| `id` | Unique object identifier (camera + track ID) |
| `type` | Object class (car, truck, pedestrian, bicycle) |
| `speed` | Radial speed in m/s (positive = approaching) |
| `direction` | Camera direction (left, back, right, front) |
| `distance` | Estimated distance in meters |
| `x` | Lateral offset in meters |
| `isFrontFacing` | `true` if vehicle is front/rear-facing, `false` if side-on |
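A frontend consuming this stream can parse each frame and order detections by urgency. The sketch below mirrors the JSON schema above; the threat-ranking heuristic (sort by time-to-collision, receding objects last) is an illustration, not project code.

```python
# Sketch of client-side handling of one Cyclops frame. The ranking heuristic
# (distance / speed as a time-to-collision proxy) is an assumption.
import json

def rank_threats(message):
    """Parse one JSON frame and return detections most-urgent-first:
    approaching objects by time-to-collision, receding/stationary ones last."""
    objs = json.loads(message)
    def urgency(o):
        return o["distance"] / o["speed"] if o["speed"] > 0 else float("inf")
    return sorted(objs, key=urgency)

frame = (
    '[{"id":"cam0_obj12","type":"car","speed":2.35,"direction":"left",'
    '"distance":8.40,"x":-1.20,"y":0,"isFrontFacing":true},'
    '{"id":"cam2_obj3","type":"pedestrian","speed":-0.5,"direction":"back",'
    '"distance":3.0,"x":0.4,"y":0,"isFrontFacing":false}]'
)
ranked = rank_threats(frame)  # approaching car outranks receding pedestrian
```

A live client would receive such frames over the WebSocket connection (for example via a browser `WebSocket` or a Python WebSocket library) and feed them straight into the 3D view.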