Distant Frames is a smart video frame extraction tool designed to capture distinct visual moments from video files. Instead of simply saving every Nth frame, it measures the visual similarity between consecutive candidate frames and only saves those that are sufficiently different.
- Smart Deduplication: Avoids saving redundant frames where the scene hasn't changed.
- Histogram Correlation: Uses HSV color space histogram comparison for robust similarity detection.
- Configurable Threshold: Fine-tune the sensitivity of frame dropping to suit your specific video content.
- Efficient Processing: Seeks directly to target timestamps (`CAP_PROP_POS_FRAMES`) for faster processing than frame-by-frame reading.
- Custom Start Time: Begin extraction from any point in the video using a timestamp in seconds.
- Open Eyes Filter: Optionally keep only frames where at least one face with both eyes open is detected, using local Haar cascade classifiers.
- Python: 3.12 or higher
- Dependencies: `opencv-python`
Install the latest stable release using pip:
```bash
pip install distant-frames
```

or

```bash
uv add distant-frames
```

Clone the repo and run directly without installing the package:
- Clone the repository:

  ```bash
  git clone git@github.com:yubraaj11/distant-frames.git
  cd distant-frames
  ```

- Install dependencies:

  ```bash
  uv sync --frozen
  ```

- Run via `main.py`:

  ```bash
  uv run main.py path/to/video.mp4 -o output_dir -t 0.75
  ```
If installed as a package:

```bash
distant-frames path/to/video.mp4 -o path/to/output -t 0.75
```

Or from a source checkout:

```bash
uv run main.py path/to/video.mp4 -o path/to/output -t 0.75
```

| Argument | Description | Default |
|---|---|---|
| `video_path` | Path to the input video file (required). | N/A |
| `--output`, `-o` | Directory to save the extracted frames. | `extracted_frames` |
| `--threshold`, `-t` | Similarity score threshold (0.0 to 1.0). Frames with a score higher than this value are discarded. | `0.65` |
| `--start`, `-s` | Timestamp in seconds to begin extraction from. | `0.0` |
| `--open-eyes` | When set, only saves frames where at least one face with both eyes open is detected. | Off |
Extract frames with default settings:

```bash
distant-frames my_vacation.mp4
```

Save to a custom folder with a stricter similarity check:

```bash
distant-frames my_vacation.mp4 -o best_shots -t 0.95
```

Start extraction from a specific timestamp (e.g. 1 minute 30 seconds in):

```bash
distant-frames interview.mp4 -s 90
```

Only keep frames where a person's eyes are open:

```bash
distant-frames interview.mp4 --open-eyes -o key_frames
```

Combine all options:

```bash
distant-frames interview.mp4 -s 90 -t 0.80 --open-eyes -o key_frames
```

- Sampling: The script checks one frame every second (based on the video's FPS), starting from `--start` if provided.
- Comparison: It compares the current candidate frame against the last successfully saved frame.
- Algorithm: It converts frames to HSV color space and calculates Normalized Histogram Correlation.
- Decision:
  - If similarity < threshold: the frame is a candidate for saving.
  - If similarity >= threshold: SKIP (the scene is too similar).
- Open Eyes Filter (optional): If `--open-eyes` is set, a candidate frame is only saved if a face with two open eyes is detected using Haar cascade classifiers.
You can generate a test video to verify the functionality:
```bash
uv run generate_test_video.py
uv run main.py test_video.mp4
```

This will create a `test_video.mp4` with known scene changes and then extract frames from it, demonstrating the deduplication logic.