A rule‑based computer vision application built with Python, OpenCV, and Streamlit to detect human actions (walking, sitting, waving) and emotions (happy, angry, surprised) in real time without datasets or pre‑trained models. The system uses logical heuristics and simulated inputs, making it lightweight, fast, and ideal for low‑resource environments, prototypes, and educational demonstrations.
- Real‑time camera feed with pose & facial landmarks
- Rule‑based detection (no dataset required)
- Interactive Streamlit dashboard with animated UI
- Analytics: session stats, FPS, behavior & emotion distribution, recent activity log
- Exportable session data for further analysis
- Python
- OpenCV
- Streamlit
- Plotly & Pandas
- Camera Handler captures frames from the webcam.
- Behavior Detector applies heuristic rules to identify actions (e.g., walking, sitting).
- Emotion Detector analyzes facial features to infer emotions (e.g., happy, surprised).
- Landmark Overlay draws pose and facial landmarks on the live feed for visualization.
- Analytics Tracker records detections, calculates session stats, and generates charts.
- The Streamlit Dashboard displays the live feed, current detections, confidence scores, and interactive analytics in real time.
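As a rough sketch of the heuristic approach, the Behavior Detector could compare the relative positions of pose landmarks — for instance, treating a raised wrist as waving or hips near knee height as sitting. The landmark keys and thresholds below are illustrative assumptions, not the project's actual rules:

```python
def detect_behavior(landmarks):
    """Classify an action from normalized pose landmarks.

    `landmarks` maps a joint name to an (x, y) point where y runs from
    0.0 (top of frame) to 1.0 (bottom). Keys and thresholds here are
    hypothetical examples of rule-based detection.
    """
    hip_y = landmarks["hip"][1]
    knee_y = landmarks["knee"][1]
    wrist_y = landmarks["wrist"][1]
    shoulder_y = landmarks["shoulder"][1]

    if wrist_y < shoulder_y:            # hand raised above the shoulder
        return "waving"
    if abs(hip_y - knee_y) < 0.15:      # hips roughly at knee height
        return "sitting"
    return "walking"                    # default for an upright pose
```

Because the rules are plain comparisons, no dataset or training step is needed — tuning means adjusting thresholds by hand.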
- Aakriti Mogha — Project Lead, Computer Vision & Heuristic Design
- Anamika Uniyal — UI/UX & Streamlit Dashboard Development
- Aarushi Agrawal — Analytics & Visualization (Plotly, Pandas)
- Anushka Negi — Testing, Documentation & Integration
```bash
git clone https://github.com/CrapeBell/human-behavior-recognition.git
cd human-behavior-recognition
pip install -r requirements.txt
streamlit run app.py
```

- Behavior Distribution: Pie chart of detected actions
- Emotion Distribution: Bar chart of detected emotions
- Recent Activity: Log of last 10 detections
- Session Stats: Duration, FPS, unique behaviors & emotions
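A minimal sketch of how these analytics could be tallied with Pandas (the Plotly pie and bar charts would render the resulting counts). The detection-log structure and column names are assumptions for illustration:

```python
import pandas as pd

# Hypothetical detection log: one row per detection event.
log = pd.DataFrame(
    [
        {"behavior": "walking", "emotion": "happy"},
        {"behavior": "sitting", "emotion": "happy"},
        {"behavior": "walking", "emotion": "surprised"},
        {"behavior": "waving",  "emotion": "angry"},
    ]
)

behavior_dist = log["behavior"].value_counts()  # feeds the pie chart
emotion_dist = log["emotion"].value_counts()    # feeds the bar chart
recent = log.tail(10)                           # last 10 detections
stats = {
    "unique_behaviors": log["behavior"].nunique(),
    "unique_emotions": log["emotion"].nunique(),
}
```

Exporting the session data then reduces to `log.to_csv(...)` on the same DataFrame.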