Hand Gesture Game Controller

A computer vision-based system that enables touchless game control through real-time hand gesture recognition using webcam input. This project leverages MediaPipe's hand tracking capabilities to translate finger positions into keyboard commands, allowing users to control games like Temple Run without physical keyboard interaction.

📋 Project Overview

Domain: Computer Vision | Human-Computer Interaction
Problem Statement: Traditional gaming requires physical interaction with input devices. This project explores contactless control mechanisms using hand gestures detected via webcam, making gaming more accessible and interactive.

✨ Features

  • Real-time Hand Detection: Tracks hand landmarks with high accuracy using MediaPipe
  • Gesture-to-Action Mapping: Converts specific finger combinations into game controls
  • Multi-Gesture Support: Recognizes four distinct gestures for comprehensive game control
  • Low Latency Processing: Ensures a smooth gameplay experience with minimal input delay
  • Visual Feedback: Displays hand landmarks overlay on live video feed
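Low-latency processing cuts both ways: a gesture held across many consecutive frames would otherwise fire the same key dozens of times per second. A minimal cooldown helper (a sketch of one way to rate-limit key events; the class and names are illustrative, not taken from `main.py`) could look like:

```python
import time

class GestureDebouncer:
    """Suppress repeated key events while the same gesture is held."""

    def __init__(self, cooldown_s=0.3, clock=time.monotonic):
        self.cooldown_s = cooldown_s
        self.clock = clock
        self.last_gesture = None
        self.last_fired = float("-inf")

    def should_fire(self, gesture):
        """Return True if this gesture should trigger a key press now."""
        now = self.clock()
        if gesture != self.last_gesture or now - self.last_fired >= self.cooldown_s:
            self.last_gesture = gesture
            self.last_fired = now
            return True
        return False
```

Injecting the clock makes the helper testable without real delays; in the main loop, `should_fire` would simply gate each PyAutoGUI key press.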

🎮 Gesture Controls

| Gesture | Finger Position | Action | Key Output |
|---------|-----------------|--------|------------|
| 👍 Thumb Only | Thumb up | Jump | ⬆️ Up Arrow |
| 🤙 Pinky Only | Pinky up | Slide | ⬇️ Down Arrow |
| ☝️ Index Only | Index finger up | Move Right | ➡️ Right Arrow |
| ✌️ Index + Middle | Two fingers up | Move Left | ⬅️ Left Arrow |
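The gesture table above amounts to a lookup on per-finger up/down flags. A minimal sketch of that mapping (illustrative; the function and key names are assumptions, not taken from `main.py`):

```python
# Map (thumb, index, middle, ring, pinky) up/down flags to an action and
# arrow key, following the gesture table above.
GESTURE_MAP = {
    (True,  False, False, False, False): ("Jump", "up"),
    (False, False, False, False, True):  ("Slide", "down"),
    (False, True,  False, False, False): ("Move Right", "right"),
    (False, True,  True,  False, False): ("Move Left", "left"),
}

def classify(fingers_up):
    """Return (action, key) for a recognized gesture, or None otherwise."""
    return GESTURE_MAP.get(tuple(fingers_up))
```

For example, `classify([True, False, False, False, False])` returns `("Jump", "up")`, while an unrecognized combination such as an open palm returns `None`, so no key is pressed.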

🛠️ Tech Stack

Programming Language: Python 3.7+

Libraries & Frameworks:

  • OpenCV (cv2) - Webcam capture and image processing
  • MediaPipe - Hand landmark detection and tracking
  • PyAutoGUI - Keyboard input simulation

🏗️ Project Architecture

Input (Webcam) → Hand Detection (MediaPipe) → Finger Position Analysis 
→ Gesture Classification → Keyboard Command (PyAutoGUI) → Game Control

Workflow:

  1. Video Capture: Continuous frame acquisition from webcam using OpenCV
  2. Preprocessing: Convert BGR to RGB color space for MediaPipe compatibility
  3. Hand Tracking: Detect 21 hand landmarks per frame
  4. Feature Extraction: Calculate finger tip positions relative to joints
  5. Gesture Recognition: Compare finger states against predefined patterns
  6. Action Execution: Trigger corresponding keyboard events via PyAutoGUI
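Step 4 above (feature extraction) can be sketched as a pure function over MediaPipe's 21-point hand model, in which the fingertips are landmarks 4, 8, 12, 16, and 20. This is a simplified heuristic, not the repository's exact logic: a finger counts as "up" when its tip is above its PIP joint in image coordinates, and the thumb is compared horizontally, assuming a right hand facing the camera.

```python
# Fingertip and PIP-joint landmark indices in MediaPipe's 21-point hand model.
TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky
PIPS = [6, 10, 14, 18]

def fingers_up(landmarks):
    """landmarks: list of 21 (x, y) tuples in image coordinates (y grows downward).

    Returns [thumb, index, middle, ring, pinky] up/down flags.
    A finger is 'up' when its tip lies above its PIP joint; the thumb is
    compared horizontally against its IP joint (index 3), assuming a
    right hand facing the camera.
    """
    states = [landmarks[4][0] < landmarks[3][0]]          # thumb tip left of IP joint
    for tip, pip in zip(TIPS, PIPS):
        states.append(landmarks[tip][1] < landmarks[pip][1])  # tip above joint
    return states
```

The resulting flag list feeds directly into step 5, where it is matched against the predefined gesture patterns.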

📂 Repository Structure

HandGesture-GameController/
│
├── main.py              # Core application with gesture detection logic
├── requirements.txt     # Python dependencies with pinned versions
├── README.md            # Project documentation
└── .gitignore           # Git exclusion patterns

File Descriptions:

  • main.py: Implements webcam capture, hand landmark detection, finger state analysis, and keyboard control logic
  • requirements.txt: Specifies exact library versions for reproducible environment setup

🚀 Installation & Setup

Prerequisites

  • Python 3.7 or higher
  • Webcam/Camera access
  • Windows/macOS/Linux operating system

Steps

  1. Clone the repository:
     git clone https://github.com/Vignan2659/HandGesture-GameController.git
     cd HandGesture-GameController

  2. Create a virtual environment (recommended):
     python -m venv venv

     # Windows
     venv\Scripts\activate

     # macOS/Linux
     source venv/bin/activate

  3. Install dependencies:
     pip install -r requirements.txt

▶️ How to Run

  1. Launch the application:
     python main.py

  2. Position your hand:

    • Ensure adequate lighting
    • Keep your hand within the camera frame
    • Maintain a 30-60 cm distance from the camera

  3. Start the game (e.g., Temple Run) and perform gestures

  4. Exit: Press the q key to terminate the application

📊 Sample Output

Console Output:

Jump
Right
Left
Slide

Visual Output:

  • Live video feed with hand skeleton overlay
  • Green landmarks and connections indicating detected hand structure

👥 Contributors

This project was developed as an academic exploration of computer vision applications in human-computer interaction.

  • Contributor 1
  • Contributor 2

🔧 Technical Notes

  • Hand Detection Confidence: MediaPipe default threshold (0.5)
  • Maximum Hands Tracked: 1 (optimized for single-player control)
  • Finger Detection Method: Tip landmark comparison with joint positions
  • Supported Games: Any application using arrow key inputs
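The settings listed above map onto MediaPipe's `Hands` constructor. A configuration fragment matching them (assuming the legacy `mediapipe.solutions` API, which this project's MediaPipe dependency provides) might look like:

```python
import mediapipe as mp

# Configuration matching the Technical Notes: one tracked hand,
# MediaPipe's default 0.5 confidence threshold.
hands = mp.solutions.hands.Hands(
    static_image_mode=False,       # treat input as a video stream
    max_num_hands=1,               # single-player control
    min_detection_confidence=0.5,  # default detection threshold
    min_tracking_confidence=0.5,
)
```

With `static_image_mode=False`, MediaPipe tracks the hand across frames instead of re-running detection on every frame, which helps keep per-frame latency low.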

⚠️ Limitations

  • Requires consistent lighting conditions
  • Single-hand tracking only
  • Limited to four predefined gestures
  • Performance depends on hardware capabilities

🚀 Future Enhancements

  • Add custom gesture configuration
  • Implement multi-hand support
  • Introduce gesture recording and playback
  • Optimize for lower-end hardware
  • Add GUI for gesture calibration

📄 License

This project is open source and available under the MIT License.


Note: Ensure you have the necessary permissions to send keyboard inputs on your operating system (some operating systems require accessibility permissions for PyAutoGUI).
