rithwiksb/gesturerecog_dl

🤖 Gesture Recognition System

An advanced computer vision system for comparing and analyzing hand gestures using MediaPipe and OpenCV.

🎯 Overview

This project provides a sophisticated gesture recognition system that can:

  • Record reference gestures from video files
  • Capture live gestures using your webcam
  • Compare gestures using advanced similarity algorithms
  • Provide detailed scoring and quality assessment

✨ Features

  • 🎥 Real-time Gesture Capture: Live webcam gesture recording
  • 📹 Video Processing: Extract hand keypoints from reference videos
  • 🔍 Advanced Comparison: Cosine similarity-based gesture matching
  • 📊 Intelligent Scoring: Aggressive scoring algorithm for precise matching
  • 🎯 Quality Assessment: Qualitative feedback on gesture similarity
  • 🤖 MediaPipe Integration: Leverages Google's MediaPipe for accurate hand tracking
  • 📈 Frame-by-Frame Analysis: Detailed comparison across multiple video frames

🧠 How It Works

  1. Keypoint Extraction: Uses MediaPipe to extract 21 hand landmarks per hand
  2. Feature Vector: Creates 126-dimensional feature vectors (21 points × 3 coordinates × 2 hands)
  3. Similarity Calculation: Computes cosine similarity between reference and live gestures
  4. Scoring Algorithm: Applies aggressive penalty for medium-quality matches to ensure precision
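Steps 2–3 can be sketched with NumPy. This is an illustrative helper, not the project's actual function, and assumes each frame is already flattened into a 126-dimensional keypoint vector:

```python
import numpy as np

def cosine_similarity(ref_vec, live_vec):
    # Cosine similarity of two keypoint vectors, in [-1, 1]
    denom = np.linalg.norm(ref_vec) * np.linalg.norm(live_vec)
    if denom == 0:
        return 0.0  # e.g. a frame with no detected hands yields a zero vector
    return float(np.dot(ref_vec, live_vec) / denom)

# Each frame yields a 126-dim vector: 2 hands x 21 landmarks x (x, y, z)
ref = np.ones(126)
live = np.ones(126)
print(round(cosine_similarity(ref, live), 3))  # 1.0 for identical gestures
```

A per-frame similarity like this can then be averaged across frames before the scoring step.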

📊 Scoring System

Score Range   Assessment   Description
90-100%       Excellent    Near-perfect gesture match
70-89%        Good         Strong similarity with minor differences
50-69%        Fair         Moderate similarity
30-49%        Poor         Significant differences
0-29%         Failed       Gestures don't match
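The bands above reduce to a simple threshold ladder. The helper below is illustrative, not the project's code:

```python
def assess(score):
    # Map a final percentage score to the qualitative bands in the table
    if score >= 90:
        return "Excellent"
    if score >= 70:
        return "Good"
    if score >= 50:
        return "Fair"
    if score >= 30:
        return "Poor"
    return "Failed"

print(assess(87.34))  # Good
```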

🛠️ Installation

Prerequisites

  • Python 3.8+
  • PyTorch
  • OpenCV
  • MediaPipe
  • NumPy

Setup

  1. Clone the repository

    git clone https://github.com/rithwiksb/gesturerecog_dl.git
    cd gesturerecog_dl
  2. Install dependencies

    pip install opencv-python mediapipe numpy
  3. Prepare your reference video

    • Place your reference gesture video as reference_video.mp4 in the project directory
    • Ensure clear hand visibility and good lighting

🚀 Quick Start

  1. Prepare reference video

    • Place your reference gesture video as reference_video.mp4
    • Ensure clear hand visibility
  2. Run the gesture comparison

    python gesture_recognition.py
  3. Follow the interactive process

    • The system will automatically process your reference video
    • Your webcam will activate for live gesture recording
    • Perform your gesture in front of the camera
    • Press 'q' to stop recording and see the similarity results

📋 Example Output

    Frames collected: 45 (reference), 38 (live)
    Raw Similarity Score: 87.34%
    Final Score: 42.15%
    Assessment: Poor match

🔧 Configuration

Edit gesture_recognition.py to adjust:

  • MIN_FRAMES_REQUIRED: Minimum frames for valid gesture (default: 10)
  • FPS_REDUCTION_FACTOR: Frame processing rate (default: 2)
  • FIXED_KEYPOINTS: Number of keypoints per gesture (default: 126)
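As a rough sketch of how these settings might interact (constant names from the list above; the sampling logic is assumed, not taken from the project):

```python
MIN_FRAMES_REQUIRED = 10     # minimum frames for a valid gesture
FPS_REDUCTION_FACTOR = 2     # process every 2nd frame
FIXED_KEYPOINTS = 126        # 2 hands x 21 landmarks x (x, y, z)

def sample_frames(frames):
    # Downsample the frame list, then enforce the minimum-length check
    kept = frames[::FPS_REDUCTION_FACTOR]
    if len(kept) < MIN_FRAMES_REQUIRED:
        raise ValueError("Not enough frames for a valid gesture")
    return kept

print(len(sample_frames(list(range(40)))))  # 20
```

Raising the reduction factor speeds up processing but leaves fewer frames to compare, so the two settings trade off against each other.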

Scoring Algorithm Customization

The apply_aggressive_scoring() function can be modified to change how similarity scores are transformed:

    def apply_aggressive_scoring(normalized_similarity):
        # Modify these ranges to adjust scoring sensitivity
        if normalized_similarity >= 0.90:
            return normalized_similarity * 100  # Keep high scores
        elif 0.80 <= normalized_similarity < 0.90:
            # Aggressive penalty for medium scores
            return (0.20 + (normalized_similarity - 0.80) * 3) * 100
        # ... continue customization
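Worked by hand: a raw similarity of 0.8734 lands in the 0.80–0.90 branch, so it is compressed sharply:

```python
# The 0.80-0.90 branch from the function above, applied to a raw score of 0.8734
score = (0.20 + (0.8734 - 0.80) * 3) * 100
print(round(score, 2))  # 42.02
```

This is why a strong-looking raw similarity can still land in a low final band: the middle of the range is deliberately penalized to keep only near-perfect matches scoring high.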

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request


🙏 Acknowledgments

  • MediaPipe: For excellent hand tracking capabilities
  • OpenCV: For computer vision processing
  • NumPy: For numerical computations

Built with ❤️ by Rithwik
