# AI Virtual Mouse

AI Virtual Mouse is a computer-vision project that lets users control the system cursor with hand gestures captured through a webcam. It uses MediaPipe for hand tracking and PyAutoGUI for cursor control.

## Features
- Real-time hand tracking
- Cursor movement using index finger
- Click action using pinch gesture (thumb and index finger)
- Lightweight and responsive
- No physical mouse required
## Tech Stack

- Python
- OpenCV
- MediaPipe (Tasks API)
- PyAutoGUI
## Installation

Clone the repository:

```bash
git clone https://github.com/your-username/ai-virtual-mouse.git
cd ai-virtual-mouse
```

Install the dependencies:

```bash
pip install opencv-python mediapipe pyautogui
```
## Model Setup

Download the MediaPipe Hand Landmarker model file:

- Filename: `hand_landmarker.task`
- Place it in the root directory of the project
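With the MediaPipe Tasks API, the model file is typically loaded along these lines (a sketch; the exact options used in `main.py` may differ):

```python
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

# Point MediaPipe at the downloaded model file in the project root.
options = vision.HandLandmarkerOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="hand_landmarker.task"),
    running_mode=vision.RunningMode.VIDEO,  # process successive webcam frames
    num_hands=1,
)
landmarker = vision.HandLandmarker.create_from_options(options)
```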
## Usage

Run the script:

```bash
python main.py
```
## Gestures

| Gesture | Action |
|---|---|
| Index finger | Move cursor |
| Thumb + index (pinch) | Left click |
| Press 'Q' | Exit program |
## How It Works

- The webcam captures live video.
- MediaPipe detects 21 hand landmarks.
- The index finger tip is mapped to screen coordinates.
- Distance between thumb tip and index tip is calculated.
- A click is triggered when the distance goes below a threshold.
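The mapping and click logic above can be sketched with plain arithmetic (function and variable names here are illustrative, not necessarily those used in `main.py`):

```python
import math

def to_screen(norm_x, norm_y, screen_w, screen_h):
    """Map MediaPipe's normalized [0, 1] landmark coordinates to pixels."""
    x = min(max(norm_x, 0.0), 1.0) * screen_w
    y = min(max(norm_y, 0.0), 1.0) * screen_h
    return int(x), int(y)

def pinch_distance(thumb_tip, index_tip):
    """Euclidean distance between two (x, y) landmarks in normalized units."""
    return math.hypot(thumb_tip[0] - index_tip[0], thumb_tip[1] - index_tip[1])

# Example: index fingertip at the center of the frame on a 1920x1080 screen.
print(to_screen(0.5, 0.5, 1920, 1080))                    # (960, 540)
print(pinch_distance((0.40, 0.50), (0.43, 0.54)) < 0.06)  # True: close enough to click
```

In the actual script, the pixel coordinates would be passed to `pyautogui.moveTo`, and a click fired when the distance drops below the threshold.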
## Project Structure

```
ai-virtual-mouse/
│
├── main.py
├── hand_landmarker.task
└── README.md
```
## Customization

You can adjust click sensitivity by modifying the pinch-distance threshold in `main.py`:

```python
if distance < 0.06:
```

- A lower value requires a tighter pinch, making accidental clicks less likely
- A higher value makes clicking easier to trigger
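One common refinement, not necessarily present in this project, is to require the pinch to be released before another click can fire, so a held pinch does not produce a stream of clicks. A minimal sketch with two thresholds (hysteresis):

```python
class PinchClicker:
    """Fire a click only on the transition from 'open' to 'pinched'."""

    def __init__(self, press_below=0.06, release_above=0.08):
        # Two thresholds so a distance hovering near 0.06
        # does not rapidly toggle between states.
        self.press_below = press_below
        self.release_above = release_above
        self.pinched = False

    def update(self, distance):
        """Return True exactly once per pinch gesture."""
        if not self.pinched and distance < self.press_below:
            self.pinched = True
            return True  # new pinch detected: click now
        if self.pinched and distance > self.release_above:
            self.pinched = False  # pinch released: re-arm for the next click
        return False

clicker = PinchClicker()
print([clicker.update(d) for d in [0.10, 0.05, 0.05, 0.09, 0.04]])
# [False, True, False, False, True]
```

Calling `update` once per frame yields a single click per pinch, even if the fingers stay together for many frames.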
## Tips

- Ensure proper lighting conditions
- Keep your hand clearly visible in the frame
- Avoid cluttered backgrounds for better detection
- On some systems, screen control permissions may be required
## Future Improvements

- Right-click gesture
- Scrolling support
- Multi-hand tracking
- Gesture customization
- Graphical user interface
## License

This project is open source and available under the MIT License.