AI Virtual Control
The Project
This project is a hand gesture recognition and control system. It uses a webcam to capture video of the user's hands and a machine-learning hand tracker to identify the gestures being made. Once a gesture is recognized, the program performs an action on the computer, such as moving the mouse cursor, clicking, or scrolling. Two main libraries do the heavy lifting: OpenCV (cv2) for capturing and processing video frames, and MediaPipe (mp) for detecting hand landmarks.

Here's a breakdown of the code:

Import libraries: The code starts by importing the necessary libraries: OpenCV, MediaPipe, PyAutoGUI, and others.

Define gestures: The Gest class enumerates the hand gestures that can be recognized. Each gesture is assigned a numerical value for easy identification.

Define hand labels: The HLabel class represents the handedness of the detected hand (left or right).

Hand recognition: The HandRecog class recognizes gestures from the hand landmarks detected by MediaPipe. It takes the hand label (left or right) as input during initialization, and it has methods to update the hand result from the latest webcam frame, calculate finger states from landmark positions, and identify the current gesture from those finger states.

Control actions: The Controller class performs actions on the computer based on the detected gesture. It can move the mouse cursor, perform clicks, scroll, and adjust system volume and brightness through external libraries.

Main class, GestureController: The entry point of the program. It initializes the webcam, captures video frames, processes them with MediaPipe to detect hands and gestures, and calls the appropriate Controller function for the recognized gesture.

The supported actions:

Cursor Control: Move the cursor on the screen by tracking the hand (usually the wrist).
Clicking: Single click, double click, and right-click, each triggered by a specific gesture.
Dragging: Hold a click and drag the cursor across the screen.
Scrolling: Scroll vertically or horizontally by pinching two fingers together.
Volume Control: Adjust system volume by pinching fingers together and moving them up or down.
Brightness Control: Adjust screen brightness by pinching fingers together and moving them up or down.

Overall, this project demonstrates how computer vision and machine learning can be combined into a system that controls the computer with hand gestures.
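To make the finger-state idea concrete, here is a minimal sketch of how a class like Gest and the finger-state calculation in HandRecog could look. The MediaPipe landmark indices (tip/PIP pairs) are the library's real ones, but the bit values in Gest and the "tip above joint" rule are illustrative assumptions, not necessarily the project's exact constants.

```python
from enum import IntEnum

class Gest(IntEnum):
    # Assumed bit layout: one bit per finger, so combinations of open
    # fingers add up (e.g. an open palm sets all five bits).
    FIST = 0
    PINKY = 1
    RING = 2
    MID = 4
    INDEX = 8
    THUMB = 16
    PALM = 31

# MediaPipe hand-landmark indices: (fingertip, PIP joint) for the
# index, middle, ring, and pinky fingers.
FINGER_TIP_PIP = [(8, 6), (12, 10), (16, 14), (20, 18)]

def finger_state(landmarks):
    """Encode open fingers as bits: index=8, middle=4, ring=2, pinky=1.

    `landmarks` is a list of 21 (x, y) pairs in normalized image
    coordinates (y grows downward), as MediaPipe returns them. For an
    upright hand, a finger counts as 'open' when its tip is higher
    (smaller y) than its PIP joint.
    """
    bits = [8, 4, 2, 1]
    state = 0
    for (tip, pip), bit in zip(FINGER_TIP_PIP, bits):
        if landmarks[tip][1] < landmarks[pip][1]:
            state |= bit
    return state
```

With this encoding, a frame where only the index finger is extended yields Gest.INDEX, and a closed hand yields Gest.FIST; HandRecog can then smooth these per-frame states over a few frames before declaring a gesture.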
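The cursor-control step can be sketched as well. This assumes the Controller maps MediaPipe's normalized (0..1) hand coordinates to screen pixels and moves the cursor with PyAutoGUI; the clamping helper and function names here are illustrative, not the project's exact code.

```python
def to_screen(nx, ny, screen_w, screen_h):
    """Map normalized (0..1) hand coordinates to screen pixels,
    clamping so the cursor never leaves the screen."""
    x = min(max(nx, 0.0), 1.0) * screen_w
    y = min(max(ny, 0.0), 1.0) * screen_h
    return int(x), int(y)

def move_cursor(nx, ny):
    # PyAutoGUI is imported lazily so the pure mapping above stays
    # usable on a machine without a display.
    import pyautogui
    w, h = pyautogui.size()
    pyautogui.moveTo(*to_screen(nx, ny, w, h))
```

Clicks and scrolling follow the same pattern (pyautogui.click(), pyautogui.scroll()), while volume and brightness go through the external libraries the description mentions rather than PyAutoGUI.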
About the team
Team members