Hand Motion Controlled Robotic Arm – Mechanical Project
In today's world, work in most sectors is done by robots or robotic arms with different numbers of degrees of freedom (DOF) as the task requires. The idea is to change the perception of remote controls for actuating a manually operated robotic arm. This paper presents a way to eliminate buttons and joysticks and replace them with a more intuitive technique: controlling the complete robotic arm through the operator's hand gestures. The proposed electronic system recognizes a particular hand gesture performed in front of a webcam and transmits the corresponding signals wirelessly through an RF module. Depending on the received signals, the robotic arm, driven by an AVR microcontroller at the receiver section, performs the respective motions.
Nowadays, most human-computer interaction (HCI) is based on mechanical devices such as keyboards, mice, joysticks, or gamepads. In recent years there has been growing interest in a class of methods based on computational vision, due to their ability to recognize human gestures in a natural way. Such methods take as input the images acquired from a camera or from a stereo pair of cameras. The main goal of such algorithms is to measure the hand configuration at each time instant. To facilitate this process, many gesture recognition applications resort to uniquely coloured gloves or markers on the hands or fingers. In addition, a controlled background makes it possible to localize the hand efficiently, even in real time. These two conditions impose restrictions on the user and on the interface setup. We have specifically avoided solutions that require coloured gloves, markers, or a controlled background because of the initial requirements of our application: it must work for different people, without any attachment on them, and against unpredictable backgrounds.
Our application uses images from a low-cost web camera placed in front of the work area, where the recognized gestures act as the input for a particular robotic arm motion. The webcam is connected to a computer or laptop, which serves as the human-machine interface; the computer runs the MATLAB 7 tool on Windows XP. The webcam feeds frames to the computer, and the MATLAB tool recognizes the performed gesture by comparing it against stored gesture values and produces the corresponding output. This output is transmitted wirelessly through an RF module. The receiver section accepts the transmitted signals and passes them to an AVR microcontroller, which checks the received values. The output of the microcontroller drives the motors mounted on the robotic arm, producing the respective motion of the arm.
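The recognition step described above — comparing a captured frame against stored gesture values and emitting a code for the RF transmitter — can be sketched as follows. The original system was implemented in MATLAB 7; this Python version only illustrates the idea, and the gesture names and command byte values are assumptions, since the real AVR firmware defines its own protocol.

```python
# Illustrative sketch of template-based gesture matching; the gesture
# names and command codes below are hypothetical, not the project's own.
import numpy as np

# Assumed command bytes sent over the RF module for each gesture.
COMMANDS = {"open": 0x01, "close": 0x02, "rotate_left": 0x03}

def classify_gesture(frame, templates):
    """Return the name of the stored template closest to `frame`.

    frame     : 2-D grayscale image array
    templates : dict mapping gesture name -> template array (same shape)
    """
    best_name, best_score = None, float("inf")
    for name, tmpl in templates.items():
        # Sum of squared differences: lower means a closer match.
        score = np.sum((frame.astype(float) - tmpl.astype(float)) ** 2)
        if score < best_score:
            best_name, best_score = name, score
    return best_name

def command_byte(gesture_name):
    """Map a recognized gesture to the byte transmitted to the receiver."""
    return COMMANDS[gesture_name]
```

In practice the stored templates would be preprocessed (segmented, resized, thresholded) images of each gesture, and the matching would run on every frame delivered by the webcam.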
In the robotics field, several research efforts have been directed towards recognizing human hand gestures.
The following are a few popular systems:
A. Vision-based Gesture Recognition –
This recognition system was developed in the field of service robotics, where the researchers designed a robot to perform a cleaning task. They designed a gesture-based interface to control a mobile robot equipped with a manipulator. The interface uses a camera to track a person and recognize the different gestures involving arm motion. A fast, adaptive tracking algorithm enables the robot to track and follow a person reliably through an office environment with changing lighting conditions. Two gesture recognition methods, a template-based approach and a neural-network-based approach, were compared and combined with the Viterbi algorithm for the recognition of gestures defined through arm motion. The result is an interactive clean-up task, where the user guides the robot to the specific locations that need to be cleaned and instructs it to pick up available trash.
B. Motion Capture Sensor Recognition –
This recognition technique made it possible to implement an accelerometer-based system to communicate with an industrial robotic arm wirelessly. In this particular project, the robotic arm is powered by an ARM7-based LPC1768 core. A MEMS three-axis accelerometer sensor captures the gestures of the human arm and produces three analog output voltages, one per axis. In addition, two flex sensors are used to control the gripper movement.
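The core of such a system is converting the accelerometer's three analog voltages (digitized by an ADC) into a coarse arm command based on hand tilt. The sketch below is a hypothetical illustration, not the cited project's firmware; the ADC scaling constants, threshold, and command names are all assumptions.

```python
# Hypothetical mapping from raw 3-axis accelerometer ADC counts to a
# coarse arm command. Assumed scaling: 10-bit ADC with 0 g at
# mid-scale (512 counts) and ~102 counts per g of acceleration.
import math

ZERO_G, COUNTS_PER_G = 512, 102

def counts_to_g(counts):
    """Convert one raw ADC reading to acceleration in g."""
    return (counts - ZERO_G) / COUNTS_PER_G

def tilt_command(ax, ay, az, threshold_deg=20.0):
    """Map raw ADC readings on x, y, z to a coarse arm command."""
    gx, gy, gz = (counts_to_g(c) for c in (ax, ay, az))
    # Static tilt angles from the gravity vector.
    pitch = math.degrees(math.atan2(gx, math.hypot(gy, gz)))
    roll = math.degrees(math.atan2(gy, math.hypot(gx, gz)))
    if pitch > threshold_deg:
        return "arm_up"
    if pitch < -threshold_deg:
        return "arm_down"
    if roll > threshold_deg:
        return "arm_left"
    if roll < -threshold_deg:
        return "arm_right"
    return "hold"
```

A level hand (gravity entirely on the z axis) yields `hold`; tilting the hand forward or sideways past the threshold selects the corresponding motion.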
C. Accelerometer-based Gesture Recognition –
This gesture recognition methodology has become increasingly popular in a very short span of time. The low-to-moderate cost and relatively small size of accelerometers are the two factors that make them an effective tool for detecting and recognizing different human body gestures. Several studies have been conducted on the recognition of gestures from acceleration data using Artificial Neural Networks (ANNs).
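To make the idea of learning gestures from acceleration data concrete, the sketch below trains a single perceptron (the simplest ANN building block) to separate two hypothetical gesture classes from hand-picked features of one acceleration axis. The features, classes, and training scheme are illustrative assumptions, not taken from any of the cited studies.

```python
# Minimal sketch: a single perceptron separating two gesture classes
# (labelled +1 and -1) from simple features of an acceleration trace.

def features(samples):
    """Mean and peak absolute value of one axis of acceleration data."""
    mean = sum(samples) / len(samples)
    peak = max(abs(s) for s in samples)
    return [mean, peak, 1.0]  # trailing 1.0 acts as the bias input

def train_perceptron(data, labels, epochs=50, lr=0.1):
    """Classic perceptron training: update weights only on mistakes."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for samples, y in zip(data, labels):
            f = features(samples)
            pred = 1 if sum(wi * fi for wi, fi in zip(w, f)) > 0 else -1
            if pred != y:
                w = [wi + lr * y * fi for wi, fi in zip(w, f)]
    return w

def predict(w, samples):
    """Classify a new acceleration trace with the trained weights."""
    f = features(samples)
    return 1 if sum(wi * fi for wi, fi in zip(w, f)) > 0 else -1
```

Real systems in the cited studies use multi-layer networks on full 3-axis data, but the workflow is the same: extract features from the acceleration stream, train on labelled gestures, then classify new traces.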
One of the main objectives of our work, which has been implemented successfully, was a low-cost computer vision system that can be executed on a common PC equipped with a low-power USB web camera. We experimented with around 30 hand gesture images and achieved a high average precision, with a best classification rate of 97% obtained under different lighting conditions. The drawback of this method is that the hand must be properly placed with respect to the webcam so that the entire hand region is captured; if the hand is not placed properly, the gesture is not recognized correctly. Gestures in this method involve only one hand, which limits the vocabulary compared with gestures made using both hands.