Imitation of the human upper limb by convolutional neural networks
Abstract
The paper outlines the development of an algorithm that imitates the movements of a human arm and replicates the strokes generated by the user's hand within a working environment. The algorithm discerns the position of either the user's left or right arm, tracking each section (fingers, wrist, elbow, and shoulder) through a detection and tracking system. These movements are then replicated on a virtual arm that simulates a cutting tool, generating strokes as it moves. Convolutional neural networks (CNNs) were employed to detect and classify each arm section, while geometric analysis determined the rotation angle of each joint, driving the virtual robot's motion. The stroke replication program achieved 84.2% accuracy in stroke execution, gauged by polygon closure, the distance between the initial and final drawing points, and generated noise, which remained under 10%, with a 99% probability of drawing a closed polygon. A fast region-based convolutional neural network (Fast R-CNN) detected each arm section with 60.2% accuracy, producing detection boxes with precision ranging from 17% to 59%. Recognition shortcomings were addressed through mathematical estimation of missing points and noise filtering, yielding a 90.4% imitation rate of human upper-limb movement.
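The abstract does not give the exact geometric formulas used, but the core idea of deriving a joint's rotation angle from detected keypoints can be sketched as follows. This is a minimal illustration, assuming 2D pixel coordinates for three adjacent keypoints (e.g., shoulder, elbow, wrist); the function name `joint_angle` is hypothetical and not from the paper.

```python
import numpy as np

def joint_angle(a, b, c):
    """Interior angle (degrees) at joint b, formed by segments b->a and b->c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip guards against floating-point values slightly outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Example: shoulder, elbow, and wrist keypoints in pixel coordinates
shoulder, elbow, wrist = (0, 0), (10, 0), (10, 10)
print(joint_angle(shoulder, elbow, wrist))  # 90.0
```

Applying this at each detected joint (shoulder, elbow, wrist) yields the rotation angles needed to pose a virtual arm; when a keypoint is missed by the detector, its position can be estimated from neighboring frames before the angle is computed, consistent with the paper's use of mathematical estimation of missing points.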
Keywords
Convolutional neural networks; Fast R-CNN; Gesture recognition system; Human motions; Motion capture data; Real-time imitation
Full Text: PDF
DOI: http://doi.org/10.11591/ijai.v14.i1.pp193-203
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).