Gestural Guidance Control of an Anthropomorphic Robot Based on Artificial Vision Techniques
Abstract
This article presents the development and implementation of an algorithm that evaluates an operator's movements, captured by a structured-light camera, and generates a series of points defining the trajectory that a robotic arm must execute; the procedure is based on artificial vision techniques. The results are displayed through a user interface that graphically replicates the operator's movements as points, indicating whether or not each point belongs to the robotic manipulator's workspace; this is corroborated by simulating the robot traversing the obtained path. The captured movement shows fluctuations in the plotted trajectory, with points carrying a slight margin of error in the segments generated from them. To smooth the motion, intermediate points are inserted between each pair of captured points, improving the trade-off between speed and precision without discarding points produced by the operator's involuntary movements while executing the trajectory.
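The abstract mentions inserting intermediate points between each pair of captured points to smooth the trajectory. As a minimal sketch of that idea, assuming simple linear interpolation between consecutive waypoints (the function name, point format, and parameter count are illustrative, not taken from the article):

```python
import numpy as np

def densify_trajectory(points, n_intermediate=4):
    """Insert n_intermediate evenly spaced points between each
    consecutive pair of captured waypoints (linear interpolation)."""
    points = np.asarray(points, dtype=float)
    out = [points[0]]
    for p, q in zip(points[:-1], points[1:]):
        # Skip t = 0 (already appended); include t = 1 (segment end).
        for t in np.linspace(0.0, 1.0, n_intermediate + 2)[1:]:
            out.append((1.0 - t) * p + t * q)
    return np.array(out)

# Hypothetical captured waypoints (x, y, z), e.g. in metres
raw = [(0.0, 0.0, 0.10), (0.10, 0.05, 0.12), (0.20, 0.0, 0.10)]
dense = densify_trajectory(raw, n_intermediate=4)
print(len(dense))  # 2 segments * 5 appended points + 1 = 11
```

A denser point list lets the manipulator move through shorter segments at each step, which is one way to balance speed against precision as described above; more elaborate smoothing (e.g. spline fitting) could replace the linear step without changing the interface.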



