MENDIBURU, F. J.; http://lattes.cnpq.br/8884676140983123; MENDIBURU, Fernando Javier.
Abstract:
This dissertation addresses the integration of a sensing system and a path planner for an object-handling robot. The objective of the system is intelligent part handling, generating collision-free trajectories whenever possible. The workspace is defined by a computer vision system that captures a point cloud of the scene and models the obstacles to be provided to the path-planning algorithm. The manipulator is modeled a priori, and it is determined whether its kinematics can generate a collision-free trajectory. The cameras are modeled and calibrated, removing lens distortions and determining their intrinsic parameters. The relation between camera measurement units and physical-world units is a critical component of any attempt to reconstruct a three-dimensional scene. Calibration between the Kinect reference frame and the manipulator base is performed to keep the measurements coherent and to obtain good precision in the handling commands. Integrating the vision loops with the trajectory planner is essential to define the task correctly; this requires adapting the planning algorithm, transforming its inputs into a form the planner can read and its outputs into commands for the manipulator controller. The Pegasus 880-RA2-1-B robotic manipulator was used in the proposed solution, together with an external RGB-D sensor and a monocular CMOS IR-Syntek camera installed on the manipulator's end effector. Positive results were observed both in the image processing and in the camera calibration. Part-transportation tasks defined by the trajectory planner were executed successfully, with good results even with several obstacles between the object position and the final target. Execution times were obtained for each movement and for each block of the control system. The vision-system errors were calculated and were shown not to affect the grasp position at short distances. The control system responded well, finding obstacle-free trajectories that were executed satisfactorily. Detection of the object, the robot, and the obstacles became more precise the closer they were to the RGB-D device. By using infrared vision, the system is robust to changes in illumination that could otherwise degrade its performance.