Object Manipulation by a Humanoid Robot via Single Camera Pose Estimation
Şefik Emre Eskimez
Mechatronics, M.Sc. Thesis, 2013
Assoc. Prof. Kemalettin Erbatur (Thesis Supervisor), Assoc. Prof. Volkan Patoğlu, Assoc. Prof. Albert Levi, Assoc. Prof. Meriç Özcan, Asst. Prof. Hakan Erdoğan
Date & Time: August 13th, 2013 - 11:00
Place: FENS L035
Keywords: Humanoid robots, object manipulation, object grasping, computer vision systems, navigation
Humanoid robots are designed to be used in daily life as assistive robots for people. They are expected to fill jobs that require physical labor, and they are also considered for the healthcare sector. The ultimate goal in humanoid robotics is to reach a point where robots can truly communicate with people and become part of the labor force.
The typical daily environment of a person contains objects with varying geometric and texture features. Such objects should be easily recognized, located, and manipulated by a robot when needed. These tasks require a large amount of information about the environment.
The field of computer vision is concerned with the extraction and use of visual cues by computer systems. Compared with other sensors, visual data captured by cameras contains most of the information about the environment needed for high-level tasks. Most high-level tasks on humanoid robots require the target object to be segmented in the image and located in the 3D environment. The object should also be kept within the image so that information about it can be retrieved continuously. This can be achieved by gaze control schemes that use visual feedback to drive the neck motors of the robot.
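A minimal sketch of such a gaze control scheme is a proportional visual-servoing update that drives the pan and tilt neck joints so the tracked target stays at the image center. The function name, gain, and sign conventions below are illustrative assumptions, not values taken from the thesis.

```python
# Hypothetical proportional gaze controller: one update step that turns the
# pixel error between the target and the image center into new pan/tilt
# angles. The gain value is an illustrative assumption.

def gaze_step(target_px, image_size, pan_tilt, gain=0.002):
    """Return updated (pan, tilt) angles in radians after one servo step."""
    (u, v), (w, h) = target_px, image_size
    err_u = u - w / 2.0   # horizontal pixel error (target right of center > 0)
    err_v = v - h / 2.0   # vertical pixel error (target below center > 0)
    pan, tilt = pan_tilt
    # Negative feedback: rotate the head toward the target to shrink the error.
    return pan - gain * err_u, tilt - gain * err_v
```

Running this at the camera frame rate keeps the object in view; a target 40 px right of center in a 320x240 image, for example, nudges the pan angle by -0.08 rad with the gain above.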
In this thesis an object manipulation algorithm is proposed for a humanoid robot. A white object with a red square marker is used as the target object. The object is segmented by color information. The corners of the red marker are found and used for the pose estimation algorithm and for gaze control. The pose information is used for navigation toward the object and for the grasping action. The described algorithm is implemented on the humanoid experiment platform SURALP (Sabanci University ReseArch Laboratory Platform).