Visually-Guided Walking Reference Modification for Humanoid Robots
Kaan Can Fidan
Mechatronics, MSc Program, 2012
Assoc. Prof. Kemalettin Erbatur (Thesis Supervisor), Prof. Dr. Aytül Erçil, Assoc. Prof. Gözde Ünal, Asst. Prof. Müjdat Çetin, Prof. Dr. Asif Şabanovic
Date & Time: August 7th, 2012 – 10:30
Place: FENS L063
Humanoid robots are expected to assist humans in the future. As with any mobile robot, autonomy is an invaluable feature for a humanoid interacting with its environment. Autonomy, along with components from artificial intelligence, requires information from sensors. Vision sensors are widely accepted as the richest source of information about the surroundings of a robot. Visual information can be exploited in tasks ranging from object recognition, localization and manipulation to scene interpretation, gesture identification and self-localization.
Any autonomous action of a humanoid pursuing a high-level goal requires the robot to move between arbitrary waypoints, and therefore relies on its self-localization abilities. Because disturbances accumulate along the path, reliable localization can only be achieved by gathering feedback information from the environment.
This thesis proposes a path planning and correction method for bipedal walkers based on visual odometry. A stereo camera pair is used to find distinguishable 3D scene points and track them over time, in order to estimate the 6 degrees-of-freedom position and orientation of the robot. The algorithm is developed and assessed on a benchmarking stereo video sequence taken from a wheeled robot, and then tested via experiments with the humanoid robot SURALP (Sabanci University Robotic ReseArch Laboratory Platform).
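A core step in stereo visual odometry pipelines of the kind described above is recovering the camera's rigid-body motion from 3D scene points matched across two time instants. The sketch below shows one standard way to do this, the SVD-based (Kabsch) least-squares alignment of two matched point sets; it is an illustrative assumption, not necessarily the exact estimator used in the thesis, and the function name is hypothetical.

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) such that Q ~= R @ P + t,
    for matched 3xN point sets, via the SVD (Kabsch) method."""
    # Centroids of each point set
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    # Cross-covariance of the centered point sets
    H = (P - p_mean) @ (Q - q_mean).T
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det(R) = +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Chaining such frame-to-frame transforms yields the 6-DOF pose estimate, although in practice an outlier-rejection scheme (e.g. RANSAC over the matched points) is needed because feature tracks are noisy.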