Shape of things to come: humanoid robots to help people in hazardous environments
Current calculations for robots handling objects and avoiding collisions are based on movements in XYZ coordinates and are sensitive to even minor deformations in the environment, such as a flexible object, which can invalidate the calculations; researchers offer a new way to govern robots’ movements.
Humanoid robots that can assist injured people in hazardous environments are moving closer to reality as a result of research being carried out at Edinburgh University. The three-year project is intended to combine the latest advances in mathematics and engineering to develop robotic systems that can safely handle flexible objects in extreme conditions.
Ellie Zolfagharifard writes that, at the moment, this ability is beyond the reach of motion-synthesis techniques because of the complex nature of the calculations required for handling objects and avoiding collisions. These methods, based on movements in XYZ coordinates, are sensitive to even minor deformations in the environment, such as a flexible object, which would invalidate the calculations.
The Edinburgh team has developed an alternative technique using a topological model that takes into account the position of objects in relation to one another. The system is based on Gauss’s theory of linking numbers, which quantifies how threads, or other closed curves, wind around one another.
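The article does not describe how the Edinburgh system implements this, but the classical Gauss linking integral can be approximated numerically for two discretised closed curves. The sketch below is a minimal illustration of that idea only; the function name, the midpoint Riemann-sum approximation and the sample rings are illustrative choices, not details of the team's software.

import numpy as np

def gauss_linking_number(curve_a, curve_b):
    """Approximate the Gauss linking number of two closed 3D curves.

    Each curve is an (N, 3) array of points; the last point is assumed
    to connect back to the first. The double line integral

        Lk = (1 / 4*pi) * oint oint (r1 - r2) . (dr1 x dr2) / |r1 - r2|**3

    is approximated with a Riemann sum over segment midpoints.
    """
    a = np.asarray(curve_a, dtype=float)
    b = np.asarray(curve_b, dtype=float)

    da = np.roll(a, -1, axis=0) - a            # segment vectors dr1
    db = np.roll(b, -1, axis=0) - b            # segment vectors dr2
    mid_a = a + 0.5 * da                       # segment midpoints
    mid_b = b + 0.5 * db

    total = 0.0
    for i in range(len(a)):
        r = mid_a[i] - mid_b                   # separation vectors to every segment of curve_b
        dist3 = np.linalg.norm(r, axis=1) ** 3
        cross = np.cross(da[i], db)            # dr1 x dr2 for every segment pair
        total += np.sum(np.einsum('ij,ij->i', r, cross) / dist3)

    return total / (4.0 * np.pi)

if __name__ == "__main__":
    # Two unit circles arranged like links in a chain: linking number should be close to +/-1.
    t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    ring1 = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
    ring2 = np.stack([1.0 + np.cos(t), np.zeros_like(t), np.sin(t)], axis=1)
    print(round(gauss_linking_number(ring1, ring2), 3))

Running the script prints a value near plus or minus one for the two interlocked rings, and that value is unchanged if either ring is bent or stretched without being cut, which is the invariance under deformation that the article says coordinate-based methods lack.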
Dr. Sethu Vijayakumar, co-investigator, said: “By considering the topological space using this theory, we are able to capture the invariances in the environment. Topology-based motion synthesis is a fairly radical change in concept for programming robots. Our hope is that it will lead to robots that act more like humans.”
Along with the Honda Research Institute Europe, the team hopes to have a prototype humanoid robot that is able to dress itself by 2013.
Vijayakumar believes that the research could have wider implications in creating robots that could assist casualties out of burning buildings or carry out complex tasks in uncontrolled environments, such as nuclear clean-up operations.
Dr. Taku Komura, principal investigator, said that, while the theory has progressed significantly, the engineering challenges could prove an obstacle.
“One of the biggest difficulties we foresee is in recording movements and feeding back the information to the robots,” he said. “Currently, we’re using a range of methods including a camera-based motion system and inertial techniques that record movements using accelerometers and gyroscopes.”
Vijayakumar said that further research in this field would require advances in sensing technology, particularly visual sensing systems, which he believes are currently not robust enough to relay the required information to the robot.