Mass High Tech
How to make machines feel a little more human
By Sue Mellen
Thanks to a real-world research partnership between the robotics industry and academia, man and machine may soon be on better terms in the workplace.
Although imbued with artificial intelligence, early robotic devices were, in fact, fairly unpleasant company. They were simply programmed to perform predetermined tasks in predesignated ways, with no ability to make decisions and change course along the way. And because of an inefficient, gear-based system of power transfer, the machines were necessarily big and heavy. Sized to perform tasks that frequently required them to lift heavy objects, they could roll over objects or push through walls to reach their objectives.
"Traditionally, robots were programmed with a great deal of knowledge, but it was all a priori intelligence, just a plan for completing a certain task," explains Rod Grupen, associate professor and director of the Laboratory of Perceptual Robotics at the University of Massachusetts at Amherst.
"The role of the robot was just to track the plan and reach its objective. The problem was that the world doesn't always cooperate - things change - so the plan failed in many ways," Grupen says. "It was impossible to take into account, in advance, all possible variables. This was especially true if a robot had to touch something."
Learning from children
In Grupen's approach, robots are still programmed with plans for specific tasks, but the plans are only guidelines, which can change based on signals from the world at large. To inform that work, the researchers study sensory and motor development in young children.
"When infants move their arms to reach out and touch things, they are responding in a certain way to visual stimuli. We're trying to build a robotic system with some commitment to those reflex arcs (responses to stimuli)," Grupen says.
Grupen hopes to create systems that can predict how the world should feel based on visual stimuli and, conversely, how things should look based on how they feel. The first step, of course, is for the robot to understand the implications of its actions in the world, in much the same way as children begin to understand their impact on the world.
In the lab, Grupen and his students are working with a robotic arm from Cambridge-based Barrett Technology. Modeled on tele-operator systems, which relay information from a robotic device to a human operator, the Barrett arm produces a discernible current when it comes in contact with an object. That current is detectable, and therefore usable, because the lightweight arm is cable-driven rather than dependent on gears; the cable structure provides a more efficient transfer of power and generates less vibration than traditional devices, so the current signal is cleaner.
Advanced artificial intelligence software, being developed both at Barrett and in academic research labs, creates learning algorithms that allow a robotic mechanism to make choices based on how hard its motor is working.
Robotics proves a great course of study, says Robert Howe, a professor of engineering at Harvard University, because it requires an integration of several disciplines, including mechanical, electrical and computer engineering. That integration teaches students who are often accustomed to focusing on one narrow field of study to think a little differently, he says.
"Students learn that they have to think outside the box to make the connections between the disciplines that make things work in the real world. As computers make their appearance in all kinds of devices out in the world, robotics can be very helpful in making those necessary connections," Howe says.
Like Grupen, Howe and his colleagues and students are working to solve the problems that until now have kept robots and humans segregated. For example, they are looking at ways to teach a device to change its level of rigidity based on information from the environment.
One application might be lifting humans out of hazardous situations like fires, toxic spills and battle lines. Human-lifting robots might also prove useful in hospitals and nursing homes, where nurses and aides sometimes suffer back injuries when trying to move patients. Lifting a human, Howe points out, is a task that requires a high degree of intelligence.
"When you think about it, people are really hard to pick up," Howe notes. "They have no handles and a lot of delicate parts."
Sue Mellen is a freelance writer based in Tyngsboro, Mass., and a frequent contributor to Mass High Tech.
Publish Date: January 11-17, 1998