Friday 2 December 2022

The quest for a robot with a sense of touch


Robots that can touch are the next step in robotics technology. That ability would let them perform all sorts of tasks that are impossible now.

Robots have become increasingly advanced over the past decade and are found everywhere from warehouses to operating rooms. But if robots are to do more jobs, and be trusted with more complex tasks, researchers say these machines need to be able to interact with real-world environments the way people do: with touch.

In commerce, a sense of touch can give robots the ability to handle delicate items and stock shelves. In the home, touch would let robots know if they bump into someone or grab fingers too hard. For robotic prosthetics, having touch-sensitive electronic skin could greatly improve an artificial limb’s performance.

“Touch is crucial to be able to react to an unpredictable world,” says Perla Maiolino, an associate professor of engineering at the University of Oxford in England. “We take the sense for granted, but it’s the interface between us and the world.”

Giving the sense of touch to robots, however, isn’t easy. Consider how complicated touch is for biological bodies. It picks up on temperature, force, texture, weight and shape. It relies on different types of receptors in multiple layers of the skin. From there, the brain and body work together to produce the perception of touch.

“Touch is very important but also very complex,” says Prof. Maiolino.

Feeling by sight

Cameras, which are inexpensive and plentiful, have been one of the most explored ways to give robots touch, by providing visuals of many of the details that touch can convey.

For one thing, interpreting pixels from cameras is easier than interpreting data generated by touch. The use of touch also is complicated because by coming into physical contact with things, you potentially change the environment in a way that a camera doesn’t, says Ted Adelson, a professor of vision science at the Massachusetts Institute of Technology.

Robotics researchers say cameras work best when put close to a robot’s touch points to discern information like texture and hardness. That is the approach that GelSight, a tactile-sensing company co-founded by Prof. Adelson, takes to gathering touch information.

A light and a camera are placed behind a soft piece of material that changes shape when something presses against it. The camera captures the shift in the material and measures the changes down to the micrometer. This allows GelSight and other vision-based systems to handle something as delicate as an uncooked egg without breaking its shell.
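The idea behind this kind of vision-based touch can be sketched in a few lines. The snippet below is an illustrative toy, not GelSight’s actual pipeline (real systems recover micrometer-scale depth with techniques such as photometric stereo); it simply compares a reference frame of the gel with a frame taken during contact and reports where, and over how large an area, the material deformed.

```python
def contact_map(reference, pressed, threshold=0.05):
    """Estimate where an object touches a camera-based tactile pad.

    `reference` and `pressed` are grayscale frames (nested lists of
    floats in [0, 1]) of the gel before and during contact. We only
    flag pixels whose brightness visibly shifted, which is enough to
    localize the contact patch and gauge its size.
    """
    touched = [(r, c)
               for r, row in enumerate(pressed)
               for c, val in enumerate(row)
               if abs(val - reference[r][c]) > threshold]
    if not touched:
        return None  # no contact detected
    rows = [r for r, _ in touched]
    cols = [c for _, c in touched]
    return {
        "area_px": len(touched),  # size of the contact patch in pixels
        "centroid": (sum(rows) / len(rows), sum(cols) / len(cols)),
    }

# A synthetic press: a small bright square appears in the middle of the pad.
ref = [[0.0] * 32 for _ in range(32)]
press = [row[:] for row in ref]
for r in range(12, 20):
    for c in range(12, 20):
        press[r][c] = 0.3

print(contact_map(ref, press))
# {'area_px': 64, 'centroid': (15.5, 15.5)}
```

A real system would go further, mapping the deformation field to forces and surface geometry, but even this crude difference map shows how a camera behind soft material can stand in for a touch sensor.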

Still, camera-based systems have their limitations. Vision isn’t always helpful; think of searching a handbag for a set of keys at night. Frame rates might be too slow to capture something like an item slipping, and large objects can block the lens. Because of light refraction, handling glass items can also be challenging.

“Vision can help get things started and guide you into place,” Prof. Adelson says. “But the specifics of the object properties and exactly where it is and exactly where your hand is with respect to objects—those things are supplied by touch.”

Those limitations have researchers looking at creating sensors in skinlike technology that can be wrapped around parts of robots like fingers and hands.

“If you really want robots in a human environment, they need to have skin,” says Benjamin Tee, associate professor of materials science and engineering at the National University of Singapore. “I wouldn’t feel safe otherwise.”

One feature of electronic skin could let robots know if they were in physical contact with a person. With that data, robots could modulate their behavior—moving away from a person, slowing their speed, or reducing the force of their motions.
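A minimal sketch of that kind of modulation, assuming hypothetical pressure thresholds and a made-up `modulate_speed` helper rather than any real robot’s API:

```python
def modulate_speed(nominal_speed, pressure, light=0.5, firm=2.0):
    """Scale a robot's commanded speed based on skin pressure (newtons).

    Hypothetical policy for illustration only: move at full speed with
    no contact, slow proportionally under light contact, and stop once
    contact is firm so a higher-level planner can back away.
    """
    if pressure >= firm:
        return 0.0  # stop; let the planner retreat from the contact
    if pressure > light:
        # linearly ramp down between the light- and firm-contact thresholds
        return nominal_speed * (firm - pressure) / (firm - light)
    return nominal_speed

print(modulate_speed(1.0, 0.0))   # free motion: 1.0
print(modulate_speed(1.0, 1.25))  # light touch: 0.5
print(modulate_speed(1.0, 3.0))   # firm contact: 0.0
```

Production controllers are far more involved, but the shape of the policy is the same: skin readings feed directly into how fast, and how forcefully, the machine is allowed to move.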

Constant use will wear these skins down and will necessitate replacing or finding ways to repair them, says Prof. Tee. He and other researchers, he says, want to create something that lasts and can heal like human skin.

Real-world challenges

The percentage of touch-enabled robots outside academic settings is still in the single digits, says Prof. Tee. High development costs are a big hurdle. Industry insiders say that once a company decides a key problem can be solved by robots equipped with electronic skin, that decision should become a tipping point, helping drive down the cost of the technology.

“The difficulty or sort of the unknown at the moment is how do we use it?” says Peter Botticelli, vice president of SynTouch Inc., a robotics company based in Montrose, Calif. “What’s the application where everybody all of a sudden needs a robot with touch capabilities?”

Before answering that question, though, additional technology needs to be developed, including algorithms that combine touch data with other inputs. More-complicated robotic systems will also require a boost in power. While battery technology is improving, sophisticated robots will be stuck tethered to an outlet or relying on batteries that either need constant recharging or grow unwieldy in size.

“It’s a systems-integration project,” Mr. Botticelli says. “You can’t just build a sensor, you can’t just buy a motor, you can’t just put a camera on, and you can’t just write an algorithm. You’ve got to do all of the things and then pack them into the space and make it work and make it stable and give it some sort of reliability.”

And even if the data gets integrated and a power source is supplied, some touch data will be subjective and hard to interpret. It’s easy enough to know if something is hot enough to damage a robot, but other tactile signals aren’t always so easy to act on.

Soft robots

While most of these innovations are for hard, metal machines, touch might be easier to add to robots made mostly, or even entirely, of squishy, soft materials. Though a less-studied area of robotics, such machines would be more straightforward to build and would open new manufacturing possibilities, including 3-D printing.

On-demand printing, for example, could be used to create a helper for housework or to print a part to fix a current robot, says Zhenan Bao, a professor of chemical engineering at Stanford University in Palo Alto, Calif.

“Soft robots could be made by digital design and then just printed out as we need it,” Prof. Bao says.

Softness would mean a robot hand could envelop something, like a blob that wraps around an item, creating many points of contact, a simpler way to grasp than the precise movements of most robotic arms and pincers. Robots can require viewing thousands of examples to learn to do something; conforming hands would let them do more with less of the training time robots now need, according to Kris Dorsey, an associate professor of electrical and computer engineering at Northeastern University in Boston.

“It doesn’t have to have seen a bunch of coffee mugs; it can just conform around one,” Prof. Dorsey says.

Softness also could help ease the way for humans and robots to have more social relationships. Touch will be vital, says Wenzhen Yuan, an assistant professor in the Robotics Institute at Carnegie Mellon University in Pittsburgh. People will expect pats on the back, high-fives and hugs from robots, says Prof. Yuan. It’s an area that must be studied more before robots get integrated into our lives, she adds.

“Contact and touch is an important part of the relationship you have with your friends and family,” Prof. Yuan says. “You’ll want that with robots, too.”
