
Tough love! New research shows that robot AI may learn more effectively when people offer physical resistance instead of collaboration

  • Robots had difficulty learning how to grasp new objects when left to rely on their AI alone
  • With a human who tried to pull the object away, the robot learned more effectively
  • The robot AI learned much less effectively from cooperative behavior

Robots can learn more effectively from human resistance than human cooperation.

That is one possible conclusion from a recent experiment at the University of Southern California's Viterbi School of Engineering.

A group of researchers investigated how a robot arm could learn to adjust its grip to objects of different sizes and weights.

A team of USC researchers discovered that a robotic arm learned how to grasp objects better while a human subject tried to pull them out of its grip

They compared how the robot arm performed when learning on its own with how it performed when a person tried to pull the object out of its grip.

Surprisingly, the robots learned to adjust their grip much faster and more effectively with a human opponent involved than when they were left alone to figure things out with their own AI.

The robots exposed to human opponents were also better able to generalize to new and unfamiliar objects, researcher and co-author Stefanos Nikolaidis told Wired.

With human resistance, the robots were able to successfully hold objects 52 percent of the time.

When the robots had no human resistance, they established a successful grip on new objects only 26.5 percent of the time.

The robotic arm (pictured above) learned to hold new objects better after being exposed to resistance, while a robotic arm that was not exposed to resistance and relied only on its AI performed less well

The research team designed the entire experiment as a simulation, rather than as a physical encounter.

The human opponents were real, but they used a computer mouse to interact with the 3D program that modeled both the object and the robot arm.

Using the mouse, the human users could try to pull the object away from the robot in one of six different directions.
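
The setup can be illustrated with a small toy training loop. The sketch below is a hypothetical illustration of the general idea, not the USC team's actual system: on each trial an adversary tugs the grasped object along one of six directions, and only failed grasps produce a learning signal, so the unopposed version never gets the feedback it needs to improve. All function names, forces and learning steps here are invented placeholders.

```python
import random

# Hypothetical sketch of adversarial grasp training (placeholder names and
# numbers; not the USC team's code). An adversary tugs the object along one
# of six directions; the grasp is only strengthened when it fails.

DIRECTIONS = ["+x", "-x", "+y", "-y", "+z", "-z"]  # six possible pull directions

def grip_survives(grip_strength, pull_force):
    """Crude physics stand-in: the grip holds if it is stronger than the tug."""
    return grip_strength > pull_force

def train(episodes=1000, adversarial=True):
    # Per-direction grip strength, standing in for a learned grasp policy.
    strength = {d: 0.5 for d in DIRECTIONS}
    successes = 0
    for _ in range(episodes):
        direction = random.choice(DIRECTIONS)
        # With an adversary, the tug is strong and unpredictable;
        # without one, only a weak gravity-like pull is applied.
        pull = random.uniform(0.4, 1.2) if adversarial else 0.2
        if grip_survives(strength[direction], pull):
            successes += 1
        else:
            # Failure under resistance is the learning signal:
            # firm up the grip along the direction that just failed.
            strength[direction] += 0.05
    return successes / episodes

if __name__ == "__main__":
    print("with resistance:   ", train(adversarial=True))
    print("without resistance:", train(adversarial=False))
```

In this toy version the unopposed arm never fails, so it never updates its grip, while the opposed arm fails early, strengthens its grasp and ends up resisting much larger tugs, which loosely mirrors the pattern the researchers describe.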

"People will not always work together with their robotic counterparts," the team concludes.

"This work shows that this is not necessarily a bad thing from a learning perspective."

The experiment was conducted as a 3D simulation, where users interacted with a digital model of a robot arm by using a computer mouse to indicate in which direction they wanted to try to pull the object out of the robot's grip

In the past, researchers have experimented with ways to train robotic arms to serve people.

Researchers at the Royal Melbourne Institute of Technology developed a chest-mounted robot arm designed to feed its users.

The system was equipped with a camera that recorded the facial expressions of its wearers to learn their taste preferences.

The robot arm would then adjust its choices about which food to bring to its users' lips.

Last year, a coffee shop in Japan debuted a robot arm as a barista, designed to dynamically adjust its grip to handle soft cardboard cups filled with hot liquid.

The robot arm can also measure and grind beans, add the grounds to a paper filter and pour hot water into a cup.

HOW DOES A ONE-ARMED ROBOT MAKE AND SERVE COFFEE?

Sawyer is a one-armed robot that, according to its maker Rethink Robotics, is designed to perform tasks that cannot be practically automated with traditional industrial robots.

At the new Henna Cafe in Shibuya, Tokyo's business and shopping district, that task is serving coffee.

A cup of brewed coffee served by Sawyer costs 320 yen ($3) and takes a few minutes

It grinds the coffee beans, fills a filter and pours hot water into a paper cup, serving up to five people at a time.

Sawyer can also operate an automated machine for six other hot drinks, including cappuccino, hot chocolate and green tea latte.

Sawyer's arm has seven degrees of freedom and a reach of 1,260 mm.

This allows it to move into tight spaces that are typically designed for people – such as behind the counter in a cafe.

It can pick up the cups with its tong-like gripper.
