Watch as two robot dogs work together to craft a burger courtesy of the University of Leeds
PhD students in the university's Real Robotics Lab produced the video of the robotic chefs, which were controlled by a single operator.
The operator used a gamepad to control the walking of the four-legged robotic dogs, while wearing a motion capture system to manipulate the robotic arms.
Dr Chengxu Zhou, lecturer in mobile robotics at the University of Leeds, said: "Not only did the operator need to remotely control both robots at the same time, but the robots also needed to handle a soft and flexible object - like a burger.
"This is actually a very, very challenging task in the robotics area. All the robots are very rigid. They are good at moving precisely from point A to point B, moving an object from here to there.
"Anything soft means you cannot precisely control the object, and it's very hard to model this softer material. Therefore, we want human intelligence in the loop. With a little bit of help from the human operator, the robots can work collaboratively to make a burger, to assemble even soft materials."
In the video, the two robots work to pick up the burger, place it in a pan, and then toss the pan. The burger is then taken from the pan and placed between a bun.
Dr Zhou's areas of expertise are humanoid robotics, legged locomotion, motion planning and shared autonomy, and he believes the world of robotics will make significant leaps in the future.
He said: "Because of the pandemic, we had to stay at home but still, we want to go outside. I imagine, in the next 20 to 30 years, we will have robot avatars that can go outside for us. We stay at home but the robot itself has the locomotion and manipulation capabilities, and can be controlled by us at home.
"This means we can go to not only one place, but several at the same time. We can send ourselves to a concert and the robot avatar represents us in this kind of scenario.
"Therefore, the remote control system framework helps us to investigate how we can map our human emotions and intentions into a robot. For example, the quadruped robot has four legs - how are we going to map our emotion into a four-legged robot?
"Robotic arms have different configurations - how can we map our emotions into those configurations? That's the problem we're trying to investigate at the University of Leeds."