Sci-fi writers love to scare readers with tales of robots becoming sentient and taking over the world. The reality is a little more comical. Consider the self-driving Las Vegas bus that, two hours into its maiden voyage, failed to avoid a truck whose human driver mistakenly backed into it. Or the Roomba that neglected to account for the pile of fresh dog poop on a living room carpet in Little Rock, Arkansas. Or the security robot that tumbled into the fountain in the lobby of its office building in Washington, D.C.
Enter roboticist John Hoare and his colleagues Kori Macdonald, Steve Gray and Justin Foehner at GE Global Research in Niskayuna, New York. They have a fix: Develop powerful robots that can take us to places we can’t reach and do things that we can’t do, but leave us Earthlings very much in control.
“Telerobotics,” they call it. The robotics team at Global Research has gained attention for designing wee robots that can slip into some very small spaces. But its bigger achievement may be figuring out how, as Hoare puts it, to “use virtual reality to teleport into that robot’s working environment, and control it.”
(Gray and his colleague Shiraj Sen took second place last year in NASA’s $1 million competition seeking solutions for programming and controlling the agency’s R5 Valkyrie robots, which are designed to operate on Mars.)
[Video available at: https://www.facebook.com/GE/videos/1576830992385189/]
Why keep humans in the loop? Hoare says the problem with autonomous robots is that they “are very good nowadays at doing things over and over again, repeated things. What they’re not good at is dealing with new data, with things they haven’t seen before, things they haven’t been pre-programmed to do.” And AI good enough to fill that gap remains a long way off.
The world needs robots endowed with “the abilities of the human, the knowledge and intuitiveness of the human.” That’s what the robotics team members are giving us now. They’re thinking far beyond ordinary remote control. They want telerobot operators to “feel” they are genuinely at the wheel. It’s critical, Hoare explains, that “the human being obtains the surrounding image of the environment, so the human can operate in it.”
Some of their telerobots can fit in the palm of your hand. Their small size, explains GE Global Research principal scientist Don Lipkin, will allow humans to go into an engine or a turbine and make repairs without taking it offline, a process that costs time and money.
Ideal tasks for the first generation of telerobots fit a certain profile: simple yet complex, Hoare says — simple to perform, but complicated by circumstance. Think of a failing gas-plant valve that must be manually turned open or closed. Things can get enormously expensive if the valve is in a spot that a maintenance crew has trouble reaching. So Hoare’s team has been working on a telerobot designed to turn a valve.
Sure, telerobots can be fitted with tiny video cameras. But a more powerful way for the robots and their operators to see involves lidar — a form of radar that relies on laser light instead of radio waves. The telerobot uses lidar to precisely map out the contours of its surroundings and then relays a 3D image to its operator, who is wearing VR goggles. “They can move around in it,” Hoare explains. “They can get different views of that environment, very naturally, by just walking around and moving their head location.”
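The mapping step Hoare describes boils down to turning lidar range returns into 3D points. Here is a minimal sketch of that conversion; the tuple layout (azimuth, elevation, range) and the function name are illustrative assumptions, not GE’s actual software, and real lidar drivers expose richer data.

```python
import math

def lidar_to_points(scans):
    """Convert lidar returns into Cartesian (x, y, z) points.

    Each return is a hypothetical (azimuth, elevation, range) tuple,
    with angles in radians and range in meters. The resulting point
    cloud is what a VR viewer would render for the operator.
    """
    points = []
    for azimuth, elevation, r in scans:
        x = r * math.cos(elevation) * math.cos(azimuth)
        y = r * math.cos(elevation) * math.sin(azimuth)
        z = r * math.sin(elevation)
        points.append((x, y, z))
    return points

# A single return straight ahead at 5 meters maps to (5, 0, 0):
print(lidar_to_points([(0.0, 0.0, 5.0)])[0])
```

In practice the point cloud would be streamed to the headset and re-rendered as the operator walks around, which is what lets them “get different views of that environment.”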
Next, with “controllers in their hands that they can use to grasp things in their virtual representation,” he says, they can manipulate the robot’s hand in the field. Crucially, this system even allows an operator to rehearse the robot’s next moves virtually before giving it a new command. “Once the human sees what the robot’s going to do and is happy with it, the human can tell the robot, ‘Yes, go ahead,’” Hoare says. “It’s safe. It’s predictable.”
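The “preview, then approve” pattern Hoare describes can be sketched as a small control loop: simulate the command, show the operator the predicted result, and touch the hardware only after a yes. The simplified pose model and function names below are assumptions for illustration, not GE’s system.

```python
def preview(command, pose):
    """Simulate a command against the robot's current pose without
    moving hardware. Hypothetical simplified model: a pose is an
    (x, y) position and a command is a (dx, dy) translation."""
    dx, dy = command
    return (pose[0] + dx, pose[1] + dy)

def execute_if_approved(command, pose, approve):
    """Show the operator the predicted outcome; act only on approval."""
    predicted = preview(command, pose)
    if approve(predicted):   # the human stays in the loop
        return predicted     # stand-in for sending the move to hardware
    return pose              # rejected: the robot does not move

# Operator policy: approve any move that stays within 1 meter of home.
new_pose = execute_if_approved(
    (0.5, 0.0), (0.0, 0.0),
    approve=lambda p: abs(p[0]) <= 1.0 and abs(p[1]) <= 1.0)
print(new_pose)
```

The design choice is the point: because the robot only ever executes a move the human has already seen simulated, the system stays, in Hoare’s words, “safe” and “predictable.”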
Telerobots are also flexible. With a human brain in the loop, the same robot becomes capable of doing many tasks in various places. And what’s truly exciting is the ability to take humans to new places previously unreachable. That’s bordering on superhuman.
>> This article by Stephane Fitch was re-posted from GE Reports, Jan 2, 2018