Brown University undergraduate Eric Rosen operates a Baxter robot using a virtual reality interface developed in Brown's Humans to Robots lab. Credit: Nick Dentamaro / Brown University

PROVIDENCE, R.I. [Brown University] — Even as autonomous robots get better at doing things on their own, there will still be plenty of circumstances where humans might need to step in and take control. New software developed by Brown University computer scientists enables users to control robots remotely using virtual reality, immersing them in a robot's surroundings even when they are miles away.

The software connects a robot's arms and grippers as well as its onboard cameras and sensors to off-the-shelf virtual reality hardware via the internet. Using handheld controllers, users can control the position of the robot's arms to perform intricate manipulation tasks just by moving their own arms. Users can step into the robot's metal skin and get a first-person view of the environment, or can walk around the robot to survey the scene in the third person — whichever is easier for accomplishing the task at hand. The data transferred between the robot and the virtual reality unit is compact enough to be sent over the internet with minimal lag, making it possible for users to guide robots from great distances.
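To make the architecture concrete, here is a minimal, hypothetical sketch in Python using ROS, the middleware that Baxter robots run. It simply forwards the pose of a handheld VR controller as the target pose for one of the robot's arms; the topic names and the direct pose relay are illustrative assumptions, not the team's released code.

    import rospy
    from geometry_msgs.msg import PoseStamped

    def relay(controller_pose, arm_pub):
        # Re-stamp the controller's pose and forward it as the arm's goal.
        controller_pose.header.stamp = rospy.Time.now()
        arm_pub.publish(controller_pose)

    if __name__ == "__main__":
        rospy.init_node("vr_teleop_relay")
        # Topic names below are assumptions for illustration.
        arm_pub = rospy.Publisher("/robot/left_arm/target_pose",
                                  PoseStamped, queue_size=1)
        rospy.Subscriber("/vive/left_controller/pose", PoseStamped,
                         relay, callback_args=arm_pub)
        rospy.spin()

A full controller would presumably also map the Vive's trigger or grip buttons to the robot's grippers and smooth or clamp incoming poses for safety.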

"We think this could be useful in any situation where we need some deft manipulation to be done, but where people shouldn't be," said David Whitney, a graduate student at Brown who co-led the development of the system. "Three examples we were thinking of specifically were in defusing bombs, working inside a damaged nuclear facility or operating the robotic arm on the International Space Station."

Whitney co-led the work with Eric Rosen, an undergraduate student at Brown. Both work in Brown's Humans to Robots lab, which is led by Stefanie Tellex, an assistant professor of computer science. A paper describing the system and evaluating its usability was presented this week at the International Symposium on Robotics Research in Chile.

Even highly sophisticated robots are often remotely controlled using some fairly unsophisticated means — often a keyboard or something like a video game controller and a two-dimensional monitor. That works fine, Whitney and Rosen say, for tasks like driving a wheeled robot around or flying a drone, but can be problematic for more complex tasks.

"For things like operating a robotic arm with lots of degrees of freedom, keyboards and game controllers just aren't very intuitive," Whitney said. And mapping a three-dimensional environment onto a two-dimensional screen could limit one's perception of the space the robot inhabits.

Whitney and Rosen thought virtual reality might offer a more intuitive and immersive option. Their software links a Baxter research robot to an HTC Vive, a virtual reality system that comes with handheld controllers. The software uses the robot's sensors to create a point-cloud model of the robot and its surroundings, which is transmitted to a remote computer connected to the Vive. Users see that space in the headset and can virtually walk around inside it, while live high-definition video from the robot's wrist cameras provides detailed views of the manipulation task at hand.
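One way to keep such a point-cloud stream light enough for an ordinary internet connection is to thin the cloud before transmission. The sketch below, again Python with ROS, republishes every fourth point of an incoming cloud; the thinning factor and topic names are assumptions for illustration, and the team's actual compression scheme may differ.

    import rospy
    from sensor_msgs.msg import PointCloud2
    from sensor_msgs import point_cloud2

    KEEP_EVERY = 4  # assumed thinning factor; tune to the available bandwidth

    def thin_cloud(cloud, pub):
        # Read all valid points, keep every Nth one, and republish.
        points = list(point_cloud2.read_points(cloud, skip_nans=True))
        pub.publish(point_cloud2.create_cloud(cloud.header, cloud.fields,
                                              points[::KEEP_EVERY]))

    if __name__ == "__main__":
        rospy.init_node("cloud_thinner")
        pub = rospy.Publisher("/remote/points_thin", PointCloud2, queue_size=1)
        rospy.Subscriber("/camera/depth/points", PointCloud2,
                         thin_cloud, callback_args=pub)
        rospy.spin()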

For their study, the researchers showed that they could create an immersive experience for users while keeping the data load small enough to be carried over the internet without a distracting lag. A user in Providence, R.I., for example, was able to perform a manipulation task, stacking plastic cups one inside another, using a robot 41 miles away in Cambridge, Mass.

In additional studies, 18 novice users completed the cup-stacking task 66 percent faster in virtual reality than with a traditional keyboard-and-monitor interface. Users also reported enjoying the virtual reality interface more and found the manipulation tasks less demanding than they did with keyboard and monitor.

Rosen attributes the users' increased speed to the intuitiveness of the virtual reality interface.

"In VR, people can just move the robot like they move their bodies, and so they can do it without thinking about it," Rosen said. "That lets people focus on the problem or task at hand without the increased cognitive load of trying to figure out how to move the robot."

The researchers plan to continue developing the system. The first iteration focused on a fairly simple manipulation task with a robot that was stationary in the environment. They'd like to try more complex tasks and later combine manipulation with navigation. They'd also like to experiment with mixed autonomy, where the robot does some tasks on its own and the user takes over for other tasks.

The researchers have made the system freely available on the web. They hope other robotics researchers might give it a try and take it in new directions of their own.