The robot Internet is coming.
Researchers at five European universities realized that a major obstacle holding back the robot invasion is the size, weight, and cost of putting enough processing power and memory aboard household and industrial machines to solve complex real-world problems. That’s why they have been looking into pulling some of the brains out of robots and moving them into the cloud. There, bots could tap the computation, storage, and communications infrastructure of modern server farms, like those that power Google, Facebook, and Amazon.
They’ve created Rapyuta: The RoboEarth Cloud Engine, a World Wide Web for robots that gives the machines access to a giant network and database repository where they “can share information and learn from each other about their behavior and their environment,” the group says. “Rapyuta helps robots to offload heavy computation by providing secured customizable computing environments in the cloud…This allows robots to process data directly inside the computational environment in the cloud without the need for downloading and local processing.”
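The offloading pattern the group describes can be sketched in a few lines: the robot keeps only a thin client that serializes a task, hands it to a computing environment in the cloud, and applies the result, while the heavy computation runs server-side. This is a minimal illustrative sketch, not Rapyuta's actual API; the `CloudEnvironment` and `Robot` names, the JSON task format, and the toy path-planning stub are all assumptions for illustration.

```python
import json

class CloudEnvironment:
    """Stands in for a secured computing environment in the cloud.
    (Hypothetical class, not part of Rapyuta's real interface.)"""

    def process(self, request: str) -> str:
        task = json.loads(request)
        if task["task"] == "plan_path":
            # The heavy computation happens here, on the server side.
            # As a stub, return the straight-line midpoint as a waypoint.
            (x0, y0), (x1, y1) = task["start"], task["goal"]
            waypoint = [(x0 + x1) / 2, (y0 + y1) / 2]
            return json.dumps(
                {"waypoints": [task["start"], waypoint, task["goal"]]}
            )
        return json.dumps({"error": "unknown task"})

class Robot:
    """The robot itself only serializes requests and applies results;
    the expensive processing is offloaded."""

    def __init__(self, cloud: CloudEnvironment):
        self.cloud = cloud

    def navigate(self, start, goal):
        request = json.dumps({"task": "plan_path", "start": start, "goal": goal})
        response = json.loads(self.cloud.process(request))
        return response["waypoints"]

robot = Robot(CloudEnvironment())
print(robot.navigate([0, 0], [4, 2]))
```

In the real system the `process` call would cross the network to a server farm, which is exactly why the robot's onboard hardware can stay small and cheap.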