It’s the end of October, when the days have already grown short in Redmond, Washington, and gray sheets of rain are just beginning to let up. In several months, Microsoft will unveil its most ambitious undertaking in years, a head-mounted holographic computer called Project HoloLens. But at this point, even most people at Microsoft have never heard of it. I walk through the large atrium of Microsoft’s Studio C to meet its chief inventor, Alex Kipman.


The headset is still a prototype being developed under the codename Project Baraboo, or sometimes just “B.” Kipman, with shoulder-length hair and severely cropped bangs, is a nervous inventor, shifting from one red Converse All-Star to the other. Nervous, because he’s been working on this pair of holographic goggles for five years. No, even longer. Seven years, if you go back to the idea he first pitched to Microsoft, which became Kinect. When the motion-sensing Xbox accessory was released, just in time for the 2010 holidays, it became the fastest-selling consumer gaming device of all time.
Right from the start, he makes it clear that Baraboo will make Kinect seem minor league.

 

 

Kipman leads me into a briefing room with a drop-down screen, plush couches, and a corner bar stocked with wine and soda (we abstain). He sits beside me, then stands, paces a bit, then sits down again. His wind-up is long. He gives me an abbreviated history of computing, speaking in complete paragraphs, with bushy, expressive eyebrows and saucer eyes that expand as he talks. The next era of computing, he explains, won’t be about that original digital universe. “It’s about the analog universe,” he says. “And the analog universe has a fundamentally different rule set.”

Translation: you used to compute on a screen, entering commands on a keyboard. Cyberspace was somewhere else. Computers responded to programs that detailed explicit commands. In the very near future, you’ll compute in the physical world, using voice and gesture to summon data and layer it atop physical objects. Computer programs will be able to digest so much data that they’ll be able to handle far more complex and nuanced situations. Cyberspace will be all around you.

What will this look like? Well, holograms.

 

First Impressions

 

That’s when I get my first look at Baraboo. Kipman cues a concept video in which a young woman wearing the slate gray headset moves through a series of scenarios, from collaborating with coworkers on a conference call to soaring, Oculus-style, over the Golden Gate Bridge. I watch the video, while Kipman watches me watch the video, while Microsoft’s public relations executives watch Kipman watch me watch the video. And the video is cool, but I’ve seen too much sci-fi for any of it to feel believable yet. I want to get my hands on the actual device. So Kipman pulls a box onto the couch. Gingerly, he lifts out a headset. “First toy of the day to show you,” he says, passing it to me to hold. “This is the actual industrial design.”

Oh Baraboo! It’s bigger and more substantial than Google Glass, but far less boxy than the Oculus Rift. If I were a betting woman, I’d say it probably looks something like the goggles made by Magic Leap, the mysterious Google-backed augmented reality startup that has $592 million in funding. But Magic Leap is not yet ready to unveil its device. Microsoft, on the other hand, plans to get Project HoloLens into the hands of developers by the spring. (For more about Microsoft and CEO Satya Nadella’s plans for Project HoloLens, read WIRED’s February cover story.)

Kipman’s prototype is amazing. It amplifies the special powers that Kinect introduced, using a small fraction of the energy. The depth camera has a field of vision that spans 120 by 120 degrees—far more than the original Kinect—so it can sense what your hands are doing even when they are nearly outstretched. Sensors flood the device with terabytes of data every second, all managed with an onboard CPU, GPU and first-of-its-kind HPU (holographic processing unit). Yet, Kipman points out, the computer doesn’t grow hot on your head, because the warm air is vented out through the sides. On the right side, buttons allow you to adjust the volume and to control the contrast of the hologram.

 


Tricking Your Brain

 

Project HoloLens’ key achievement—realistic holograms—works by tricking your brain into seeing light as matter. “Ultimately, you know, you perceive the world because of light,” Kipman explains. “If I could magically turn the debugger on, we’d see photons bouncing throughout this world. Eventually they hit the back of your eyes, and through that, you reason about what the world is. You essentially hallucinate the world, or you see what your mind wants you to see.”

To create Project HoloLens’ images, light particles bounce around millions of times in the so-called light engine of the device. Then the photons enter the goggles’ two lenses, where they ricochet between layers of blue, green and red glass before they reach the back of your eye. “When you get the light to be at the exact angle,” Kipman tells me, “that’s where all the magic comes in.”

Thirty minutes later, after we’ve looked at another prototype and some more concept videos and talked about the importance of developers (you always have to talk about the importance of developers when launching a new product these days), I get to sample that magic. Kipman walks me across a courtyard and through the side door of a building that houses a secret basement lab. Each of the rooms has been outfitted as a scenario to test Project HoloLens.

A Quick Trip to Mars

 

The first is deceptively simple. I enter a makeshift living room, where wires jut from a hole in the wall where a light switch should be. Tools are strewn on the West Elm sideboard just below it. Kipman hands me a HoloLens prototype and tells me to install the switch. After I put on the headset, an electrician pops up on a screen that floats directly in front of me. With a quick hand gesture I’m able to anchor the screen just to the left of the wires. The electrician is able to see exactly what I’m seeing. He draws a holographic circle around the voltage tester on the sideboard and instructs me to use it to check whether the wires are live. Once we establish that they aren’t, he walks me through the process of installing the switch, coaching me by sketching holographic arrows and diagrams on the wall in front of me. Five minutes later, I flip a switch, and the living room light turns on.

Another scenario lands me on a virtual Mars-scape. Kipman developed it in close collaboration with NASA rocket scientist Jeff Norris, who spent much of the first half of 2014 flying back and forth between Seattle and his Southern California home to help build it. With a quick upward gesture, I toggle from computer screens that monitor the Curiosity rover’s progress across the planet’s surface to the virtual experience of being on the planet. The ground is a parched, dusty sandstone, and so realistic that as I take a step, my legs begin to quiver. They don’t trust what my eyes are showing them. Behind me, the rover towers seven feet tall, its metal arm reaching out from its body like a tentacle. The sun shines brightly over the rover, creating short black shadows on the ground beneath its legs.


Norris joins me virtually, appearing as a three-dimensional human-shaped golden orb in the Mars-scape. (In reality, he’s in the room next door.) A dotted line extends from his eyes toward what he is looking at. “Check that out,” he says, and I squat down to see a rock shard up close. With an upward right-hand gesture, I bring up a series of controls. I choose the middle of three options, which drops a flag there, theoretically a signal to the rover to collect sediment.

After exploring Mars, I don’t want to remove the headset, which has provided a glimpse of a combination of computing tools that make the unimaginable feel real. NASA felt the same way. Norris will roll out Project HoloLens this summer so that agency scientists can use it to collaborate on a mission.

A Long Way Yet

 

Kipman’s voice eventually brings me back to Redmond. As I remove the goggles, he reminds me that it’s still early days for the project. This isn’t the kind of thing that will be, say, a holiday best seller. It’s a new interface, controlled by voice and gesture, and the controls have to work flawlessly before it will be commercially viable. I get that. I love voice controls, and I talk to Siri all the time. But half the time, she doesn’t give me a good answer and I have to pull up my keyboard to find what I’m looking for more quickly. Project HoloLens won’t have a keyboard. If the voice and gesture controls don’t work perfectly the first time, consumers will write it off. Quickly.

That said, there are no misfires during three other demos. I play a game in which a character jumps around a real room, collecting coins sprinkled atop a sofa and bouncing off springs placed on the floor. I sculpt a virtual toy (a fluorescent green snowman) that I can then produce with a 3-D printer. And I collaborate with a motorcycle designer Skyping in from Spain to paint a three-dimensional fender atop a physical prototype.

As I make my way through each, Kipman seems less nervous than when we began, but no less focused. It has been three hours since we met. In each scenario, he watches a screen that shows him what I am seeing, and he watches me trying to use his device for the first time. His eyebrows draw down in deep concentration as he checks to see if every calculation is perfect—noting the touch of my thumb and forefinger as I make an upward gesture, the words I reach for instinctively to instruct the computer. Seven years in, he is trying to see Project HoloLens as if for the first time. To see it through the eyes of a 30-something female New Yorker. But that is one thing his magical head-mounted holographic computer cannot do. At least not yet.

Source: Wired
Author: Jessi Hempel