Oct 8, 2016

Oculus Rift, Untethered: Project Santa Cruz, Hands On

The holy grail of VR HMDs is a completely untethered experience, free of both wires and a companion PC. Oculus is making one: Called Project Santa Cruz, it’s a version of the Rift that has its own computer on board, and we got a chance to experience it firsthand at Oculus Connect 3.

What It Is

To be clear, this is not simply a wireless version of the Rift; that is, they did not replace the cable with a wireless solution. Instead, Project Santa Cruz is a self-contained VR HMD with its own computer on board. Therefore, it’s “wireless” in the same way that Microsoft’s HoloLens is wireless, in the same way that a smartphone is wireless.

The whole system is inherently mobile, and it offers six degrees of freedom (6DoF), which is a huge deal. The Gear VR, for all its strengths, offers only 3DoF. 6DoF is crucial because it adds positional tracking: you can physically walk around, and the virtual environment responds accordingly.
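To make that distinction concrete, here’s a minimal sketch--ours, not anything from the Oculus SDK--of the data a 3DoF pose versus a 6DoF pose carries:

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Orientation only (pitch/yaw/roll) -- all an IMU-only HMD provides."""
    orientation: tuple  # quaternion (w, x, y, z)

@dataclass
class Pose6DoF:
    """Orientation plus position -- what positional tracking adds."""
    orientation: tuple  # quaternion (w, x, y, z)
    position: tuple     # (x, y, z) in meters from the tracking origin

# With 3DoF, physically walking forward changes nothing on screen; the
# camera transform is built from orientation alone. With 6DoF, the same
# step translates the virtual camera, so the world responds as expected.
```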

Oculus would not allow us to take pictures or video, and the demonstrators would not divulge much in the way of specifics. However, based on our direct observations, our experience using the headset, and details inferred from our conversations, we have a picture of what Project Santa Cruz is and how it’s built.

First of all, it is an Oculus Rift; it’s just been added onto. There is a computer on board, attached to the rear headstrap. We say “computer,” not “PC,” because Oculus would not divulge what, exactly, the computer entailed. However, we’re reasonably certain that this is essentially mobile hardware.

Although we could not see the processor itself, we could see some of the PCB as well as its general size. It’s just a few inches square (rectangular, actually)--too small for a socketed CPU, and there’s no room for discrete RAM, for example. Further, the person in charge of the project runs mobile for Oculus, so it makes sense that he would be more likely to employ a high-end mobile SoC (such as that found inside the Samsung Galaxy phones that power the Gear VR) over a “mobile” PC chip like Intel Cherry Trail. Then there are John Carmack's comments (further down the page). Whatever it is, it’s covered by a copper heatsink and employs a fan.

We therefore believe that the operating system--and there is an operating system--is probably Android, or a version of it.

The (probably custom) PCB also has three USB ports (at least two of which are USB 3.0) and an HDMI port. Velcroed under the makeshift computer is what appeared to be a rechargeable Li-ion battery pack.

Note that Oculus representatives repeatedly said that this was a prototype and the final design and its actual components, including the cooling elements, are very much a work in progress. The company takes prototyping and testing on live subjects (read: us) pretty seriously.

There are no external trackers involved with Project Santa Cruz, no Constellation cameras. Instead, Oculus employed inside-out tracking. It added four cameras, one on each corner of the front of the HMD. These are not special IR cams or anything--they’re just the sort of cameras you’d find in a smartphone. We do not know the resolution. So how does this Rift perform tracking? The biggest clue is that the two fellows running the demo are computer vision guys. The cameras provide a certain amount of data, which is piped via HDMI to the SoC, and then there are the typical IMUs you’d find inside most smartphones and HMDs. The software performs sensor fusion on the camera and IMU data--no outside help required.
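Oculus did not describe its algorithm, but the general shape of camera/IMU fusion is well known: integrate the fast-but-drifting IMU readings at a high rate, and periodically pull the estimate back toward position fixes recovered from the camera feeds. A minimal sketch, with invented rates and blend factor:

```python
import numpy as np

class InsideOutTracker:
    """Generic camera/IMU fusion sketch, position only. Not Oculus's code."""

    def __init__(self, blend=0.02):
        self.position = np.zeros(3)   # meters
        self.velocity = np.zeros(3)   # m/s
        self.blend = blend            # weight given to each camera fix

    def imu_update(self, accel_world, dt):
        """High-rate path (hundreds of Hz): dead-reckon from accelerometer
        data. This drifts quickly on its own -- hence the cameras."""
        self.velocity += accel_world * dt
        self.position += self.velocity * dt

    def camera_update(self, vision_position):
        """Low-rate path (camera frame rate): a position estimate the
        computer-vision pipeline recovered from the four camera feeds.
        Nudge the drifting IMU estimate toward it."""
        error = vision_position - self.position
        self.position += self.blend * error
```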


What It’s Like

It didn’t occur to me until after the demo, but the added weight of the computer and battery pack was barely noticeable.

It’s hard to describe how bizarre it felt to be using a Rift with no wires. It felt almost too free, which was actually a little disorienting. There were two issues I discovered:

First, although I could move about freely in the virtual world, my hands, arms, and legs were not present, which threw me way off. This is a common issue in many VR experiences; it’s disconcerting to your brain when your body moves but you have no corresponding visual cues.

Second, it occurred to me that although we’ve all been waiting eagerly for the day when we could get rid of that pesky cable running down our backs, the cable actually provides important tactile feedback that helps your brain orient your body within a physical space, even as your visual system is immersed in a virtual one. (This is not to say that I pine for the cable’s return--simply that without the cable and without any sort of body tracking, the untethered 6DoF experience is bizarre.)

The demo took me into two virtual spaces. One was a mostly empty area that felt like the launch bay of a spaceship. Nothing happened in there, other than that I was able to walk around a little and test the boundaries of the Guardian system. (Yes, this untethered Rift already has the Guardian system.)
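Oculus hasn’t detailed Guardian’s internals, but the core job is easy to sketch: given a user-defined boundary polygon on the floor, warn when the headset comes within some threshold of an edge. All names and values below are our own illustration:

```python
import math

def point_to_segment_distance(p, a, b):
    """Distance from point p to segment a-b; all are (x, z) floor coords."""
    ax, az = a; bx, bz = b; px, pz = p
    dx, dz = bx - ax, bz - az
    length_sq = dx * dx + dz * dz
    if length_sq == 0:
        return math.hypot(px - ax, pz - az)
    t = max(0.0, min(1.0, ((px - ax) * dx + (pz - az) * dz) / length_sq))
    return math.hypot(px - (ax + t * dx), pz - (az + t * dz))

def guardian_warning(head_xz, boundary, threshold=0.4):
    """True if the headset is within `threshold` meters of the boundary,
    i.e. time to fade in the warning grid."""
    n = len(boundary)
    return any(
        point_to_segment_distance(head_xz, boundary[i], boundary[(i + 1) % n])
        < threshold
        for i in range(n)
    )

boundary = [(0, 0), (2, 0), (2, 2), (0, 2)]            # a 2 m x 2 m space
print(guardian_warning((1.0, 1.0), boundary))          # False: room center
print(guardian_warning((1.9, 1.0), boundary))          # True: near an edge
```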

Within a couple of minutes, I was transported to a cartoonish scene. I was standing in a yard, with shrubs and flowers around me. Just beyond the grass was a street, and beyond that was a cliff and the ocean. I was able to walk around my small yard, but I couldn't interact with anything.

Suddenly, a UFO descended. I looked up to see the tractor beam glowing down at me.

The scene faded to and from black, and the UFO dropped me on a rooftop patio above the yard. I could step close to the edge of the roof (my brain screaming “Danger!” the whole time), but the Guardian system popped up to keep me from toppling over the virtual edge (and from bumping into a physical couch in the real room).

The UFO lit me up again, and the demo ended.

I found that even though I knew the Guardian system would tell me when I was near a wall or object, I moved slowly, with small steps and trepidation. That would all be ameliorated if the system were not just detecting objects but also showing them to me, and if that same detection system gave me a facsimile of my limbs, or at least my hands.

It's not clear whether that will be possible with Project Santa Cruz. On one hand, the camera data should make object and hand tracking straightforward. On the other hand, the software would have to stitch the images from four cameras in real time, which sounds like an awful lot of computation for a system that already has much to do. But, then again, Oculus could do what Intel is doing with Project Alloy and RealSense, which is to offload sensor data processing from the SoC entirely. However, because Project Santa Cruz appears to be performing its sensor fusion in software on the SoC, that may not work.
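For a sense of scale--the actual camera resolution and frame rate are unknown, so the numbers below are pure assumption--here’s the raw pixel throughput such a system might have to ingest:

```python
# Back-of-envelope only: per-camera specs are not public, so these are
# hypothetical figures chosen purely for illustration.
width, height = 640, 480   # assumed per-camera resolution
fps = 30                   # assumed frame rate
cameras = 4

pixels_per_second = width * height * fps * cameras
print(f"{pixels_per_second / 1e6:.0f} Mpix/s")  # ~37 Mpix/s to ingest
# ...and that is before any feature extraction or pose estimation, on a
# mobile SoC that is simultaneously rendering a stereo VR scene.
```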

What It’s Not

Whatever Project Santa Cruz is, it’s not just a muscled-up version of the Gear VR. Remember, Gear VR is sort of a “dumb” device that relies on a smartphone for its display and processing. Project Santa Cruz is a complete system that is also fundamentally an Oculus Rift. Further, the Gear VR offers just 3DoF, whereas this new prototype has 6DoF.

It’s also not being used in the same way the Rift is normally used. The Oculus Rift and HTC Vive are not systems; they're essentially mega-peripherals that rely on powerful PCs and external sensors. Project Santa Cruz does its own sensor work and is totally portable. Forget roomscale; Project Santa Cruz is a world-scale VR HMD.

What It Means

“Inside-out tracking” is a downright buzzword right now. We heard it all over OC3, and elsewhere. What is happening is that the XR world is evolving, and quickly. As users began to pine for the ability to move more within VR, we needed roomscale tracking. Now we have it. We needed tracked hand controllers. Now we have them. To have a safety system as we shuffled around in VR, we needed Guardian. Now we have it.

The next need is untethering our HMDs while maintaining 6DoF--in whatever form that takes--but that requires solving a host of problems. Inside-out tracking is one of those problems, and Oculus is on its way to solving it. There was no discussion of how controllers fit into this mix--at least in terms of how they would be tracked. That's another problem for another day.

What Carmack Says

On stage at the closing keynote of OC3, John Carmack expounded a bit more on Project Santa Cruz.

He seemed to confirm what we suspected--that it runs on fundamentally mobile hardware--but he discussed how Project Santa Cruz can get more from a mobile SoC than a phone can. For example, phones are phones first: many are provided by carriers, they have baseband processing that hogs resources, they carry bloatware (some of it from carriers), and they have much happening in the background, like Wi-Fi and GPS. All of these things consume power, and many of them cause thermal problems when you’re trying to render video. VR engineers, he said, can make use of only about one third of the total power resources of the SoC on a phone.

However, with the mobile SoC and DSP extracted completely from a phone use case, all that power is freed up and can be dedicated to the VR experience. You can also--as we saw--cool the chips with a heatsink and fan, or eventually other, smaller mechanisms. In other words, a dedicated mobile processing pipeline just for VR--something akin to a gaming console. In fact, Carmack made that comparison, extolling the performance virtues of having a single, immovable, predictable target for developers.
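To put Carmack’s one-third figure in rough numbers--with a hypothetical SoC power budget, since Oculus disclosed none:

```python
# Back-of-envelope only: the "about one third" is Carmack's keynote figure;
# the SoC power budget below is our hypothetical number for illustration.
soc_budget_watts = 4.0                  # assumed sustained SoC budget
usable_in_phone = soc_budget_watts / 3  # per Carmack: ~1/3 available to VR

print(f"In a phone: ~{usable_in_phone:.1f} W for VR")
print(f"Standalone, actively cooled: up to ~{soc_budget_watts:.1f} W")
# Dropping the baseband, bloatware, and background radios roughly triples
# the power (and thermal headroom) available to rendering.
```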
