Oculus VR’s Brendan Iribe On The Latest Virtual Reality Prototype (Interview)
January 14, 2014
by Dean Takahashi
Oculus VR, the maker of the Oculus Rift virtual reality goggles, continues to lead a charmed life as it moves from an indie curiosity to a real gaming platform. The Irvine, Calif.-based company raised eyebrows again at the 2014 International CES with a cool new demo of its device.
Brendan Iribe, the chief executive of Oculus VR, showed me the prototype, dubbed Crystal Cove. The machine looks sturdier and more refined than its predecessor, and it’s a big reason why Oculus was able to raise $75 million in a funding round last month led by Andreessen Horowitz, one of the most powerful venture capital firms in Silicon Valley.
On the strength of the prototype and the credibility of John Carmack, the co-creator of Doom, who joined Oculus as its top tech guru, investor Marc Andreessen threw his support behind the company. I got a good look at Crystal Cove and found it to be a much better experience than the previous model. It didn’t make me seasick, I saw no motion blur, I could move my head to any position, and the high-definition graphics looked good.
Iribe said the company barely finished the prototype, and it received a lot of help from developers at Epic Games. Now the challenge is to take the prototype and the funding and make a real product. In Las Vegas last week, I caught up with Iribe and Dave DeMartini, a former Electronic Arts executive who has taken on the job of building a content ecosystem around the Oculus.
Here’s an edited transcript of our interview.
GamesBeat: Which version is this, that you’re showing?
Brendan Iribe: We’re showing the Crystal Cove prototype. If you remember back at [the Electronic Entertainment Expo] 2013, we showed the HD prototype. At that time, it was a single-feature prototype. The main feature we were showing was HD. Here, it’s got a few new features to it. We decided to expose the internal code name and reveal it as the Crystal Cove.
Two major new features. One is what many people were expecting us to show at some point, and hopefully confirm for the consumer product, which is positional tracking. That gives you translation, in addition to orientation. The original developer kit was orientation only. If you moved around, it was only rotating around – yaw, pitch, and roll. With positional tracking, you now get the additional three degrees of freedom, a combined total of six now.
You can now move left, right, forward, up, back, all around. It’s full head tracking. That makes it a much more comfortable experience. It also enables new gameplay. You could have something coming at your face and you have to dodge it. You can look down at things. It improves the experience. In our minds, it’s required for great virtual reality.
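The jump from three to six degrees of freedom that Iribe describes can be sketched as a simple head-pose structure. This is an illustrative sketch only; the names and units here are hypothetical and are not Oculus SDK types.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    # Orientation: the three rotational degrees of freedom
    # the original developer kit tracked.
    yaw: float    # degrees, turning left/right
    pitch: float  # degrees, tilting up/down
    roll: float   # degrees, tilting ear-to-shoulder
    # Position: the three translational degrees Crystal Cove adds.
    x: float = 0.0  # meters, leaning left/right
    y: float = 0.0  # meters, crouching/standing
    z: float = 0.0  # meters, leaning forward/back

# Dev kit 1 behavior: orientation only, position fixed at the origin.
dk1 = HeadPose(yaw=30.0, pitch=-10.0, roll=0.0)

# Crystal Cove behavior: leaning in toward an object changes position too,
# so dodging something or looking down at a table actually moves the view.
cove = HeadPose(yaw=30.0, pitch=-10.0, roll=0.0, x=0.05, z=-0.20)
```

With orientation-only tracking, leaning forward leaves the rendered viewpoint in place, which is part of what made the first dev kit uncomfortable; the extra three values are what let the renderer move the camera with your head.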
The second feature is somewhat of a breakthrough. It’s low persistence. There are a number of ways to describe this, and we’re still getting our heads around the best one. Essentially, we’re always trying to reduce latency. As you try to reduce the latency of the experience, you can only get it down so far before you start running into the limitations of game engines and of computing the intensity of the experience you’re trying to render. On the first dev kit, we were right around 50 to 60 milliseconds of latency. The prototype we’re showing here is at 30 milliseconds. But really, we want to get it down much lower.
One of the issues is, when you’re moving around you’re given a frame that’s computed based on your movement. Each time I’m given this frame, it’s correct for a very short amount of time before it becomes incorrect because I continue to move. Let’s say I’m here and I’m looking at something, and I’m moving. I get a new image, but I keep moving, and so now — for some number of milliseconds, the latency required to update the next frame – the image I’m looking at is stuck here, and it’s dragging along with me until I get a new image. Then it pops back. That’s full persistence, when you have the image persisting the full time as you’re moving.
Because we’re only going to be able to get motion latency down to 15 or 20 milliseconds, low persistence helps avoid that problem. When you get the image, it’s great for the first one or two milliseconds, and then instead of keeping it on the screen, we turn off the screen. Normally it would be good for a few milliseconds, then bad, bad, bad, then good again when you got another image, then bad. Now it’s good, then off, and then you get another good one, then off. You’re getting this image, and then it goes dark on the screen for the next 10 or 11 milliseconds until you get the new image. We do it fast enough, at a really high refresh rate, that you don’t see it. You can’t see the flicker that’s caused.
It’s the same latency you would have gotten. It’s just that the persistence of that bad image is no longer there. You could avoid low persistence if you could run the screen at a few thousand hertz and only have, say, one millisecond of time between frames. But that’s not practical to tell game developers, “Hey, if you want to make VR games, you have to run at 1,000 FPS.” We want to say, “You can make great virtual reality, and you only need to run at 40, 50, 60 on your game engine.” The rendering engine will need to run a little bit faster, in sync with the refresh rate of the screen. But it’s very practical. People shouldn’t have too hard a time with where they are today.
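The comfort win Iribe describes comes down to simple arithmetic: the perceived smear of a frame is roughly your head’s angular velocity multiplied by the time that stale frame stays lit. A back-of-the-envelope sketch, using the millisecond figures from the interview (the 75 Hz refresh rate and 120-degrees-per-second head turn are illustrative assumptions, not quoted specs):

```python
def smear_degrees(head_speed_deg_per_s: float, persistence_ms: float) -> float:
    """Approximate angular smear: how far the head sweeps while a
    single stale frame remains visible on the display."""
    return head_speed_deg_per_s * (persistence_ms / 1000.0)

head_speed = 120.0  # deg/s, a brisk head turn (assumed value)

# Full persistence at an assumed 75 Hz refresh: the frame stays lit
# for the whole ~13.3 ms frame period, dragging along with your head.
full = smear_degrees(head_speed, 1000.0 / 75.0)

# Low persistence: the frame is lit for ~2 ms, then the panel goes dark
# for the remaining ~11 ms until the next frame arrives.
low = smear_degrees(head_speed, 2.0)

print(f"full persistence smear: {full:.2f} deg")  # 1.60 deg
print(f"low persistence smear:  {low:.2f} deg")   # 0.24 deg
```

The latency between frames is unchanged, which matches Iribe’s point: low persistence doesn’t deliver frames faster, it just stops the out-of-date frame from smearing across your retina while you wait for the next one.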
GamesBeat: What problems does it address for the user?
Iribe: It makes it much more comfortable. Probably the biggest visual difference is that it eliminates motion blur. If you put on the headset in the past and looked around, you probably noticed that when you’re moving, everything blurs until you stop moving. Then you had to hold real still and look at something. Now, as you’re moving around and looking, you can focus on objects, especially things like text. You can focus on that, still move your head, and there’s no motion blur. It allows you to track objects in the scene in a way much closer to how you would in real life.
In real life, that’s how we’re moving around. We look at things while we’re walking and moving and turning around. We stare at objects in the world. This low persistence allows you to do that. It really does help to reduce the motion sickness aspect, the simulator sickness. When you combine positional tracking and low persistence, you get much closer to the holy grail of VR we’ve been waiting for, which is a comfortable VR experience.
GamesBeat: This is version three, then?
Iribe: This is the third iteration of a prototype. Probably we’d consider it the fourth publicly displayed prototype. The first was the duct tape model. The second was the developer kit we shipped. The third was the HD prototype. Now we have this one, with positional tracking and low persistence.
GamesBeat: Does this one take over as the latest dev kit?
Iribe: What we said before is that we want to do another developer kit just before the consumer release. Something that comes out that’s timed very closely and is very similar – almost identical – to that consumer hardware, so developers can start working with it and prep the content. We don’t want to ship a consumer product that’s very different from the previous developer kit, and then everyone has to scramble to upgrade.
It’s just like consoles. I remember the first Xbox kit was a Macintosh PC tower. It was obviously not the Xbox. [Laughs] Then, just before the console comes out, you get the actual kit that’s much closer to the box that ships. Everyone will be able to get their software compatible, so there will be a ton of software and apps and experiences for the consumer launch.
GamesBeat: Are you planning to internally address issues like creating new VR input devices?
Iribe: These things take a lot of time. There’s a lot of work involved. I don’t think we’re going to see super-low-latency, perfectly accurate hand tracking or finger tracking for a while. We’re certainly working on a lot of things internally. When we get closer to a point where we can say, “Yes, the next version will have this feature,” we’ll talk about it.
A lot of times, the internal R&D doesn’t pan out. You go down one route, you find that it doesn’t work the way you planned, and you have to switch and go down another one. We don’t want to talk about internal R&D until we’re confident we can ship it.
We would like to tackle input. We’ve often said that right now, VR vision is the first step. Really, it’s only half of the VR experience. You need to get VR input, but what is that? Is it a keyboard and a mouse, a game pad? It may be in the beginning, but in the longer term, there are going to be more natural input devices for VR.
The keyboard and mouse and game pad allow you to have these superhuman inputs, but they aren’t natural. It’s obvious, when you put on the Oculus VR headset — people put their hands up and they say, “When am I gonna see my hands?” It’s something we’re excited to have, and we’re working on it for the future. It’s a longer-term problem.
GamesBeat: Is this something Carmack has worked on, or was it too soon for him to be involved with?
Iribe: As soon as he joined, the day he signed up, he started writing code. He was probably thinking about things for quite a while. He’s working on something that we’re not disclosing yet. You can monitor his tweets and get some ideas. We’ve said that he’s spending a lot of time on the mobile side. We’re not ready to talk about what that means yet.
He’s the kind of guy that can make impossible things possible. We looked at today’s computer, wanting more power for VR, and he said, “Well, what about today’s mobile device?” Many of us didn’t think it was capable enough, and he said, “No, let me take a crack at it.” We’re not ready to talk about what he’s doing yet, but he’s up to some incredible stuff.
GamesBeat: I think you said that Marc Andreessen came in recently and was impressed with a demo. Was this that demo?
Iribe: This is part of it. Again, there are some things in the lab that we’re not showing yet. If you saw the demo he experienced, it was definitely not publicly viewable. It was an internal prototype. But what he saw, it had low persistence and positional tracking. It showed the level of comfort and the quality of the experience we’re going to be able to deliver. He tried the demo, came out, and was pretty quick to partner with us. Talking to Carmack helped him as well. He was able to do all that on the same day, which I’m sure was a pretty exciting day.
GamesBeat: Has David DeMartini been earning his paycheck so far?
Iribe: [Laughs] It’s always nice to bring somebody on board when there’s a bit of a backlog of things to do. David came on board at a time when we didn’t have anyone talking to the community about publishing, about how we’re going to help seed the ecosystem. We had anticipated raising a significant round, series B, and we knew we wanted to use some of that round to invest into the ecosystem in a number of ways.
We needed someone at the helm to steer that and keep us out of trouble. Oculus is a company that often does things differently. But we don’t want to do things so differently that we start to get into trouble. David has an incredible amount of experience and a Rolodex of contacts for doing publishing deals from his time at EA. It’s been enlightening to have someone with his expertise on board.
It was the same thing we did with our COO, Laird [Malamed]. He came on board with a lot of experience shipping product. It’s incredible, when it comes to logistics and supply chain operations, what Laird has been able to do. David is the same. We need some veterans to come on and make sure the ship steers in the right direction.
David DeMartini: The team, based on how they’ve approached everything, has a better relationship with the community than any team I’ve ever been involved with, including the most successful franchises at EA. You don’t want to do anything to spoil that mix. You want the team to be the way the team is. You don’t want them to get all corporate and change at all. They’re always honest. The blog posts are very truthful and direct.
We’re starting to see, after a groundswell of hobbyists and enthusiasts, a tremendous number of triple-A developers on the gaming side whose attention has been caught by the latest round of changes. With all the pressure coming from the community, you’re going to see a lot of large developers who want their content to experience the full immersion that you get when you’re on our platform.
All this stuff with regard to latency and comfort is being built so that not just a million people can experience it, but tens of millions. They don’t just want to experience it with games, either, but with 360 video, attendance at live events, and other things that make this publishing job we’re doing so much more exciting. It’s not just video games. It’s content that people will be able to consume on a great platform.
Iribe: We’re still learning what those experiences are. VR Cinema, maybe eight months ago we got to see it. We weren’t thinking about any of that. We were thinking first-person games. Now it turns out that first-person games are often too intense for many people, at least if they have a lot of fast movement. It’s these other experiences – things like VR Cinema or 360 video – that turn out to be incredibly immersive and very complementary for VR. It’s far beyond just gaming. And we’re also learning what in gaming works really well.
It’s awesome to have this team. Aaron has been on the front line on the developer side, going in and helping them get questions answered and understand best practices.
DeMartini: They’ve been moving very quickly. It’s not particularly difficult to interact with the SDK and have your game at least up and running. Within a week or two, people are able to see their stuff on the screen. Not to be too esoteric, but think of the potential of virtual reality for people with disabilities, people who can’t get up out of a chair. They can go into these worlds in a comfortable way and experience things. Kids in a classroom or people with handicaps can experience things that they would never experience any other way. The possibilities are so exciting.
GamesBeat: I saw something yesterday that was pretty interesting: a little German startup called Panono that has 36 cameras in a ball. It takes pictures of everything around it. The app uploads those pictures and stitches them all together perfectly.
DeMartini: Yeah. You take that output and put it together with the right kind of positional audio, so that if it was a picture of a beach, you hear the waves in front of you, the ambient sounds behind you. Then you take that experience and figure out how you could make it into a shared experience among multiple people. You’re transporting yourself and your friends to the beach in Hawaii when you’re all sitting somewhere in El Segundo.
Iribe: Whether it’s a still photo or live video, we’ve been chatting about it. One of the first things David said when he got in was, “This is going to be so big for 360 video.” We were like, “Yeah, but games!” And he’s like, “But 360 video!”
DeMartini: You can’t forget where your core is. The core is games, and the gaming folks have been fantastic with their stuff. But the reach is so much broader. I’m sure that’s what the investors were really excited about. It’s so easy to see the possibilities.
GamesBeat: They had these drones that could carry them, too.
Iribe: Absolutely. Then you get the video aspect of it –
DeMartini: Or imagine that camera on the head of a surfer. Someone who could never surf could experience that. Or at a concert. I’m never going up Mount Everest, but someone could do that climb, and you’d get to experience it with them.
Iribe: Especially as the 360 camera technology gets to higher resolution. It’s going to become closer to a photo-real experience. Your brain can be tricked into thinking that you’re really on Mount Everest or wherever. It can help with a number of things beyond just gaming. Virtual travel. A lot of people never get to leave the U.S. In the future, with VR, hopefully the world will be a much smaller place.
GamesBeat: Sony’s now promoting VR in a way. I don’t know what your reaction was to their announcement.
Iribe: They announced an update to their HMZ movie viewer line. It’s the same viewer they’ve had out now for several years. It’s a headset with a very small field of view. In the past it didn’t have head tracking. It was marketed and used for watching movies. They’re starting to add some tracking to it. I can’t speak to them, but it’s a different category of product from Oculus. It’s considerably more expensive, last I heard – somewhere in the $1,500 or $2,000 range. It’s made by the TV group. It’s not part of the PlayStation group.
It’s just a different category. Things like Glyph and some of the other movie viewers are in that category. We’re in the VR category. It’s just like when people say, “What about Google Glass?” They’re very different things. Google Glass is a projected reality, a little notification system. It can be a great experience, but it’s different from virtual reality, changing the world that you’re seeing.
Right now, Oculus is the only company leading the charge on VR. I’m sure that will change in the future. Competition helps everyone get smarter and make better products. But today, it’s just us.
GamesBeat: The Avegant Glyph, I saw, had about 40 or 45 degrees to the field of view.
Iribe: Yeah. Your brain still feels like you’re looking through a window. When you go to a 100-degree field of view, you start to feel like you’re actually in the experience. They’re built for different purposes.
GamesBeat: I wonder about some of the new curved-screen TVs we’ve seen here, too. The argument now is that if you sit 4 feet away, 5 feet away, you still won’t see pixels, so they can curve it around you.
Iribe: They’re trying to say that it’s a wider field of view. I don’t know. We’re excited about what TV and cinema are going to be like in VR, because now we can have a much wider screen. Palmer’s gone on record saying that VR can provide the least expensive way to get to the highest-quality large display experience. We’re not there yet, but in the future, imagine putting on a VR headset and getting an extremely large TV – an IMAX, if you will – in your home or wherever you are. It’ll be both large and very portable, which will be a lot of fun.
GamesBeat: What’s in the demo you have here today?
Iribe: We have two demos. One highlights positional tracking, mostly. The other starts to highlight the low persistence. They’re fully featured in both, but in one, you’re more inclined to lean around and look around.
If you remember, back at E3, we showed the Elemental demo. Epic has now taken the Elemental demo and made a tower defense game inside it. They call it Strategy VR. You can sit down and see another player over on the other side of this virtual castle, where little gnomes walk around and your towers shoot at them.
The idea is that, in the future, the evil lord across from you could be your friend’s avatar, you could be some other avatar, and you could play each other in any kind of game. When you do this, you want to be able to lean in and look closely at the characters. You can imagine something like Jenga, where you need to lean around and decide where you’re going to pick the next piece. That requires positional tracking.
Epic stepped up. I can’t say enough about those guys. They’ve been awesome to work with, much like Valve. We’ve been working with Valve from the beginning with VR. We’ve been working with Unity. The Epic guys have really stepped up each time. They did it at E3, and here, they did it again. They made this demo themselves, working closely with us, for CES, just to show off Oculus at the show. They got it done over the holiday break, which is a lot to — we wouldn’t want to ask that of a partner. It was awesome that they volunteered. A couple of the guys worked through Christmas and New Year’s to get it done. It was a little hot at the end, but it came together, and it’s an amazing experience.
The second demo is EVE Valkyrie from Crowd Control Productions. You saw that at E3. You may have seen it at Gamescom as well. This is the same build from Gamescom, but we’ve added positional tracking and low persistence. They’re hard at work on a future version that will have all kinds of great new gameplay features. It’ll be a full-fledged game. But you can see the difference between the older version, without positional tracking and low persistence, and the new one. We can toggle on low persistence, and it makes the experience so much better.
Of course, we’re bullish on this. It’s not as if VR is going to be the only category of gaming in the future. It’s not necessarily going to disrupt other categories. It’s going to be another category. You’ll still have PC gaming and mobile gaming and console gaming, but now you’ll have VR gaming. It’s awesome to see some of these big triple-A developers – in addition to all the incredibly smart indies – getting in on the ground floor of VR. CCP and Epic are in, and Valve has been doing a lot of R&D on it. They’re going to be at the forefront and they’ll be able to take advantage of that, having some of the content there at the very beginning. Launch content can have a fairly high attach rate if they make a great experience. David’s been working with those developers to make sure that they’re fully in line with the consumer launch.
DeMartini: What they’re most excited about are the new design possibilities of a full immersion environment. People have been kind of designing games the same way – with higher levels of fidelity, obviously – for a long time. Move forward, play through, move forward. With the virtual immersion environment, you can be far more subtle. Positional sound, in conjunction with the ability to see 360 degrees, gives them completely new possibilities for how they design. That’s what catches their attention.
UE4 is a great example of that right now. You’re in this crazy medieval fantasy world, sitting on a throne, looking at a table that has the strategy game on it. The thing that excites me from the content side is the layers it adds, the realities within the reality.
Iribe: It’s a new canvas, a new level of immersion. You’ll hear guys like [BioShock Infinite creator] Ken Levine and the CCP guys talk about how one of their big goals is always to make a more immersive experience. Virtual reality is the next generation of immersion. You have things like audio cues that cause you to turn. It’s harder to do that in a game where you just look through a monitor.
If you look at where gaming has gone, you had board games in the beginning, and they were a lot of fun. They were social. Everyone enjoyed playing them. Then we got computer games. Computer games started out a little less social – you had two game pads at most – and then they became more and more social, for more and more players. From board games to computer games, you’re going to get this next major step into virtual reality gaming.