


 A day with an Oculus Rift 
One Eyed Hopeful

Joined: Wed Apr 04, 2012 5:21 pm
Posts: 25
I am going to be giving several demos in the next month, and Palmer graciously loaned me one of his test HMDs to go with the other things I have to show. Here are my impressions after a day of working with it:

When I first powered it up, it looked like the screen was badly offset, but this turned out to be a problem with the analog VGA input that Palmer had also seen before. Making a custom display mode with different horizontal timing parameters got it fixed up. The plan is for the kits to have a panel with a digital interface, which will properly resolve the issue.

The USB cable for power was also finicky – it wouldn’t work on a USB hub or over an extension cord, only plugged directly into my computer. When I put it on a bench power supply I found that I had to give it 5.2v to get it to come up; it apparently was voltage limited rather than current limited.

There still seems to be a tiny offset in either the optics or the nose cutout, because I can sometimes just glimpse the edge of the right eye view in the left eye. Interestingly, this happens when you are looking to the left with your eye, which moves it a few millimeters to the left, allowing it to look farther over to the right in peripheral vision. This is particularly distracting when the left and right sides of the view are at very different brightness levels. I experimented with different amounts of physical blanking on the lens and leaving a gap in the rendered image, but making the flash of view completely go away required giving up too much resolution. The right solution to this is to have a thin physical divider mounted directly on the display to prevent eye view crosstalk.

I measured the horizontal field of view as a bit under 90 degrees per eye (full binocular overlap), but when you first look through the lenses you clearly feel the edge of the screen on the sides. The vertical field of view is plenty, and you really have to push into the lenses to catch a glimpse of the screen edge. With only 640 pixels horizontally versus 800 vertically per eye and symmetric optics, the vertical FOV is 33% greater than the horizontal, and all of the loss is on the outside edge. I wound up covering the outside parts of the lenses with tape to block off the edges before the optics, which maintains immersion much better than seeing the edge of the screen out at the optical focal plane. This arrangement makes the best use of the limited panel resolution, but it might be better to ignore 160 scan lines and only use 1280 x 640 with a completely symmetric field of view, if that is achievable with available lenses at the same eye spacing.

https://twitter.com/#!/ID_AA_Carmack/st ... 62/photo/1

There is a subtlety here – the perspective center of projection is NOT in the middle of each 640x800 eye view, but rather pushed more towards the outside edge, making it horizontally asymmetric. If you support rendering this properly, you can get a degree of software interocular distance adjustment for free. It would be best to always have the eyes centered on the lenses, but there is still some benefit to be had by user adjustment of the center of projection. It would be possible to enhance the physical design with adjustable optics even with a single fixed panel, as long as the software can adjust the projection. A really slick design would have position sensors on the optics adjustment and automatically communicate the position back to the software. In any case, the existing fixed lens positions worked fine for me.
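In matrix terms, the asymmetric center of projection is just a standard perspective matrix with unequal left/right frustum extents. A minimal sketch (NumPy; the shift value here is a made-up illustration, not a calibrated number):

```python
import numpy as np

def offset_projection(fov_y_deg, aspect, near, far, center_shift):
    """Perspective matrix with the center of projection shifted horizontally.

    center_shift is in NDC units, equivalent to glFrustum with l != -r;
    a real value would come from per-eye lens calibration (hypothetical here).
    """
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[0, 2] = center_shift                # the asymmetry term
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

# A point straight ahead no longer lands in the middle of the viewport:
p = offset_projection(90.0, 640.0 / 800.0, 0.1, 100.0, 0.15) @ np.array([0.0, 0.0, -1.0, 1.0])
ndc_x = p[0] / p[3]
```

With center_shift = 0 this reduces to the usual symmetric projection, so supporting it costs nothing in the renderer.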

For head tracking, I am using FSRK-USB-2 inertial modules from Hillcrest Labs (http://hillcrestlabs.com/products/refkits.php). They directly connect to a micro USB for comm and power, and the libfreespace code to talk with them is all open source. They made a special firmware version for me that updates at 250 Hz instead of the default 125, and this option will be rolling into their standard products soon. At $99 for single units, the modules aren’t the cheapest MEMS around, but they have been working great for me. I do my own body orientation tracking directly from the rate gyros; their default algorithm adds an awful lot of latency.
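Doing your own orientation tracking from the rate gyros boils down to integrating each gyro sample into a quaternion. A minimal sketch of that loop (generic quaternion math; the 250 Hz step just matches the firmware rate mentioned above):

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions stored as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def integrate_gyro(q, omega, dt):
    """Apply one rate-gyro sample omega (rad/s, body frame) over dt seconds."""
    rate = np.linalg.norm(omega)
    angle = rate * dt
    if angle < 1e-12:
        return q
    axis = omega / rate
    half = 0.5 * angle
    dq = np.concatenate(([np.cos(half)], np.sin(half) * axis))
    q = quat_mul(q, dq)            # body-frame increment
    return q / np.linalg.norm(q)   # renormalize so error stays numerical only
```

At 250 Hz the per-step angles are small enough that this simple first-order update holds up well; the latency win comes from skipping the vendor's filtering entirely.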

Without head tracking, you don’t appreciate quite how much the optics warp the image, but in a good low-latency loop it really stands out. Software pre-warping the image worked out very well, and made a huge difference.
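One common way to express such a pre-warp is a radial polynomial applied to lens-centered coordinates. The sketch below shows the math only, with made-up coefficients rather than measured lens values; in practice this runs per-fragment on the GPU:

```python
def prewarp(u, v, k1=0.22, k2=0.24):
    """Scale a lens-centered coordinate (u, v in roughly [-1, 1]) radially.

    Used as the texture-lookup mapping in a fullscreen pass, this
    barrel-distorts the rendered image so the pincushion distortion of
    the optics cancels it out. k1/k2 are illustrative, not calibrated.
    """
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale
```

The distortion is zero at the lens center and grows with the square (and fourth power) of the radius, which is why the edges of the view need the most correction.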

https://twitter.com/#!/ID_AA_Carmack/st ... 52/photo/1

The final warping needed to be more aggressive than that picture, and I could spend a bit more time precisely calibrating the curve, but I am very happy with the results I got. Adjusting for chromatic aberration can be done by just running the warp with slightly different parameters for each channel, but I haven’t found it objectionable enough to bother yet. During development, it was extremely useful to have a second monitor mirroring the HMD video input so I could easily look at the view with and without the optics effects.
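The per-channel trick mentioned above can be sketched as the same radial warp run three times with slightly different strengths; the chroma scale factors below are invented placeholders, not lens measurements:

```python
def warp(u, v, k1):
    """Single-coefficient radial warp (illustrative)."""
    r2 = u * u + v * v
    s = 1.0 + k1 * r2
    return u * s, v * s

def warp_rgb(u, v, k1=0.22, chroma=(0.994, 1.000, 1.012)):
    """Per-channel sample coordinates for R, G, B; green is the reference.

    The chroma factors model lateral chromatic aberration (blue refracts
    more than red), so the channels fan out with radius. Placeholder values.
    """
    return [warp(u * c, v * c, k1) for c in chroma]
```

Since it is the same warp pass run with different parameters, the cost of the correction is essentially three texture lookups instead of one.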

The resolution is low for this large of an FOV; you can definitely see the pixels, and most people who have tried it on have commented about it relative to the HMZ-T1 that they had seen before. With such large pixels you want to take extra care to avoid aliasing – I wound up supersampling a fair amount in addition to using 4x MSAA for best results.
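A rough way to quantify the resolution complaint is average angular pixel density; the HMZ-T1 figure below is a ballpark assumption, not a measurement:

```python
def pixels_per_degree(h_pixels, h_fov_deg):
    """Average horizontal pixels per degree. Ignores lens distortion,
    which actually concentrates resolution toward the center of the view."""
    return h_pixels / h_fov_deg

rift = pixels_per_degree(640, 90)      # ~7 px/deg per eye
hmz_t1 = pixels_per_degree(1280, 45)   # assuming ~45 deg horizontal: ~28 px/deg
```

Spreading a quarter of the angular density over twice the field is exactly the trade that makes the pixels so visible, and why aggressive anti-aliasing pays off.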

The display panel has an admirably low latency, but it is very slow switching, taking nearly 20 milliseconds to fully change colors. This makes everything pretty blurry during rapid rotations, and makes the 60 Hz refresh rate obvious in the strobed ghosts. Adding explicit motion blur to get rid of the strobing might be a win, but the real answer is a 120 Hz Super AMOLED panel when they are available in the right size and resolution. Anyone from Samsung listening?

The display also burns in a bit if you leave it looking at something bright, giving a faint ghost for a few minutes after.

Tiny specks of dust are a problem with this level of magnification; rigorous cleaning and full sealing will be important.

After attaching a ski goggle band so I could freely look around and use a joypad with the HMD, a few new issues stood out.

Cable stiffness is a factor. The power cable isn’t much, but the VGA cable and the USB cable for the tracker make the bundle stiff enough to really get in the way when you are looking around. This may get worse with a DVI based panel. Eventually this type of thing is going to be completely self-contained and cut the cords, but finding very flexible cables, or even making a custom all-in-one may be worthwhile in the near term.

The driver board gets warm right on your forehead, but hanging it out behind the display would hurt the mass properties more. Less power dissipation would be nice.

It could use more nose relief. It would be nice to iterate on the design with a 3D printer to find the minimal structure.

With everything dialed in, the immersion level is so good that I can give myself a type of simulator sickness that I’m not used to seeing -- while standing up in the real world and standing on a ledge looking down in the virtual world, swaying side to side makes it feel like the whole world is swaying, because I don’t have positional data integrated with attitude data right now. This shows up subtly all the time, but it is a more forceful effect when everything else is really pushing the immersion level. I need to get back to my Sixense integration work.

Bottom line:

After dialing everything in, this is by far the most immersive HMD of the five I have here. If Palmer comes close to his price target, it will also be the cheapest. I will be including full support for this in the next new PC title we release.

The problem is that most people won’t have a custom codebase that they can freely modify for the HMD. Someone is going to have to write an intercept driver that can capture mono or side-by-side stereo game output and warp it appropriately to make it useful for anyone other than developers. Head tracking should remain a separate concern; I don’t think it makes sense to bundle one that many people won’t be able to use, but leaving some explicit mounting surfaces would be good.

John Carmack


Thu May 17, 2012 3:53 pm
Petrif-Eyed

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2190
Location: Irvine, CA
Excellent early review. Man you get all the perks :)

So are you using a custom in-house engine to render both left and right and perform the warping? Is the warping all being accomplished on the GPU in the regular pipeline, or are you doing post processing on the CPU?

Sounds like just a couple of small issues, but certainly to be expected from a garage kit. It seems like resolution and aspect ratio are the biggest complaints - but we all knew that. It still sounds like a very exciting piece of hardware! Incredible that it beats the immersion level of any of the professional kits that you own, and it's awesome that you plan on adding id game support for this!!


Thu May 17, 2012 4:26 pm
Golden Eyed Wiseman! (or woman!)

Joined: Fri Jul 08, 2011 11:47 pm
Posts: 1445
Great review, thanks John!


Thu May 17, 2012 5:28 pm
3D Angel Eyes (Moderator)

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 10875
Sounds like there are some kinks, but overall a very promising review. Thanks JohnCarmack!

Most of the complaints are things I noticed on the older prototype I tried, but even with those shortcomings the immersion is something to see.

Also great to hear you will be adding support in a future title. I wish more game developers would take an interest in this sort of stuff.

_________________
check my blog - cybereality.com


Thu May 17, 2012 6:09 pm
Petrif-Eyed

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2190
Location: Irvine, CA
Quote:
Without head tracking, you don’t appreciate quite how much the optics warp the image, but in a good low-latency loop it really stands out. Software pre-warping the image worked out very well, and made a huge difference.

https://twitter.com/#!/ID_AA_Carmack/st ... 52/photo/1


John or Palmer: Once the device is released, it would be great if you could post a few more images (or even a video) rendered at the native resolution and including the pre-warping effect to demonstrate the potential for this device. It would also serve as a nice reference pattern for would-be driver writers.


Thu May 17, 2012 10:45 pm
Binocular Vision CONFIRMED!

Joined: Fri Jan 27, 2012 11:24 am
Posts: 228
John,

Thanks for such a detailed review. Can I make a slightly cheeky suggestion that Palmer might have felt was too presumptuous to ask...

Could you give him one of the demos (or, say, a build of Doom 3? :) ) that he can include in the Kickstarter? I really think it would make the package so much better to have something good that users can get working in the first hour, with all the pre-distortion etc.

Valve did a similar thing with the Razer Hydra and I thought it made the thing look a lot more professional, to have a big recognisable game (Portal) bundled with it. It will probably take the community a few weeks to get games up and tweaked after people start getting the Rifts; I think it would make for a much better launch.

Thanks again.


Fri May 18, 2012 1:13 am
Golden Eyed Wiseman! (or woman!)

Joined: Fri Aug 21, 2009 9:06 pm
Posts: 1644
Thanks for the very detailed feedback, John! You hit pretty much every high point and low point it has. ;) I would like to list some of the things you noticed that can be fixed, and some that cannot:

Fixable

1) You mentioned this already, but the offset problem is indeed fixed by the new control boards. I only have a single sample unit at the moment, had to keep that here. The new boards work properly with both DVI and VGA.

2) Thanks for measuring out the voltage problem, I had never messed with it like I should have since it worked with 99% of the hardware I tried. I could use a step-up regulator if the problem persists, or hope that my new control boards run off straight 5v.

3) Yes, it needs a divider to fix the crosstalk problem. Stupid that I forgot to put one in yours!

4) One thing that works about as well as a barrier in front of the lenses is a barrier built behind the lenses. Looks cleaner, and still stays way out of any sort of visible focal length. I have higher magnification lens sets that can avoid the edges, but they weigh a lot more.

5) I want to do adjustable lenses. It is hard to make something that is cheap and easy to fabricate in low quantities but still has tight enough tolerances to keep the lenses aligned, and an adjustable mount also ends up being heavier, so I might end up going with a fixed lens mount (though the lens plate will be removable, so people can mod and replace it however they want).

6) Warping really does help a lot. Very interested in whether you end up compensating for chromatic aberration, as I have not seen that done in anything post-1990s.

7) The resolution can be partially fixed by using a diffusion filter to blur the edges between the pixels, it works very well. Problem is, it hurts contrast, brightness, and black levels pretty heavily. Still experimenting with different materials, people will be able to choose if they put the filter in.

8) Cleaning and sealing are definitely important. One of the best ways to do this at home is to run a hot shower and fill your bathroom with steam, wait a few minutes, then assemble it in that environment. The moisture will attach to and pull down all the dust in the air, so you get very clean results.

9) There are some very thin HDMI cables that could be used with a DVI adapter, and I am looking into custom cables that will not break the bank. The final version will have the cables running straight back over your head, which should be a lot less strain than having them off the side.

10) The new boards are a lot smaller, so putting it out on the front or even the side would be possible. More likely, though, I am going to see if it can be built on top of the forehead pad so that the cables can go over the head attached to a strap.

11) Definitely needs more nose relief. My latest prototype has a very minimal vac-formed case that leaves plenty of room for even the largest nose, and puts the lenses sticking way out past the main body. The unit I sent to you was thrown together pretty fast to make sure I shipped it to you in time, so the physical shape is not representative of the final one in any way.

Unfixable

1) The resolution can be masked a bit with filters, but it really does need to be higher. Toshiba has that great 6.1" 2560x1600 panel; it would be perfect for this if they actually mass produce it!

2) It is slow switching, no way around that. This panel is really not up to the quality of modern LCD panels, it was designed years ago, but it is the only panel with suitable size and resolution on the market. :( The bright image ghosting effect seems to be largely fixed with my new control board, perhaps it is driving the panel more aggressively?


All in all, certainly not flawless gear, but it is going to be a good stepping stone toward things to come. I am confident I can keep that sub-$500 price point, too! :D Something John did not mention is that this unit is very light, well under 1lb. Beats the HMZ-T1, even with all the control hardware on board! :P

Thanks again! Going to crank out some more refined prototypes to send off to a few more people for testing and feedback.


Fri May 18, 2012 2:43 am
Binocular Vision CONFIRMED!

Joined: Wed Sep 30, 2009 8:29 pm
Posts: 236
Going to sound like a pervert here - however I would die to stick my face into one of those!!!

If you are looking for a non-technical everyday average user to test it out - I am your man lol!!!

Thanks again for all of your dedication!


Fri May 18, 2012 7:30 am
Petrif-Eyed

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2190
Location: Irvine, CA
PalmerTech wrote:
The resolution can be masked a bit with filters, but it really does need to be higher. Toshiba has that great 6.1" 2560x1600 panel, would be perfect for this if they actually mass produce them!


If a better panel does become available at some point, how difficult do you think it will be to replace the current panel? Would it require a total redesign of the optics and packaging or do you think it could "more-or-less" be a drop-in that any of us DIY'ers could accomplish?


Fri May 18, 2012 7:45 am
One Eyed Hopeful

Joined: Wed Apr 04, 2012 5:21 pm
Posts: 25
What are the actual LCD panel dimensions? The resolution doesn't bother me nearly as much as the slow pixel response time. A 2560x1600 panel might cross over the line where the framerate suffers more than the resolution buys you, but a 1920x1080 panel would almost certainly be a win for most people.

I hope that I can provide some software to go with the kits, I'll be able to talk a bit more freely in a couple weeks. Having something that really shows it off out-of-the-box is important for how it will be perceived.

John Carmack


Fri May 18, 2012 8:43 am
Cross Eyed!

Joined: Wed Feb 10, 2010 11:10 pm
Posts: 160
John- Great review! Very detailed! Hope you can get the software issue worked out, it would be awesome to use this thing out of the box with a game. It's great to see someone of your fame and reputation in here helping to get a project going. If you get this working with some id games maybe others would follow suit!

This project looks more promising every day! If we can get a panel with higher resolution, this will definitely be the best option in the HMD market!

_________________
www.abcliveit.com Change your life! PM for details


Last edited by fireslayer26 on Fri May 18, 2012 9:54 am, edited 1 time in total.



Fri May 18, 2012 8:45 am
Golden Eyed Wiseman! (or woman!)

Joined: Mon Jun 22, 2009 8:36 am
Posts: 1598
Location: Stockholm, Sweden
Great review John, having you "onboard" will surely help this Kickstarter!

_________________
FreePIE
My blog


Fri May 18, 2012 9:06 am
Golden Eyed Wiseman! (or woman!)

Joined: Fri Aug 21, 2009 9:06 pm
Posts: 1644
brantlew wrote:
If a better panel does come available at some point, how difficult do you think it will be to replace the current panel? Would it require a total redesign of the optics and packaging or do you think it could "more-or-less" be a drop-in that any of us DIY'ers could accomplish?


It would be trivially easy. The most it would need is a new panel mount, which would cost only a few dollars. If a new panel came out, I would probably jump on doing a group buy for people to upgrade.


JohnCarmack wrote:
What are the actual LCD panel dimensions? The resolution doesn't bother me nearly as much as the slow pixel response time. A 2560x1600 panel might cross over the line where the framerate suffers more than the resolution buys you, but a 1920x1080 panel would almost certainly be a win for most people.

I hope that I can provide some software to go with the kits, I'll be able to talk a bit more freely in a couple weeks. Having something that really shows it off out-of-the-box is important for how it will be perceived.


The rough measurements of the physical panel: 132x88x5mm
The rough measurements of the active screen area: 121x76mm
It is a 5.6" diagonal panel, so a slightly larger panel (Around 6 inches) would get rid of the edges completely.

It would be great to have a software demo that could go with this, can't wait to hear more. I think there will be at least two versions of the Rift, one with head tracking, and one without. If NRP gets that open source tracker together in the $20-30 price range he estimates, that is something a lot of people would like. If people already have a tracking system, then they can get the slightly cheaper unit and mount whatever they want.


Fri May 18, 2012 1:51 pm
Cross Eyed!

Joined: Wed Feb 10, 2010 11:10 pm
Posts: 160
Palmer- Is Toshiba not actually selling that panel yet?

_________________
www.abcliveit.com Change your life! PM for details


Fri May 18, 2012 2:08 pm
Golden Eyed Wiseman! (or woman!)

Joined: Fri Aug 21, 2009 9:06 pm
Posts: 1644
Nope, not even engineering samples. It was just a prototype to show off what they can do.

The new Samsung Exynos 5250 chipset coming out gives me hope, though. It is the first SOC to fully support WQXGA resolution, and you can bet that there will be tablets and phones coming out to take advantage of that.


Fri May 18, 2012 2:30 pm
Golden Eyed Wiseman! (or woman!)

Joined: Mon Jun 22, 2009 8:36 am
Posts: 1598
Location: Stockholm, Sweden
PalmerTech wrote:
Nope, not even engineering samples. It was just a prototype to show off what they can do.

The new Samsung Exynos 5250 chipset coming out gives me hope, though. It is the first SOC to fully support WQXGA resolution, and you can bet that there will be tablets and phones coming out to take advantage of that.


If the Kickstarter hits a certain point, plus a major company like id showing interest, I'm sure companies like LG or Samsung will look your way.

_________________
FreePIE
My blog


Fri May 18, 2012 3:45 pm
Sharp Eyed Eagle!

Joined: Sat Apr 12, 2008 9:45 pm
Posts: 376
I agree that a software bundle with the HMD that can show the strength of gaming using independent head tracking would be a great selling point. Also, a 1920x1080 panel would be a better match, as I could use mostly off-the-shelf equipment for wireless HDMI gaming.

With Samsung now starting to push 1080p panels for phones and tablets, maybe it is worth checking whether they sell any demo kits.


Fri May 18, 2012 6:01 pm
Certif-Eyed!

Joined: Tue Jan 19, 2010 6:38 pm
Posts: 529
Wow. This is almost perfect.

I just hope they are still available when I have the cash.

(To give an idea of the recognition, even my girlfriend knows Quake)

_________________
"If you have a diabolical mind, the first thing that probably came to mind is that it will make an excellent trap: how do you get off a functional omni-directional treadmill?"


Fri May 18, 2012 7:06 pm
Diamond Eyed Freakazoid!

Joined: Sun Oct 24, 2010 7:25 pm
Posts: 718
If JohnCarmack was to make a demo that runs on this unit, OH MY... Match made in heaven.
I think a one-level JohnCarmack demo run on this unit would have you playing that one level over and over and over... etc.
People would buy the RIFT just to play ID games, if they were the only company smart enough to write code for it.
The unit already has a sort of bad-ass Doom look to it... LOL


Thu May 24, 2012 5:55 pm
Cross Eyed!

Joined: Thu May 10, 2012 4:42 pm
Posts: 140
3dvison wrote:
If JohnCarmack was to make a demo that runs on this unit [...]
People would buy the RIFT, just to play ID games[...]


I know I would.

_________________
You can also Greenlight other Rift games.


Mon May 28, 2012 5:30 pm
One Eyed Hopeful

Joined: Sun May 15, 2011 3:32 am
Posts: 2
Hi, I have just one question: instead of asking "please Samsung", why don't you just get a Galaxy S3 and stick that into the glasses? Super AMOLED 4.8” 1280x720 screen, powerful 1.4 GHz quad core CPU with 3D acceleration, 1 gig of RAM; it should be enough to get a quite decent game running on it, controlled with a Bluetooth joystick like the PS3 one (that has accel sensors too), and it should have plenty of power to handle wireless receiving and decompression of live video from a PC if needed... It's also self contained, has its own battery, audio out, gyro and accelerometers, and even a very nice high-fps, high-res camera on the back that you can use for head tracking. Maybe you can even ditch Android if it eats too many resources and run directly on Linux...
Yes, it costs a bit too much, but as soon as the price drops a bit it should be a very good option.
What do you think about it?
My best regards to you guys, you're making my dreams come true, and a special thank to John for just being awesome, I'm a huge fan of your work, you made history!!
Can't wait to get my pair of VR glasses. :D


Fri Jun 08, 2012 12:29 pm
One Eyed Hopeful

Joined: Thu Jun 28, 2007 9:01 pm
Posts: 27
gipmad wrote:
should have plenty of power to handle wireless receiving and decompression of live video from a PC if you need so...

I would imagine latency to be an issue with that. Weight is also a consideration.


Fri Jun 08, 2012 12:54 pm
Petrif-Eyed

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2190
Location: Irvine, CA
Some discussion of using phone screens for HMD.

http://www.mtbs3d.com/phpBB/viewtopic.php?f=26&t=14365&hilit=phone


Fri Jun 08, 2012 1:03 pm
One Eyed Hopeful

Joined: Sun May 15, 2011 3:32 am
Posts: 2
brantlew wrote:
Some discussion of using phone screens for HMD.

http://www.mtbs3d.com/phpBB/viewtopic.php?f=26&t=14365&hilit=phone


That discussion reaches the same conclusion I did, now that the Galaxy S3 is out! It looks like the S3 has 802.11n WiFi, not the 5 GHz kind, so it may have a big lag indeed, but they claim that they can transfer a 1 GB file in 3 minutes via the NFC connection, so that may help. Doing all the processing on the phone is not a bad thing anyway, with such power available.


Fri Jun 08, 2012 1:44 pm
Certif-Eyed!

Joined: Tue Jan 19, 2010 6:38 pm
Posts: 529
Another possibility

http://arstechnica.com/gadgets/2012/06/ ... ckstarter/

Maybe not this specific device, but I'm sure others with more graphics power will be created.

_________________
"If you have a diabolical mind, the first thing that probably came to mind is that it will make an excellent trap: how do you get off a functional omni-directional treadmill?"


Fri Jun 08, 2012 3:08 pm
Golden Eyed Wiseman! (or woman!)

Joined: Fri Aug 21, 2009 9:06 pm
Posts: 1644
gipmad wrote:
Hi, I have just one question, instead of asking "please samsung" why don't you just get a galaxy S3 and stick that into the glasses? [...]


The screen is a bit too small, it uses a PenTile arrangement for the sub-pixels that looks awful under magnification, and even the fastest streaming out there has about 50 ms of lag, and that is with pretty heavy compression that makes the game look terrible.

Not a very good option right now, but in a few years, I think most HMDs will be rendering their games with onboard ARM based SOCs anyways. :)


Fri Jun 08, 2012 4:02 pm
Cross Eyed!

Joined: Fri May 18, 2012 5:31 pm
Posts: 102
Location: Houston, TX
Quote:
Not a very good option right now, but in a few years, I think most HMDs will be rendering their games with onboard ARM based SOCs anyways. :)

I'm certainly hoping to try hooking some of our Android devices with HDMI-out up to the Rift when I get my hands on one. Given the somewhat flaky support for HDMI-out on those devices, I wouldn't be surprised if there's some increased latency on top of the already floaty timing of eglSwapBuffers() on Android. I'll be interested to see what shakes loose, at the very least.

I'm going to try to snag a Hillcrest tracker/Doom 3 bundle, so I'll probably see what can be done about working with libfreespace on Android, too. That will be completely new territory for me, so hey, adventure! ;)

Ideally I'd like to work up something that's useful for other people, too. Depending on the form that John's example code takes (and if he's given permission to release it), the most productive thing to do might be to ultimately release an Android port of that.


Fri Jun 08, 2012 5:15 pm
Binocular Vision CONFIRMED!

Joined: Thu Jun 07, 2012 8:40 am
Posts: 226
Location: New York
BillRoeske wrote:
Quote:
Not a very good option right now, but in a few years, I think most HMDs will be rendering their games with onboard ARM based SOCs anyways. :)

I'm certainly hoping to try hook some of our Android devices with HDMI-out to the Rift when I get my hands on one. Given the somewhat flaky support of HDMI-out on those devices, I wouldn't be surprised if there's some increased latency above the already floaty timing of eglSwapBuffers() Android. I'll be interested to see what shakes loose, at the very least.

I'm going to try to snag a Hillcrest tracker/Doom 3 bundle, so I'll probably see what can be done about working with libfreespace on Android, too. That will be completely new territory for me, so hey, adventure! ;)


I agree with both quotes; however, I'd rather not have HMDs rendering with their own SoCs, but rather use our existing devices, iOS, Android (WP8 anyone?), and work with those guys to reduce latency, lag, etc. in their pipelines. They all recognize the value of games, and there are enough high profile engines out there that it shouldn't be a problem to get them to make some changes to accommodate the lower latency, which it looks like John has made into a personal quest anyways :) Here's to this happening sooner rather than later!

_________________
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex


Sat Jun 09, 2012 1:32 pm
One Eyed Hopeful

Joined: Fri Jul 27, 2012 10:44 am
Posts: 3
</delurk>
Apologies in advance for poking my nose in... I'm a long time forum lurker.

I'd like to throw some potentially important considerations out there in regards to future Rift development; in my reading of the various threads I've not seen this concept brought up (again, apologies if it has).

If we start with a few basic assumptions this might help frame some concerns.

1. John C. is spot on that update latency and frequency are critical to immersion in a VR device.
2. Update frequency/latency is possibly even more important in an AR (vs VR) system, where any registration problems for the CG overlaid with the real time video will immediately be immersion destroying (I only bring this up because with any luck, development of the Rift could become a driving force towards VR device interface standards).
3. A VR/AR system should be able to handle sudden changes in behaviour (i.e. the player snap-turns 180 degrees); this introduces the need for a system similar to texture streaming where a low fidelity representation is always available to cover for us.
4. Simulation/Rendering update frequency is not necessarily the same thing as display update frequency. Where 60 Hz is a pretty reasonable simulation update, I'm sure we'd all love to see 240 Hz motion tracking/camera frustum updates.
5. It's potentially unreasonable to expect game developers to put out games that complete their simulation/rendering in a sub-4 ms time frame (or 8 ms for a 120 Hz update), as you're leaving a lot of other potentially cool features on the table in order to service the display device fast enough (physics, better AI, etc).
6. It's unreasonable to expect software developers to account for per-display variations (i.e. distortion mappings) within their software (i.e. currently the unique needs of a display are handled by the display itself, and applications are expected to interface with displays through a set of standards (resolution and physical connection)).

What this ultimately leads to is the notion that the game/software and its requisite hardware really should be abstracted from the needs of the display.

In the long run, we might be better served by taking the approach that any given app should simply provide the same information to a VR display that it currently does, at an expected update rate of 30 (boo) to 60 (yay!) Hz, with a standardized resolution (with one caveat regarding a potential future standard).

This would imply a two stage system whereby the application runs as usual on its own hardware, and the submitted frame is then passed to a second, specialized hardware system whose job is to update the display orientation and frame presentation taking into account a visual "guardband" around the expected view direction.
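
A minimal sketch of that second stage, assuming the app submits a frame with extra pixels (the "guardband") on every side, and the daughter system slides the visible window by whatever head motion accrued since submission. All of the constants (guard width, pixels per degree) are made-up illustrative values:

```python
import numpy as np

# Hypothetical numbers for illustration only, not from any real device spec.
DISPLAY_W, DISPLAY_H = 640, 800      # visible window per eye
GUARD = 80                           # guardband pixels on each side
PIXELS_PER_DEGREE = 8.0              # assumed angular resolution of the render

def reproject(guardbanded_frame, yaw_delta_deg, pitch_delta_deg):
    """Second-stage 'late' view update: slide the visible window inside the
    oversized frame by the head motion since the app submitted it."""
    dx = int(round(yaw_delta_deg * PIXELS_PER_DEGREE))
    dy = int(round(pitch_delta_deg * PIXELS_PER_DEGREE))
    # Clamp so the window never slides off the guardbanded frame.
    dx = max(-GUARD, min(GUARD, dx))
    dy = max(-GUARD, min(GUARD, dy))
    x0, y0 = GUARD + dx, GUARD + dy
    return guardbanded_frame[y0:y0 + DISPLAY_H, x0:x0 + DISPLAY_W]

frame = np.zeros((DISPLAY_H + 2 * GUARD, DISPLAY_W + 2 * GUARD), dtype=np.uint8)
view = reproject(frame, yaw_delta_deg=3.0, pitch_delta_deg=-1.0)
print(view.shape)  # (800, 640)
```

The app keeps its own 30-60 Hz pace; only this cheap window-slide has to run at display rate.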

To take this one step farther, such a system would ideally be capable of showing a low-fidelity 360-degree placeholder of the current scene (possibly as a cube map), which would be cross-faded with the high-fidelity main view dependent on the player's movement.
By "feathering" the edge of the primary display into the low-fidelity surrounding environment, we create a soft transition that removes any hard-edge artifacts between the primary and secondary draw layers.
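
A sketch of the feathering idea, assuming both layers have already been projected into the same screen space; the feather width is an arbitrary illustrative choice:

```python
import numpy as np

def feathered_blend(primary, background, feather=32):
    """Composite a high-fidelity primary view over a low-fidelity backdrop,
    ramping alpha from 1 in the centre to 0 at the primary view's edges so
    there is no hard seam between the two draw layers. (Sketch only.)"""
    h, w = primary.shape[:2]
    # Distance-to-nearest-edge ramps, normalized by the feather width.
    ramp_x = np.minimum(np.arange(w), np.arange(w)[::-1]) / feather
    ramp_y = np.minimum(np.arange(h), np.arange(h)[::-1]) / feather
    alpha = np.clip(np.minimum.outer(ramp_y, ramp_x), 0.0, 1.0)
    return alpha * primary + (1.0 - alpha) * background
```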

As a bonus, by running this daughter system, we gain a place to handle any VR/device specific needs, like applying hardware specific display distortion maps and computing motion blur all in one pass.

In this design, the app provides the current simulation frame to the daughter system in its own time frame, and the VR's secondary hardware simply flips the simulation image whenever it's ready. A daughter system of the required horsepower/size likely exists in the form of contemporary SOCs used in modern smart phones, and as such could be made small enough and low power enough to reside in the hardware itself.

In regards to the LOD environment.

This is the sticky part that seems to necessitate some custom development by the software developer (hopefully someone has a better idea). If a standard were adopted by which a portion of the HDMI signal could contain an encoded cube map, the daughter system could decode it, and software developers could implement a low-frequency update to this secondary VR environment (i.e. at 1/6 the current rendering speed) by locking a viewpoint and rendering a single cardinal view per frame, then encoding the resulting cube faces into the frame (possibly stacked vertically along one side of the render frame and decoded by the daughter hardware?).
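
The one-cardinal-face-per-frame refresh could be sketched as a simple round-robin scheduler; the class and method names here are hypothetical:

```python
# Sketch of the proposed 1/6-rate cube-map refresh: lock a viewpoint, then
# render exactly one cardinal face per application frame, round-robin.
CUBE_FACES = ["+X", "-X", "+Y", "-Y", "+Z", "-Z"]

class CubeMapStreamer:
    def __init__(self):
        self.next_face = 0
        self.faces = {}          # face name -> last rendered image (placeholder)

    def on_frame(self, render_face):
        """Call once per app frame; render_face(name) is the app's renderer."""
        name = CUBE_FACES[self.next_face]
        self.faces[name] = render_face(name)
        self.next_face = (self.next_face + 1) % len(CUBE_FACES)
        return name
```

After six app frames every face has been refreshed once, so the full placeholder environment updates at 1/6 the app's render rate.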



I don't want to overthink this, nor do I feel this is necessarily an answer to the concerns I've listed, but I was hoping it might be useful to bring these thoughts up in the interest of promoting a discussion that takes many of the Oculus Rift concepts further along the directions they're already headed. I'd also like to point out that the current approach is immediately supportable, and that I submit these thoughts more as future considerations than something that should be dealt with right now. Now is the time to hack =).


I've been following threads on VR devices here on MTBS for a long time, and my fascination with VR goes back 20 years (I owned a Virtuality system at one point... (don't ask) )...
I'd like to chip in my personal appreciation to all of you who are helping to move the ball forward in regards to pushing the technology and standards...

All the best,

TC.

</relurk>


Fri Jul 27, 2012 10:52 am
Profile
Sharp Eyed Eagle!

Joined: Sat Dec 22, 2007 3:38 am
Posts: 425
The LQ 360deg render technique won't work with a freely moving HMD. Turning your head does not just rotate your eyeballs in place individually, but moves both of them around a separate common centre of rotation, which itself is likely to be translating in multiple axes. You would have to pre-render a scene for multiple possible eye-pair positions and interpolate between them to get any benefit over just updating the view direction and sending the engine a 'render fast' flag for very rapid movements.
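
The point about the eyes orbiting a common centre can be made concrete with a little geometry; the pivot distance and IPD below are illustrative guesses, not measurements:

```python
import math

# Illustrative numbers only: ~6.4 cm IPD, eyes ~10 cm forward of the neck pivot.
IPD = 0.064
PIVOT_TO_EYES = 0.10

def eye_positions(yaw_rad):
    """World-space (x, z) eye centres after rotating the head about the neck
    pivot at the origin (forward = +z at yaw 0). Shows that a head turn
    *translates* both eyes, which a panorama pre-rendered from one fixed
    point cannot capture."""
    cx = PIVOT_TO_EYES * math.sin(yaw_rad)   # eye-pair centre moves on a circle
    cz = PIVOT_TO_EYES * math.cos(yaw_rad)
    # Right vector of the rotated head, for offsetting each eye by half the IPD.
    rx, rz = math.cos(yaw_rad), -math.sin(yaw_rad)
    half = IPD / 2.0
    left = (cx - half * rx, cz - half * rz)
    right = (cx + half * rx, cz + half * rz)
    return left, right
```

A 90-degree turn moves the eye-pair centre by the full pivot radius (here 10 cm), not just the half-IPD offset a static panorama could fake.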


Sat Jul 28, 2012 3:59 am
Profile
One Eyed Hopeful

Joined: Fri Jul 27, 2012 10:44 am
Posts: 3
Completely agree about the potential inaccuracy of the LOD view (as well as the associated issue of collapsing the depth), but on the flip side, we're talking about a failsafe that's only there as a placeholder until the primary render can catch up (hopefully less than 1/30 of a second later), and one which is likely being blurred due to fast movement.

I'd argue that by the same token (though admittedly maybe to a lesser extent), texture streaming and adaptive LODs can look "wrong" too, but they can also be better than the alternatives.

I think we're likely to run into issues rendering a "fast mode", given the complexity and variety of engine designs and their inherent threading and internal rendering latencies. E.g. a fast turn that pivots you into a wall still needs to reconcile the viewpoint against the collision in order to prevent you crashing through the world, so now you have to wait on the app and the follow-up render on top of the hardware latency involved.

On top of that, we're now putting the onus on the developer to manage the issues being imposed on them by a new target platform, whereas abstracting as many issues as possible to the device could potentially get more groups on board. It's fantastic that John is all-in with customizing their work, but I'm sure we all agree that in the long run most of these details need to be abstracted away from the developer.

On almost every game I've worked on, we've gone in with the ambition to run at 60 fps (consoles), but at the end of the day, when you're looking longingly at those extra 16 ms, it's hard to give up all the bonus features you could have had. Trying to convince developers to write their games to run in one or two binary orders of magnitude less time in order to support an interesting but initially small market is going to be a hard sell.


Sat Jul 28, 2012 1:08 pm
Profile
Golden Eyed Wiseman! (or woman!)

Joined: Fri Aug 21, 2009 9:06 pm
Posts: 1644
ThomasC wrote:
It's unreasonable to expect software developers to account for per-display variations (i.e. distortion mappings) within their software; currently the unique needs of a display are handled by the display itself, and applications are expected to interface with displays through a set of standards (resolution and physical connection).

What this ultimately leads to is the notion that the game/software and its requisite hardware really should be abstracted from the needs of the display.


I definitely agree. One of the goals for Oculus is to have an SDK that is easy to integrate and, once it is integrated, hardware-flexible. The idea is that the hardware could report its specs when you plug it in (FOV, resolution, tracking, warping parameters, etc.), and the Oculus integration would adjust the software to fit. That way, the hard part of all this would be on Oculus, not on developers. :) You are also right about the performance vs. features issue; we have a few ideas on how to mitigate that as much as possible. One good thing is that we don't have to worry all that much about 180 snaps - people can't turn their heads very fast compared to a mouse! It will take lots of time and brainpower to get everything perfect, but that is exactly what the Rift can be used for: getting things perfect for a consumer version. :)
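
The plug-and-report idea could look something like the sketch below. Every field name and value here is hypothetical - this is not the real Oculus SDK, just an illustration of a self-describing device:

```python
from dataclasses import dataclass

@dataclass
class HMDDescriptor:
    """Specs a hypothetical HMD reports on plug-in; the integration layer
    configures rendering from these instead of hard-coding per-device."""
    h_resolution: int
    v_resolution: int
    h_fov_deg: float
    v_fov_deg: float
    distortion_k: tuple        # radial distortion polynomial coefficients

def per_eye_render_size(desc: HMDDescriptor, supersample: float = 1.0):
    """Pick a per-eye render target size from the reported panel resolution."""
    return (int(desc.h_resolution / 2 * supersample),
            int(desc.v_resolution * supersample))

# Illustrative values loosely based on figures mentioned in this thread.
rift = HMDDescriptor(1280, 800, 90.0, 110.0, (1.0, 0.22, 0.24))
print(per_eye_render_size(rift, 1.5))  # (960, 1200)
```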


Sat Jul 28, 2012 2:26 pm
Profile
One Eyed Hopeful

Joined: Fri Jul 27, 2012 10:44 am
Posts: 3
I'm totally on-board with what you guys are doing, which I hope will ultimately validate VR as a "next big step". Most of my line of thought has come from thinking about how good AR might ultimately work. If we suppose that some day we end up with something along the lines of a synthetic aperture light field camera (maybe some kind of interferometer?) which can generate real time high fidelity depth information, then the next issue will be how to receive that depth input from the camera, merge it with CG, and re-display it fast enough to work.

In this particular case, it's pretty evident that any mis-registration of the graphics over the live view is going to completely destroy the illusion of the two belonging to the same world. Once you start pursuing that line of thought, you start realizing how critical bringing latency as near to zero as possible is going to be. I suspect that in reality we'll need to both reduce latency and either do something similar to current CG motion tracking techniques based off of scene contrast, or "cheat" by capturing the live video and fully reprocessing it (i.e. not actually blending CG into a live shot, but actually encoding the live view and re-imaging it interleaved with the CG).

My purpose in butting into this thread was ultimately aimed farther down the road, once the various features are positioned as the responsibility of either the app or specialized hardware sitting beside the display. In my mind, if a low-powered SoC containing a reasonably powerful GPU could be coupled directly to the motion tracker, then by providing the primary rendered image from the app with a guardband around it, you could conceivably take a lot of the loop out of the equation and end up with blazing-fast tracking by letting specialized hardware drift the view locally within the display (the LOD environment I posited was really just an extension of this line of thought... perhaps getting ahead of myself). You guys are currently focused on VR, not AR, but I can easily see your concerted push creating a set of standards, and I felt it couldn't hurt to float a couple of longer-term concerns.

I do still think it's worth considering that most players are quite happy with a 60 Hz simulation, and that 120/240 Hz will be critical for tracking believability. In my mind this implies decoupling the two by some mechanism similar to what I'm imagining, but I suspect you guys have thought this through already.


Sat Jul 28, 2012 8:41 pm
Profile
Petrif-Eyed
User avatar

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2190
Location: Irvine, CA
A few minutes with an Oculus Rift

Well, I finally got to try the Rift. I was at QuakeCon Thursday and half of Friday. Since Oculus was such a recent addition to the show, there were a few technical and bureaucratic issues that prevented Palmer and crew from demoing the Rift on Thursday. However, Palmer graciously granted me and another MTBS3D member a private demonstration on Thursday night. Now, I didn't get to actually see the Doom 3 demo, but I did see an earlier Carmack test within a small Rage environment. On Friday, the Doom 3 demo was still unavailable, so they were still showing the same Rage demo to the public. So I got to see it twice, but in total only for about 5 minutes. Not nearly enough for a proper and valid review - but I gotta talk about it anyway.

It's hard to be very objective. The first few minutes are really all about just wanting to look around. It's hard to really concentrate on the details: are the edges visible, can I see the pixel structure, what is the resolution like - all the stuff that I wanted to look at closely just got shelved as soon as I stuck my face in the Rift, because the coolness of the whole thing is just so overwhelming. You just can't stop looking around and admiring just how "real" it all feels. At one point I found myself just grasping my hand in midair because it looked so much like there was a cartoon pipe right in front of me. The sense of depth is just amazing - so far beyond 3D (but not in a silly pop-out way). Everything just somehow has a tangible "weight" and depth to it. When you move to the edge of a ledge and look down, you feel the vertigo in your gut. I experienced simulator sickness for the first time - not just some general discomfort, but a strong and instant guttural feeling as I was looking down a hole and swaying back and forth. Oh, and for you guys that are concerned about the resolution... with the strong antialiasing, it didn't bother me at all. Now I can certainly imagine for reading text and HUDs it would be noticeable, but with pure scenery watching it is not a big deal. I forgot to even try to find the pixels because I was so enthralled with the experience.

If I have to nitpick, I would say my main issue was a small tracking latency that I observed. One of the major points that came across to me at QuakeCon was the importance of low latency. I've heard Carmack talk about it endlessly and I had sort of discounted what he was saying a bit - assuming he was just obsessing over the last 2%. But when it comes to this level of immersion, I completely understand his point now. You can get away with all kinds of delays and inaccuracies with non-immersive displays. But the moment you start to feel like you are "in there", you can't ignore those things anymore. I believe Carmack claimed in the keynote that the Doom 3 Rift latency was around 40 or 50 milliseconds. Now, I saw an older version of that code base, so it may have been even a bit more on this demo. But it was definitely noticeable to me. I wouldn't call it a "stutter" necessarily - that's overstating it. More like a "vibration" as I panned my head. You wouldn't think about it twice if it was your frame rate on a normal screen. But on the Rift, the effect is amplified.

And that's one potential pitfall I see with the Rift. The device is so good that it amplifies any other problems with latency and inaccuracy. It forces perfection in every other aspect of the simulation. Forget trying to play games with all the effects turned up and running at low frame rates. You'll need all the frame rate you can muster. And it forces me to seriously reconsider inaccuracies in my own projects. Currently I can tolerate all types of motion inaccuracies, stuttering, and latency problems. With the best consumer HMDs those problems just sort of look crummy, but I can deal with it. But I suspect the Rift is not so forgiving - and instead of just looking crappy it might actually make me throw up! Another subtle detail: the way that Carmack modeled the head translation as you rolled your head was sort of funny. The first time I tilted my head sideways, the wall in front of me sort of stretched and sheared. The reason is that I was rotating my head around my chin (sort of lopping it to the side onto my shoulder), while Carmack's interpretation was more like rotating around my nose. Once I consciously rotated around my nose, everything looked correct. That's the sort of thing that you would never-ever notice on a desktop screen, but is so obvious on the Rift.

Ok, well I've gone on long enough - much longer than I actually even used the device. To sum it up, I think it's just fantastic. I would be completely satisfied even if this was the consumer version. Well done Palmer. I can't wait to get one at home to start tinkering with it.


Last edited by brantlew on Sat Aug 04, 2012 11:13 am, edited 3 times in total.



Sat Aug 04, 2012 12:44 am
Profile
Golden Eyed Wiseman! (or woman!)

Joined: Fri Jul 08, 2011 11:47 pm
Posts: 1445
Awesome brantlew, it's really nice to get an objective opinion on the Rift.

Quote:
You just can't stop looking around and admiring just how "real" it all feels.


I was like that on the first Virtuality system I used, looking at the gun in my hand, and the fact that it felt almost like putting scuba goggles on and going diving. With a higher FOV, you really get a sense of 'being there' that you don't get with lower-FOV HMDs.

The latency thing has ALWAYS been an issue with VR, unfortunately. I wonder how much extra latency cybereality's and emerson's warping drivers will introduce as well?

Re the translation issue, I think this is probably more of a concern due to the choice of tracker. I'd imagine that if we were using an absolute tracker like a magnetic one, roll and rotation would work much more naturally.


Sat Aug 04, 2012 7:21 am
Profile
3D Angel Eyes (Moderator)
User avatar

Joined: Sat Apr 12, 2008 8:18 pm
Posts: 10875
@brantlew: Thanks man. Sounds pretty good to me.

@WiredEarp: The driver certainly has overhead, but I haven't done any benchmarks yet. The main issue right now is that I am rendering the eyes sort of like page-flipping (i.e. frame 1 is left, frame 2 is right) but I am not caching draw calls. So basically it means that if you want the full 60 FPS in 3D, you need to be running at over 120 FPS with vsync off. For games like HL2 and L4D this is not a problem and looks fine. But with more intensive games it would be a problem. Beyond that, when I integrated the Hillcrest tracker the framerate dropped a bunch. I think if I pull it into a separate thread that would speed things up, but right now the tracking seems intensive.
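
The "over 120 FPS for 60 FPS 3D" arithmetic can be modeled in a couple of lines - a sketch of the alternating-eye scheme described above, not of the actual driver:

```python
# Page-flipped stereo halves the effective per-eye rate: each eye only gets
# every other scan-out, so full-rate 3D needs twice the display's refresh.
def effective_stereo_fps(render_fps, vsync_hz=60):
    """Per-eye update rate when alternating eyes frame-by-frame."""
    return min(render_fps, 2 * vsync_hz) / 2

print(effective_stereo_fps(120))  # 60.0 -> full-rate 3D
print(effective_stereo_fps(60))   # 30.0 -> half-rate 3D
```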

_________________
check my blog - cybereality.com


Sat Aug 04, 2012 7:36 am
Profile
Sharp Eyed Eagle!
User avatar

Joined: Tue Feb 21, 2012 11:57 pm
Posts: 428
Location: Irvine, CA
Does anybody remember what the Rift's latency is, from video cable to screen? I vaguely remember JC saying that he measured it, but I can't recall the number.

EDIT: He mentioned during the keynote, it's between 40 and 50ms.


Last edited by FingerFlinger on Sun Aug 05, 2012 9:51 am, edited 1 time in total.



Sat Aug 04, 2012 7:55 pm
Profile
One Eyed Hopeful

Joined: Sun Aug 05, 2012 1:06 am
Posts: 30
I'll be investing in your Kickstarter project -- I'm excited. Gaming aside, I would be interested in using this with flight simulation (Microsoft FSX). As you know, FSX is an ancient program and development stopped a long time ago. It is largely the community and third-party software which keeps it alive (check out http://fullterrain.com for beautiful realistic scenery, and PMDG http://www.precisionmanuals.com for a very realistic rendition of the Boeing 737-800).

Right now tracking can be done within the cockpit with hardware such as TrackIR (leds are mounted to your headset or your cap) and the software directly interprets the coordinates for use within FSX.

Would there be a way to develop software so the Oculus Rift could be used universally with programs that do not have active developer support?

There are already drivers that can load in the background to perform what the Oculus Rift may need to offer full immersion for all programs --

DIY Stereo 3D Driver: http://www.youtube.com/watch?v=Ovf5TLiIfZ8
Image Warping: http://www.fly.elise-ng.net/index.php/i ... splaylite2 (I believe this directly interfaces with d3dx9.dll)


Sun Aug 05, 2012 1:20 am
Profile
Petrif-Eyed
User avatar

Joined: Sat Sep 17, 2011 9:23 pm
Posts: 2190
Location: Irvine, CA
@OmniAtlas: There are a couple of developers on this site that are working on drivers to convert preexisting games to the Rift format. One of them is open source. I would check there.

http://www.mtbs3d.com/phpBB/viewtopic.php?f=138&t=15086

http://www.mtbs3d.com/phpBB/viewtopic.php?f=26&t=14970


Sun Aug 05, 2012 7:15 am
Profile
Cross Eyed!

Joined: Fri May 18, 2012 5:31 pm
Posts: 102
Location: Houston, TX
A few minutes with Doom 3 on the Rift

As a lot of you know, Palmer and company were able to demo the E3 version of Doom 3 BFG for the public with the Rift on Saturday at QuakeCon. The short version is that it is awesome and demoed quite well. The hall opened at 9:00am, and by 11:30am the line was already an hour long. Occasionally people who were coming off the demo station would shout back over to the rest of the line, "it's worth it!" I had to wonder how many of them realized that some of us had been standing in that line for twenty years. :)

First up, some thoughts on the hardware.

They were actually using John Carmack's modification of the Rift (as seen in his E3 interviews, with the blinders, black tape on the lens edges, and a single elastic ski-goggle band to hold it to your head). I was actually pretty happy to see that, since it closely resembles the developer kit design. It felt reassuringly secure and lightweight. A few of my friends complained that it sat just a little too close (they felt their eyelashes brushing the plastic lenses), but I didn't have that problem. Lack of ventilation could be an issue for long-term use, but the blackout effect is really, really effective. Time to start researching DIY ways to let air in, but not light?

I'm near-sighted and didn't have any trouble getting a focused image, but for my other friend (who requires glasses for pretty much everything), it was basically unusable. The glasses question was on everyone's mind at QuakeCon, and I can understand Palmer's hesitation to create much more space between the eye and the lens; to do so would diminish the FOV pretty quickly. Larger lenses could fix that, but then you're in the game of trying to source new parts. Palmer has pitched a few ideas in interviews, so we'll see what they ultimately settle on there.

The display panel is probably the weakest part of the package. Resolution is on everyone's mind, so I'll get that out of the way first: it honestly wasn't a big deal. Yes, the grid and sub-pixels are there when you look for them. Yes, I will happily trade up when a higher-resolution panel is available. As it was, though, the grid was gone whenever I became engaged. My wish is for a panel with better contrast and a faster response time. Those two issues combined flattened out the feeling of depth somewhat and, more than the resolution, reminded me that I was in a simulation. Happily, the panel is also the part with time (and the massive arms race of the mobile industry) on its side. The panel that's there should be perfectly adequate for a developer kit, and even a heck of a good time in Doom 3. In the meantime, I would just author content on a traditional monitor and trust that the display in consumer HMDs will be up to spec.

Playing Doom 3 itself was great fun. Even though my friends and I knew that there wasn't any positional tracking, we still couldn't help dodging to the side in our chairs a little bit as a fireball arced toward us. The segment on demo was from a very action-heavy part of the game where you have a drone helping you take down the constant stream of imps. It's a smart choice since it's relatively hard to die, and the player has some time to get oriented and be a bit of a tourist. Having gamepad input mapped to handle large turns worked well for seated play, as did decoupling the pitch of the weapon from the view.

I'm used to seeing VR demos with terrible graphics, so playing a polished game was actually a real treat. The Rift and Doom 3 do enough things noticeably better than anything else at the consumer level that I felt like I was experiencing a lot of "firsts" all over again.


Tue Aug 07, 2012 10:04 am
Profile WWW