VR/AR Windows "Desktop" development

This is for discussion and development of non-commercial open source VR/AR projects (e.g. Kickstarter applicable, etc). Contact MTBS admins at customerservice@mtbs3d.com if you are unsure if your efforts qualify.
LeeN
Cross Eyed!
Posts: 140
Joined: Sat Jul 17, 2010 10:28 am

Re: VR/AR Windows "Desktop" development

Post by LeeN »

I've been thinking lately of running sensors in a separate process and storing their current state in IPC shared memory.

A while ago I tried using PTAM for head tracking and found their code a bit heavy to integrate with vr x, so instead I modified their code to shove the generated matrix into shared memory, which vr x could then map and use.
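For illustration, here is a minimal sketch of that pattern using POSIX shared memory (the segment name and the raw float[16] layout are my own placeholders, not the actual vr x code; link with -lrt on older glibc):

Code: Select all

// Sketch: a sensor process publishes a 4x4 pose matrix through POSIX shared
// memory; the renderer maps the same segment. Name and layout are illustrative.
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

float* map_pose_shm(bool create) {
    int fd = shm_open("/vrx_head_pose", create ? (O_CREAT | O_RDWR) : O_RDWR, 0600);
    if (fd < 0) return nullptr;
    if (create) ftruncate(fd, 16 * sizeof(float));   // room for one 4x4 matrix
    void* p = mmap(nullptr, 16 * sizeof(float),
                   PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);                                       // the mapping stays valid
    return p == MAP_FAILED ? nullptr : static_cast<float*>(p);
}
// Writer: memcpy(shm, matrix, 16 * sizeof(float)) each frame. A real version
// would add a sequence counter or lock to avoid reading a half-written matrix.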

Another cool thing about this is that you can run a sensor process separately from your rendering process (ibex, vr x), which means you can keep the rendering process running while modifying/rebuilding the sensor process and vice versa.

I'm not sure about the performance, since the two processes will be somewhat out of sync; in theory one process could eat a lot of CPU time unnecessarily.

Another aspect of this is that some sensors (or sensor bundles) provide other data that could be useful, such as the Kinect. This could also be copied into shared memory.

An alternative to shared memory would be sockets.

This is, in a way, the start of a 3D protocol for applications to share 3D data.

What do you guys think?
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

In addition to sockets, what about named pipes? You could run them from a separate process, as a kind of user-space driver, with a filename for each device, and just stream data there. Shared memory is a bit heavy-handed; I think sockets might work better, or publishing onto a message bus like D-Bus (though I'm not sure the latency there would be acceptable).
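Roughly like this (just a sketch; the FIFO path is an example, not anything in ibex):

Code: Select all

// Sketch: a sensor process streaming fixed-size pose records through a named
// pipe (FIFO). A reader simply open()s and read()s the same path.
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int open_tracker_pipe() {
    mkfifo("/tmp/head_tracker", 0600);           // fails harmlessly if it exists
    return open("/tmp/head_tracker", O_WRONLY);  // blocks until a reader appears
}

void publish_pose(int fd, const float matrix[16]) {
    write(fd, matrix, 16 * sizeof(float));       // one fixed-size record per frame
}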
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
LeeN
Cross Eyed!
Posts: 140
Joined: Sat Jul 17, 2010 10:28 am

Re: VR/AR Windows "Desktop" development

Post by LeeN »

I'm not familiar enough with named pipes; I'll have to research them more to know whether there are any caveats.

I'm curious, though, why Xorg chose shared memory over named pipes. I know shared memory reduces copies. We use shared memory for graphics where I work too, which is why I'm more familiar with it. The only nasty thing I don't like about it is that you need a special ID and have to hope no one else uses the same one.

I'll also have to look at how your ibex iPhone code works, since technically you are already transmitting sensor data.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

Named pipes basically act like data streams but have a filename, just like the rest of the devices in /dev on Linux, except you can place them anywhere. You can write to one from one process and read from another. I think sockets have the benefit that if you write all your data as single messages, and do it locally, you get performance similar to pipes plus the added benefit of having all your data in one message. With a named pipe you have to ensure you start reading at the beginning of a data structure, so there's that bit of work. I was going to use Bonjour for service discovery but didn't want to complicate things just yet; with it I could have advertised the iPhone as an orientation device and connected to it easily, instead of listening on a hardcoded port and having to type my machine's IP into the iPhone.
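To make the framing issue concrete: the usual trick on any stream (pipe or local socket) is a length prefix, so the reader always starts at a record boundary. A generic sketch, not ibex code:

Code: Select all

// Length-prefixed framing: each record is a uint32 byte count followed by
// the payload, so a stream reader can never start mid-structure.
#include <unistd.h>
#include <cstdint>

bool read_exact(int fd, void* buf, size_t n) {
    uint8_t* p = static_cast<uint8_t*>(buf);
    while (n > 0) {
        ssize_t r = read(fd, p, n);
        if (r <= 0) return false;  // EOF or error mid-record
        p += r; n -= static_cast<size_t>(r);
    }
    return true;
}

bool read_record(int fd, uint8_t* out, uint32_t max_len) {
    uint32_t len;
    if (!read_exact(fd, &len, sizeof(len)) || len > max_len) return false;
    return read_exact(fd, out, len);  // payload follows its length prefix
}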
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

@LeeN:
Why do you need multiple processes for sensors? Wouldn't multithreading be sufficient to offload execution to different cores? Multithreading comes with shared memory by default, so that problem would be solved. Xorg deals with multiple processes because it deals with independent applications, some of which can run without Xorg. Do you anticipate a need to have sensors running outside of VRX, like an application that interacts with the sensors directly?
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

@LeeN: thanks for that tip about the camera. I think it is fixed and rotations look more reasonable now. I don't want to spam everyone with videos, so I'll try to record something tonight and send it to you and NickK in case you want to see if the 3D is improved. I've also added a test ground layer so that you are around 1.5 meters above the ground, and so is the monitor; that can be toggled with G. I did notice that it was uncomfortable watching the video on YouTube with my 3D Vision glasses last time; hopefully this will be better.

Also, I'm thinking of adding a touch pad for directional walking to the iPhone app soon, since the screen space is wasted. That means it could work as a controller for ibex even if there is no Leap Motion or alternative out there to walk around with, and you wouldn't have to give up control of your mouse or keyboard; the Rift will provide the orientation :) Other gestures can be enabled easily as well, e.g. swiping between virtual desktops with two fingers, or two/three-finger taps to perform different actions like jumping or some control feature. A pinch on the phone might let you zoom in and out, or adjust the interocular distance or some other parameter... we'll see :)
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

druidsbane wrote:@LeeN: thanks for that tip about the camera. I think it is fixed and rotations look more reasonable now. I don't want to spam everyone with videos, so I'll try to record something tonight and send it to you and NickK in case you want to see if the 3D is improved. I've also added a test ground layer so that you are around 1.5 meters above the ground, and so is the monitor; that can be toggled with G. I did notice that it was uncomfortable watching the video on YouTube with my 3D Vision glasses last time; hopefully this will be better.

Also, I'm thinking of adding a touch pad for directional walking to the iPhone app soon, since the screen space is wasted. That means it could work as a controller for ibex even if there is no Leap Motion or alternative out there to walk around with, and you wouldn't have to give up control of your mouse or keyboard; the Rift will provide the orientation :) Other gestures can be enabled easily as well, e.g. swiping between virtual desktops with two fingers, or two/three-finger taps to perform different actions like jumping or some control feature. A pinch on the phone might let you zoom in and out, or adjust the interocular distance or some other parameter... we'll see :)
If we keep adding hardware support it will quickly turn into a mess when different hardware tries to control the same basic functions. Perhaps we should isolate these 4 basic functions (view position, view orientation, text input, pointer input) into separate blocks that keep track of their 4x4 transform matrices. Then we can have a hardware selector that binds different hardware to these 4 basic functions. IBEX will be abstracted away from the hardware to avoid the potential mess. Here is a quick structure of what I am proposing (diagram below, with a rough code sketch after it):

Image
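In code, the structure might look roughly like this (all names are hypothetical, not existing IBEX classes):

Code: Select all

// Sketch of the proposal: each basic function is backed by a 4x4 transform
// and bound to concrete hardware through a selector. Names are hypothetical.
#include <array>
#include <map>

struct Transform { std::array<float, 16> m; };  // 4x4 matrix

enum class Function { ViewPosition, ViewOrientation, TextInput, PointerInput };

class Device {  // implemented by Hydra, iPhone, Rift tracker, mouse, ...
public:
    virtual ~Device() = default;
    virtual Transform current() = 0;
};

class HardwareSelector {
    std::map<Function, Device*> bindings;
public:
    void bind(Function f, Device* d) { bindings[f] = d; }
    Transform query(Function f) { return bindings.at(f)->current(); }
    // IBEX only ever calls query(), so it stays abstracted from the hardware.
};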
LeeN
Cross Eyed!
Posts: 140
Joined: Sat Jul 17, 2010 10:28 am

Re: VR/AR Windows "Desktop" development

Post by LeeN »

NickK wrote:@LeeN:
Why do you need multiple processes for sensors? Wouldn't multithreading be sufficient to offload execution to different cores? Multithreading comes with shared memory by default, so that problem would be solved. Xorg deals with multiple processes because it deals with independent applications, some of which can run without Xorg. Do you anticipate a need to have sensors running outside of VRX, like an application that interacts with the sensors directly?
Nick, there are several benefits I can think of to doing it this way. This isn't just about multiprocessing or shared memory.

Also note (in case this might get confusing): when I say sensor process I don't mean raw sensor data. I mean processed data that could be used directly, without custom code in ibex/vrx. In fact you could have a sensor process that does sensor fusion of the Kinect and Hydra, for example: a single sensor process could use the Hydra to get fairly accurate, low-latency head tracking, use the Kinect for the hands, and then push that out for vrx/ibex to use.

These are the benefits over multithreading in the same process:

1) Sensor developers can use any libraries they want. They don't need to build custom versions of ibex/vrx, or be limited in any way by ibex/vrx. This also helps with versioning: some software is written against library versions that are not compatible within the same program (I've noticed OpenCV has this problem). Again, I go back to what I saw working with PTAM: it is deeply integrated with its libraries, and I didn't want to integrate all of them, so the fastest way to test things was to use IPC. (You can actually see my IPC code, commented out, in vr x.)

2) If a sensor process crashes or has to be restarted for some reason, it can be restarted without ibex/vrx also crashing or restarting, and vice versa.

3) Some sensors need to be recalibrated when a program starts, which means that during development you would have to recalibrate the sensors every time you restart ibex/vrx. You can cache calibration data, and some libraries allow this, but there are other special cases. For example, I've found that the Hydra on one of my computers can, for some reason, take several minutes to restart even with the controller properly seated.

4) When ibex/vrx restart, they may lose the 3D positions of windows. So if I'm doing sensor development, move windows around myself in 3D, and then find I need to restart the sensors, in a single-process design I end up restarting ibex/vrx and my 3D window positions are lost. We could probably save the state of these 3D matrices somewhere (as properties on an X window, for example), but that adds some overhead of syncing that data.

5) ibex, vrx, wayland, and any other 3D program also benefit by being able to use whatever libraries they want for 3D graphics while still being able to use 3D sensors, without having to rebuild the sensors against their architecture/libraries and without being forced to use some particular vector library, or OpenCV, OpenNI, GVars3, TooN, etc.

These are just the things I can think of from my own experience; there are probably several other good reasons to split this into multiple processes.
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

druidsbane wrote: @NickK: That's great. I was about to ask if we were using the FreePIE library that I keep reading about on the forums, before I looked it up and discovered it was C#-only. Probably dependent on Windows APIs as well. Still, excellent work :)
The Hydra code depends on the Sixense library that ships with its SDK. I have reused LeeN's code in CMakeLists.txt to initialize the proper CMake variables and copy all the relevant libraries and headers into the installation directory. I will also include LeeN's full name in the code to give him credit for it.

He also raised a good point that it may be better to keep the sensor code in a separate application.

@LeeN:
A separate application for sensor processing can also be used to improve sensor accuracy. For example, you may have several independent sensors feeding their data streams into one process. Within that process you can use multiple data samples to reduce noise and smooth out motion. You can also use the Kalman filter algorithm (an optimal 2nd-order statistical predictor) to reduce latency by sending the prediction first and then correcting it when the data becomes available. A dependent application like VRX or IBEX won't even know that you added an extra sensor or did any fancy noise reduction.
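As a toy illustration of the filtering idea (a 1D, static-state Kalman filter, much simpler than the 2nd-order predictor described above):

Code: Select all

// Toy 1D Kalman filter: fuses noisy scalar samples into a smoothed estimate.
// A real head tracker would track position and velocity per axis.
struct Kalman1D {
    double x = 0.0;   // state estimate
    double p = 1.0;   // estimate variance
    double q = 1e-4;  // process noise variance (how fast the state drifts)
    double r = 1e-2;  // measurement noise variance (sensor jitter)

    double update(double z) {
        p += q;                  // predict: uncertainty grows between samples
        double k = p / (p + r);  // gain: how much to trust the new sample
        x += k * (z - x);        // correct toward the measurement z
        p *= (1.0 - k);          // uncertainty shrinks after the correction
        return x;
    }
};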
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

Guys, I'm having a problem with Hydra mouse emulation. Basically, I can use the Hydra to rotate the view, move around, and move the mouse pointer, but clicking remains a problem. I send button press and button release commands into the X event queue. It works on windows and on the application panel to launch applications, but anything XFCE-related doesn't work. The Applications Menu opens up but ignores my clicks when I try to select anything within it. Also, the window minimize/maximize/close buttons on the window title bars don't work. Any ideas what I may be missing here?
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

Are you sure that the coordinates are correct? Did you try a drawing application, for example, to make sure the coordinates look like they are in the right spot? I haven't seen the code, nor do I have a Hydra, so I can't comment 100%; for my part I let the mouse send events directly to applications, which is why I didn't run into such issues. Also, how do you find out which windows to send the events to? There are some InputOnly windows out there that you might need to include, etc...
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

druidsbane wrote:Are you sure that the coordinates are correct? Did you try a drawing application, for example, to make sure the coordinates look like they are in the right spot? I haven't seen the code, nor do I have a Hydra, so I can't comment 100%; for my part I let the mouse send events directly to applications, which is why I didn't run into such issues. Also, how do you find out which windows to send the events to? There are some InputOnly windows out there that you might need to include, etc...
Yes, I'm sure the coordinates are correct, because clicking works on the main panel, on the windows themselves, and to open the Applications Menu. I can also see that the pointer correctly raises focus without any clicks. Below is the function for mouse button press emulation. There is a similar one for the button release event.

Code: Select all

#include <cstring>
#include <iostream>
#include <X11/Xlib.h>

void hydraMouseClickPress(Display* dpy, int button) {
  XEvent event;

  if (!dpy)
    return;

  memset(&event, 0x00, sizeof(event));

  event.type = ButtonPress;
  event.xbutton.button = button;
  event.xbutton.same_screen = True;

  // Find the pointer over the whole display
  XQueryPointer(dpy,
                XRootWindow(dpy, 0),
                &event.xbutton.root,
                &event.xbutton.window,
                &event.xbutton.x_root,
                &event.xbutton.y_root,
                &event.xbutton.x,
                &event.xbutton.y,
                &event.xbutton.state);

  event.xbutton.subwindow = event.xbutton.window;

  // Walk down the hierarchy to the deepest window under the pointer
  while (event.xbutton.subwindow) {
    event.xbutton.window = event.xbutton.subwindow;

    XQueryPointer(dpy,
                  event.xbutton.window,
                  &event.xbutton.root,
                  &event.xbutton.subwindow,
                  &event.xbutton.x_root,
                  &event.xbutton.y_root,
                  &event.xbutton.x,
                  &event.xbutton.y,
                  &event.xbutton.state);
  }

  // Dispatch the button press event to the top-most window
  event.type = ButtonPress;      // already set above; kept for clarity
  event.xbutton.state = 0x100;   // Button1Mask
  if (!XSendEvent(dpy, PointerWindow, True, 0xfff, &event))
    std::cerr << "Error in Hydra mouse click\n";

  XFlush(dpy);
}
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

Do you also send a ButtonRelease event? I'm sure you do, but just asking. Maybe it responds to a different event than the button press? I know that to close a window I need to release the click, not just press.
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

druidsbane wrote:Do you also send a ButtonRelease event? I'm sure you do, but just asking. Maybe it responds to a different event than the button press? I know that to close a window I need to release the click, not just press.
Yes, I have a similar function for the button release event. Hydra's button press looks like this: 0000111110000. So I'm tracking the gradient: a 01 transition triggers the button press event and a 10 transition triggers the button release event. Without the button release event I wouldn't be able to launch applications from the main panel, since it is the release event that triggers their launch, and that works for me.
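In code, that edge detection is essentially this (sketch):

Code: Select all

// Compare the current button bitmask against the previous sample to find
// 0->1 (press) and 1->0 (release) transitions.
unsigned int prevButtons = 0;

void pollHydraButtons(unsigned int buttons) {
    unsigned int pressed  = buttons & ~prevButtons;  // 01 transitions
    unsigned int released = ~buttons & prevButtons;  // 10 transitions
    prevButtons = buttons;
    // dispatch ButtonPress for bits set in `pressed`,
    // ButtonRelease for bits set in `released`
}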

Alright, I've tested Application Menu clicks with xdotool and they work fine. Also, xev shows that some of the ButtonRelease events I send out don't make it into the event queue, which means something is broken in my code. Please give me a little time to figure it out.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

It is worth checking whether we're somehow eating those missing events in ibex itself. Another thing worth trying is XSync(dpy, False), to ensure the X11 event cache is flushed. Also, we swallow the error codes from X11, so maybe that has something to do with it? I doubt it, but it may be worth turning off our event consumption (comment out that bit in the while loop in main) and just doing the render, letting the Hydra pass everything through.
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

druidsbane wrote:It is worth checking whether we're somehow eating those missing events in ibex itself. Another thing worth trying is XSync(dpy, False), to ensure the X11 event cache is flushed. Also, we swallow the error codes from X11, so maybe that has something to do with it? I doubt it, but it may be worth turning off our event consumption (comment out that bit in the while loop in main) and just doing the render, letting the Hydra pass everything through.
I've done it! Victory shall be mine! All menus work correctly and I can move windows around with Hydra alone, without using my mouse.

Actually, I've looked into the xdotool source code and found that it uses the X extension function XTestFakeButtonEvent(), which takes care of all the window hierarchy and such. A single call to XTestFakeButtonEvent() plus XFlush() does the job. In your opinion, is it OK if I introduce this dependency on the Xtst extension library like that? It's damn convenient.
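For reference, the whole emulation collapses to something like this (sketch; link with -lXtst):

Code: Select all

// Synthesizing a click via the XTest extension, as xdotool does. The server
// routes the event through the window hierarchy for us.
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

void hydraFakeClick(Display* dpy, unsigned int button) {
    XTestFakeButtonEvent(dpy, button, True, CurrentTime);   // press
    XTestFakeButtonEvent(dpy, button, False, CurrentTime);  // release
    XFlush(dpy);
}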
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

I think it's great! Just because something is called 'test' doesn't mean we can't use it :) We can make a note that this is a dependency we want to get rid of later if necessary, and take a look at the code, which seems to be BSD-licensed. Honestly, as long as the library doesn't go away it seems like a great abstraction layer we can use in other plugins too. Good job!
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
LeeN
Cross Eyed!
Posts: 140
Joined: Sat Jul 17, 2010 10:28 am

Re: VR/AR Windows "Desktop" development

Post by LeeN »

Vrx uses XTest also. It's an extension, so part of its source code may live in the X server and may not be as useful to X clients.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

@LeeN: does that mean you don't recommend using XTest, or can we expect it to generally be there, and should we use it?

@NickK: I looked at your diagram up top and I agree in general with the hardware selector.

Combining your idea and LeeN's, it's possible to have an option to activate the devices you want, and if there are any conflicts the library or server process tells us. For example, on the iPhone you can turn on multiple sensors, and I believe it does sensor fusion to get more accurate results. If you don't have a device it just does nothing. In this case you can offer a list of each class of device, and activate one for each of those functions using library calls. If something conflicts, we let the user know and don't activate it.

Also, LeeN, for your suggestion, I'm guessing you want two parts: a server process that talks to the various sensors and can even keep them up all the time (there could be a control panel for users to configure it), and a user library that communicates over shared memory? I think that if you go with the server model you can use either shared memory or sockets to achieve the same effect, whichever is easier, and still get the nice abstraction you mentioned: if the user is in the middle of the room when the program crashes and it reconnects, the user is still in the middle and doesn't need to recalibrate. It also means users can enable or disable layers of filtering or sensor fusion, tweaking for more or less latency, and apps don't need to think about it. Lastly, maybe a straight pass-through of the raw data for each device, so applications can process it themselves if they want, the only difference being a slight translation layer to ensure everything looks right. I think we're getting ahead of ourselves here considering we don't have that many devices supported, but once we have a few I'm sure a common API will present itself.
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

I've updated the distortion shader and pushed it out. As you can see, even at a resolution lower than the Rift's, text is *much* more readable now! It also allows for variable distortion in both the horizontal and vertical directions; hope that's good enough. I know the Rift SDK will include shaders or documentation, but it is more fun to try to research and implement this independently!
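The underlying idea, sketched here as plain C++ rather than the actual ibex shader (the k1/k2 coefficients are placeholders): scale each pixel's offset from the lens center by a radial polynomial.

Code: Select all

// Barrel distortion sketch: offsets from the lens center are scaled by a
// radial polynomial. Separate per-axis coefficients would give the
// horizontal/vertical variation mentioned above. k1 and k2 are placeholders.
struct Vec2 { float x, y; };

Vec2 distort(Vec2 uv, Vec2 center, float k1, float k2) {
    float dx = uv.x - center.x, dy = uv.y - center.y;
    float r2 = dx * dx + dy * dy;              // squared radius from center
    float s  = 1.0f + k1 * r2 + k2 * r2 * r2;  // radial scale factor
    return { center.x + dx * s, center.y + dy * s };
}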

Image
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

@druidsbane:
We will adjust the distortion later when Carmack's shaders are available. My understanding is that the actual shader depends on the curvature of the lenses that they will select. Since the info is not available yet, it is sufficient to have an approximate solution.

@druidsbane and @LeeN:
As a next step I propose to clean up my Hydra code and commit it into the IBEX repo. We can add other Hydra features later, or remove the XTest dependency if it becomes a problem. I will try to create the hardware-independent classes and plug the Hydra and iPhone code into them. (I will probably leave the mouse and keyboard intact for the moment to simplify things.) I will try to finish the code by next weekend.

To implement LeeN's idea we'll need to generalize the hardware selector class. For my first commit I will use something simple to bind hardware output, but LeeN can implement a hardware selector that establishes an IPC link between the sensors and the hardware-independent view position/orientation.

Before the Rift ships we still have a lot of work to do. IMHO, the 4 major remaining items are:
+ multiple virtual monitors like in LeeN's code (we can probably reuse his code as well),
+ virtual keyboard for typing,
+ tablet input for mouse control, window zoom in/out, and other features that LeeN is working on,
+ enable GNOME and KDE in IBEX.
LeeN
Cross Eyed!
Posts: 140
Joined: Sat Jul 17, 2010 10:28 am

Re: VR/AR Windows "Desktop" development

Post by LeeN »

I don't think XTest is a big deal; in fact we may not have a choice in some cases.

I agree that a multiprocess sensor architecture is too early, especially since we don't yet know all the features we would need to support, gestures for example.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

That is true: the shaders are lens-dependent. Others on the forum have been able to match the original Rift's parameters, but the final lens remains to be seen.

Glad to hear the Hydra code is done. If it gets cleaned up so that it's built optionally when one doesn't have the Sixense SDK, that would be cool. At some point we may need to decide whether to link that statically or just make it a separate package, as in one package per sensor type the user wants to support. Also, for now the glue layer should be very thin: we don't really have much to go on to abstract things away, so it wouldn't be obvious yet what the best abstraction is, right?

Next, I'd recommend that if we support multiple sensors, we make an effort to take all of their input into account when they are active, so it isn't necessarily either/or. The same way you wouldn't deactivate the mouse if there was a Hydra, there may be no need to deactivate motion control from the Hydra just because the iPhone also provides it, for example. Just thoughts. The iPhone app will be updated soon to allow walking as well. As you can see, the code is pretty short for what it does, so we should try to keep it as short or even shorter if possible :) I can help with that cleanup when you are ready!

Lastly, one thing I realized about the shader: because the lens distortion shrinks the view, we can actually increase the size of our rendering surfaces based on the largest side of the render so that it fills the screen more optimally. Should be nice when it is done :) Looking forward to seeing this and other features added soon!
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
danielbln
One Eyed Hopeful
Posts: 48
Joined: Tue Sep 11, 2012 6:34 am

Re: VR/AR Windows "Desktop" development

Post by danielbln »

Hope this hasn't been posted yet:

[youtube-hd]http://www.youtube.com/watch?v=_FjuPn7MXMs[/youtube-hd]
3D compositor written using the QtCompositor API (http://qt.gitorious.org/qt/qtwayland), capable of mapping input to and compositing the rendered output from Wayland applications. With QtWayland we're no longer limited to compositing of in-process widgets as in WolfenQt: http://www.youtube.com/watch?v=MXS3xKV-UM0
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

danielbln wrote:Hope this hasn't been posted yet
I haven't seen this one posted before. Really great find. Any idea how this will play out considering the changes proposed for Wayland's input handling? I don't know much about those changes, but I've heard they might make it harder to do stuff like this in the future. What's there right now looks very polished, and I'm looking forward to seeing how it progresses.
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

The more I think about it, the more this seems close to an ideal: it has separate apps laid out in 3D, input works great, and there's no need for a window manager. It's interesting how it was done in Wayland and seems to handle mouse and other input fine. Whatever we do in ibex will be modular, so if there is anything worth keeping, adding it to something like that Wayland QtCompositor maze demo wouldn't be too hard. Very interesting though :) The lack of a window manager/desktop may not be an issue, as I'm sure KWin and others will run fine on Wayland this way at some point. The other issue is performance: currently this may not run ideally, though with the right hardware it could. It also solves the issue I have with ibex (though not with vr-x), which is the need for physical displays to get virtual desktops from; this eliminates it. With vr-x you don't need those, you can create as many virtual displays as needed, though Xephyr isn't hardware accelerated... neither is Wayland for now. Very interesting times for those of us wanting a virtual desktop!
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

@danielbln:
Looks nice. This is where I see things are going when W makes it into the major Linux distros. Are you the developer of that Wayland 3D compositor?
druidsbane wrote:The more I think about it, the more this seems close to an ideal: it has separate apps laid out in 3D, input works great, and there's no need for a window manager. It's interesting how it was done in Wayland and seems to handle mouse and other input fine. Whatever we do in ibex will be modular, so if there is anything worth keeping, adding it to something like that Wayland QtCompositor maze demo wouldn't be too hard. Very interesting though :) The lack of a window manager/desktop may not be an issue, as I'm sure KWin and others will run fine on Wayland this way at some point. The other issue is performance: currently this may not run ideally, though with the right hardware it could. It also solves the issue I have with ibex (though not with vr-x), which is the need for physical displays to get virtual desktops from; this eliminates it. With vr-x you don't need those, you can create as many virtual displays as needed, though Xephyr isn't hardware accelerated... neither is Wayland for now. Very interesting times for those of us wanting a virtual desktop!
@druidsbane:
In Wayland the compositor and the window manager are one. You can build your own protocol on top of Wayland to introduce additional restrictions, like a unified look and feel, and this is how widget libraries will work on W. You can also introduce other restrictions (like 3D) that interpret client surface data in a certain way (as 3D objects). W is a different architecture, which is why it's causing some controversial discussions.

I've received Kristian's response regarding the global XY coordinates. He feels strongly that the protocol will not change now. He favors the strategy of exporting global coordinates to clients on request. For X clients, XWayland supports global coordinates as a special case, for compatibility reasons.
Kristian wrote: X is a special case, since there are a lot of clients that rely on popping up windows based on where they are on the screen and what else (panels etc) is on the screen. We can't go back and change those clients. For toolkits or apps ported to wayland we can make them use the popup placement protocol or a session management protocol to place windows where they were the last etc. None of these problems require the app to know where it is, it's just how X used to work and how people still think.
That's basically the same popup problem LeeN is working on. If we have the popup placement protocol, the problem can be easily solved in 3D.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

Quick update! I've gotten the Irrlicht 3D rendering engine working as a plugin to Ibex. The idea is that we want to be able to render our world any way we wish, and I for one don't intend to write my own complete 3D engine for Ibex :) What this means is that I have a much better idea of the kinds of abstraction and interfaces I want for the video component of Ibex. I've attached the screenshot below. There may still be issues getting SBS rendering and the post-processing shaders working, but it was at least worth a shot. It loads a complete Quake 3 level and you can walk around at least one level; I don't allow flying around just yet :)

@NickK: you've done some great work with the Hydra code, I intend to merge that in as soon as possible now and integrate my code changes for the Irrlicht plugin as well. There is also some code for the Ogre 3D engine I'm debating including. It may be useful in the future rather than throwing it out if we can get more native rendering of textures and stuff within. We'll see.

Image
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
NickK
Two Eyed Hopeful
Posts: 55
Joined: Thu Jun 14, 2012 10:59 pm

Re: VR/AR Windows "Desktop" development

Post by NickK »

druidsbane wrote:Quick update! I've gotten the Irrlicht 3D rendering engine working as a plugin to Ibex. The idea is that we want to be able to render our world any way we wish, and I for one don't intend to write my own complete 3D engine for Ibex :) What this means is that I have a much better idea of the kinds of abstraction and interfaces I want for the video component of Ibex. I've attached the screenshot below. There may still be issues getting SBS rendering and the post-processing shaders working, but it was at least worth a shot. It loads a complete Quake 3 level and you can walk around at least one level; I don't allow flying around just yet :)
This looks very nice. Can you create an interface between the IBEX core and the 3D rendering engine? Users will probably want to change worlds the way they change backgrounds on their 2D desktops. Please don't discard the skybox with the lake you had before; it was also a very nice theme.
druidsbane wrote: @NickK: you've done some great work with the Hydra code, I intend to merge that in as soon as possible now and integrate my code changes for the Irrlicht plugin as well. There is also some code for the Ogre 3D engine I'm debating including. It may be useful in the future rather than throwing it out if we can get more native rendering of textures and stuff within. We'll see.
It is not ideal, but it is ready for testing and integration. In case LeeN is interested, the IBEX repo with Hydra support is located here:
https://bitbucket.org/nickk/imwi

To enable Hydra, open the top-level CMakeLists.txt file and change the following flag from OFF to ON:

Code: Select all

option (ENABLE_HYDRA "Add Hydra hardware support" OFF)
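Alternatively, the same flag can be set at configure time without editing the file (standard CMake usage):

Code: Select all

cmake -DENABLE_HYDRA=ON .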
On the first launch, the Hydra takes about 5 seconds to initialize. During that time you'll see a poisonous green screen and nothing else. To speed up the initialization, place the Hydra controllers on the base.

Some comments on the Hydra integration:
Usage:
View orientation: left controller tilt
Roaming in 3D: left controller joystick
Mouse pointer: right controller tilt
Left/right mouse buttons: button 1 and button 2 on the right controller

License:
The Sixense SDK is generally rather permissive about inclusion in end-user products. The license for Linux/Mac is accessible here: http://sixense.com/eula
The key statements that permit distribution within IBEX (with us as the publisher of IBEX) are:
LICENSE: Sixense grants Developer a non-exclusive, non-transferable, worldwide license to (i) download and use one (1) copy of the Licensed Software to create End-User Product(s); (ii) use, reproduce, publicly display, promote, publish and distribute End-User Product(s) over the Internet or otherwise; and (iii) sublicense Publisher(s) to publicly display, promote, publish and distribute End-User Product(s) over the Internet or otherwise, subject to and in compliance with the terms of this Agreement.

Developer has no right to, and agrees not to, display, perform, copy, distribute, license or sub-license copies of the Licensed Software or the Embedded Code except as a part of or within an End-User Product.
In compliance with the license I have included the full statement of the license in the Hydra directory and made a reference to it from the new top level file LICENSE.

Important: As we start integrating other people's code (e.g. Irrlicht engine) we'll need to include references to this software in the top level LICENSE file as well. There is also a problem with LeeN's code integration since his code is published under LGPL3 while IBEX is under GPL2.

SDK:
Unfortunately, the Sixense SDK is not complete. For example, they declare a MousePointer class interface in their header file, but the libraries don't actually contain the implementation, so linking will fail if you try to use the MousePointer class. Interestingly, the LaserPointer class is present and implemented in the libraries. For now, I have implemented my own mouse pointer.

Another thing is that Hydra orientation has significant jitter. (Maybe that's the reason MousePointer was stripped from the libraries?) To get rid of the jitter I enabled the Hydra's internal filter (integrator?). It makes the mouse pointer control feel more like FPS gun aiming than like a mouse. As I reduce the effect of the filter, the pointer response time decreases but the jitter increases rather rapidly. For now, I've selected the filter parameters that seemed like the best tradeoff. I believe I could write my own filter algorithm that would do a better job, but it is already reasonably good.
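For comparison, the simplest custom filter would be an exponential low-pass; it exposes the same tradeoff as a single knob (sketch only):

Code: Select all

// Exponential low-pass: alpha near 1 tracks quickly (more jitter), alpha
// near 0 smooths heavily (more lag); the same tradeoff described above.
struct LowPass {
    double alpha;        // smoothing factor in (0, 1]
    double value = 0.0;
    bool primed = false;

    double filter(double sample) {
        value = primed ? alpha * sample + (1.0 - alpha) * value : sample;
        primed = true;
        return value;
    }
};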
druidsbane wrote: There is also some code for the Ogre 3D engine I'm debating including. It may be useful in the future rather than throwing it out if we can get more native rendering of textures and stuff within. We'll see.
Ideally, you may want to create an interface that allows you to swap in different 3D engines with minimal changes to the IBEX code. Unfortunately, my knowledge of 3D engines is rather weak. I looked into Ogre 3D half a year ago but came away with the impression that it didn't properly support the programmable GPU pipeline and was still mostly based on fixed-function rendering. I may be wrong about that, though.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

@NickK, everyone else: do you think we should package the SDK along with ibex, or just have users/developers download and install it themselves? It isn't clear from the highlights that we can distribute it in our repo. I'll read up a bit more and email them if confused.

Also, I finally got SBS rendering done on the Irrlicht engine. This will make it easier to load levels and maps that others have built and walk around them. I haven't had a chance to demo anything, but the skybox is added and SBS rendering works; there's no flag yet to run in regular mode instead of SBS. Still impressed that I'm rendering the whole world twice, texturing it onto two halves of the screen, and still getting 60 FPS in my virtual machine!

Image
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
shakesoda
One Eyed Hopeful
Posts: 28
Joined: Mon Sep 24, 2012 11:46 am

Re: VR/AR Windows "Desktop" development

Post by shakesoda »

I tried out the latest updates on my laptop; I'm just getting a white screen when using Irrlicht. I've also been unable to navigate with the mouse in either mode (I can't even use the keyboard to navigate in Irrlicht mode); not sure if I'm just missing something, though.

Have you started at all on supporting Hillcrest trackers? Just wondering before I give it a go myself. I've got the parts together for my DIY Rift, so I can get things working in my engine before the kit shows up in December. :D
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

Have you done a 'make install' to ensure all the files are installed? Also, I haven't gotten around to figuring out the latest crashes on my NVidia box; now that I've gotten shaders and barrel distortion working in Irrlicht on my Parallels VM as well, that's up next :)

For the mouse, I just grab the main pointer. To navigate, press ctrl+y; maybe you're using an international keyboard?

No Hillcrest tracker support, as I have no device, but if you just plug the orientation matrix into the 'get_orientation' function it should work in all plugins.

Lastly, any thoughts on the stereo vision/warping with your makeshift Rift? I have two distortion methods, and I think the older, uglier one might be more correct, as the new one seems only to warp the outside parts inward, not blow up the center like I thought was needed. You can see a screenshot below to see what I mean:
Image
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
shakesoda
One Eyed Hopeful
Posts: 28
Joined: Mon Sep 24, 2012 11:46 am

Re: VR/AR Windows "Desktop" development

Post by shakesoda »

I didn't run make install; I figured everything should run out of the folder. I'm not using an international keyboard, but I was using ctrl+shift+y instead of just ctrl+y. The mouse works fine for navigating in the windows, just not at all for rotating the view.

Integrating the Hillcrest tracker sounds straightforward; I will report back as soon as I can figure out why libfreespace doesn't want to play nice with my FSRK-USB-2. All of my boxes have NVidia cards in them, so if you need any help debugging there I'd be glad to poke at it.

As for the distortion, I don't think you want to blow up the middle. I still have to assemble the parts so I'm not sure which one looks more correct in practice, but in my messing around with the lenses I think you only want to bring the corners in.
StreetRat
Two Eyed Hopeful
Posts: 65
Joined: Sun Oct 24, 2010 11:11 pm

Re: VR/AR Windows "Desktop" development

Post by StreetRat »

That's coming along nicely.
How does rendering the complete desktop work with 2 or more screens? Is the desktop one complete thing, or can it be cut up?
I was trying to do individual programs, with the ability to move each one separately, but I think that's asking for more trouble than it's worth.
The complete desktop doesn't seem too bad, but under Windows that sort of limits you to one display floating in space.

Unfortunately my tests resulted in about 3-5 frames a second, tops.
Wrong language, I think.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

StreetRat: the desktop is currently a single unit. I'm still not sure how I can get a larger desktop... actually, I think I do know: create a desktop in your X configuration, but set a "Virtual" desktop size that is much larger. That should work because my software works with the virtual desktop space, so if you have something larger than the native resolution it would render fine. The real question is how to get the desktop to switch to that size... once that happens I can try to create virtual monitors by making the desktop really wide and just cutting it up in the middle.
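Something along these lines in xorg.conf, if I understand the mechanism correctly (the identifier and sizes here are just examples):

Code: Select all

Section "Screen"
    Identifier "Screen0"
    SubSection "Display"
        Virtual 3840 1200    # one wide virtual desktop to cut into two views
    EndSubSection
EndSection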

What language are you using to do your compositing on Windows? C#? Are you using the DirectX calls? How do remote desktop programs like Citrix and VNC manage to get decent performance? Maybe what you need to do is update your texture only for damaged areas. Not quite the same as fast hardware acceleration, but maybe it could improve performance to usable levels? Also, don't forget that you can create a program that runs silky smooth on Windows and responds virtually to your every move, but just have the texture update slowly: a laggy display, so to speak, inside a responsive world, so you don't get disoriented.

Now on to my issues. I ran into a problem with the Irrlicht plugin on my desktop: I fixed the crashes on NVidia, but it renders pure white. I spent the better part of a week debugging it to no avail, so I've put it aside for now. It may work on other cards, like Intel (which I'd like to test on my laptop) or ATI (on another machine); I know it works on my virtual machine, for example. If anyone is interested or gets it to work, please let me know; otherwise we'll just see if it works on other cards.

Instead, I've moved on to updating the Ogre3D engine support once again. I finally managed to get it working, and honestly I'm happy, because I think it is a more advanced engine overall. Currently the desktop runs on a spinning cube. I have it installed on Ubuntu 12.10 (dev version), but you can use any version you want; two things are needed to run it, though:

1) Ogre 1.8, and manual updates to plugins.cfg (in the resources folder).
2) Updating the BaseApplication.cpp file in the ogre3d_plugin folder to point to the location of your Ogre3D library.

The last part I'll try to auto-configure based on the build, but we can't go more dynamic than that, because you do need to compile against something after all :) Here is a screenshot showing what I finally got. I'm looking forward to loading in some Blender models and terrain, then working on the dual view and distortion shaders.

NickK still hasn't integrated his code as he's a bit busy, but once that happens I'll make sure the orientation works with his stuff as well.

Image
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
cybereality
3D Angel Eyes (Moderator)
Posts: 11406
Joined: Sat Apr 12, 2008 8:18 pm

Re: VR/AR Windows "Desktop" development

Post by cybereality »

Ha! I know that head!
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

How so? Don't leave us hanging :)
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: VR/AR Windows "Desktop" development

Post by Fredz »

Cyber was talking about the ogre head in the last screenshot, I guess; it's from the Ogre 3D engine.
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: VR/AR Windows "Desktop" development

Post by druidsbane »

Yeah, I know; I'm demonstrating the Ogre3D engine integration with ibex :) Did you work on/with Ogre3D before, cyber? I'm just trying to keep as many options open as possible for designing the world around your desktop without having to write a complete engine myself, especially since Irrlicht and Ogre both have tools and communities around them. Exciting times :) Just another month or two for the Rift!!!!
Ibex 3D VR Desktop for the Oculus Rift: http://hwahba.com/ibex - https://bitbucket.org/druidsbane/ibex
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: VR/AR Windows "Desktop" development

Post by Fredz »

Oops, sorry. :)

Great engine anyway; I used it several years ago and even wrote a tutorial about game states on their wiki.