Testing out how my system, the Rift, and the Leap play together. Positional tracking is the enabling technology that allows the Leap to be used properly in VR, since you have to know where the Leap is relative to the Rift. At the beginning, I calibrated the Leap's position by placing the Rift on top of it.
http://www.youtube.com/watch?v=sAtwW6pzqGY
Update: 7/17/2013
It took a while but I finally got around to recording a more informative video of my system. In this video the tripod is on my desk and the camera is out of the frame above my head.
http://www.youtube.com/watch?v=6kCjvF2bKO0
The top left is a third-person view, the bottom left is the Rift's-eye view, and the right is me. In the beginning you can see there are only two fiducials visible on top of the Rift. I then move through the wide range of motions that are possible with my system. The only flaw in this iteration is that the red LEDs appear dimmer to the camera than blue LEDs would. My next iteration will use brighter blue LEDs to fix the jitter (Brighter = Bigger Blob = More Pixels to Average for position).
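To illustrate the "more pixels to average" point: the tracker's position estimate is the centroid of the blob's pixels, so per-pixel noise averages out roughly as 1/sqrt(N). A rough sketch of this (the blob model and noise values here are made up for illustration, not taken from my tracker):

```python
import random

def blob_centroid(pixels):
    """Average the coordinates of every pixel in a blob.

    Per-pixel noise averages out, so a bigger (brighter) blob yields a
    steadier centroid -- the Brighter = Bigger Blob = More Pixels to
    Average effect described above.
    """
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    return cx, cy

def jitter(blob_size, trials=2000):
    """Standard deviation of the centroid's x when each pixel is noisy."""
    rng = random.Random(42)
    xs = []
    for _ in range(trials):
        # Hypothetical blob: pixels scattered around (0, 0) with unit noise.
        pixels = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(blob_size)]
        xs.append(blob_centroid(pixels)[0])
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

# A blob with 4x the pixels should cut centroid jitter roughly in half.
print(jitter(25), jitter(100))
```

So quadrupling the blob area should roughly halve the jitter, which is why the brighter LEDs matter.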
To quickly recap for newcomers: What makes my system unique is that it can robustly determine the position of a user's head across a wide FoV and set of angles while using only a single camera and two tracked points on the Rift. The innovation (over similar systems like Johnny Chung Lee's) comes from the fact that I am using the Rift's IMU to fill in the missing orientation information that would normally be captured optically (but would restrict the freedom of movement). I am using a PSEye and Processing to funnel the point-tracking data to Unity at 125 Hz.
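For the curious, here is a rough sketch of the geometry that makes two points enough once the IMU supplies orientation. The two LEDs sit a known distance apart on the Rift; the IMU tells us how that baseline is rotated, so the foreshortened pixel separation between the two blobs reveals depth, and the blob midpoint gives the lateral offset via a pinhole model. Everything here (baseline length, focal length, yaw-only rotation) is a simplified placeholder, not my actual implementation:

```python
import math

def head_position(p1, p2, yaw_deg, baseline=0.12, focal_px=540.0):
    """Estimate head position (x, y, z in metres) from two tracked
    image points plus the IMU's yaw.

    p1, p2   -- blob centres in pixels, relative to the image centre
    yaw_deg  -- head yaw from the IMU (0 = facing the camera)
    baseline -- hypothetical distance between the two LEDs, metres
    focal_px -- hypothetical camera focal length, pixels
    """
    # The baseline the camera actually sees shrinks as the head yaws away.
    b_eff = baseline * abs(math.cos(math.radians(yaw_deg)))
    sep_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    z = focal_px * b_eff / sep_px        # depth from similar triangles
    mx = (p1[0] + p2[0]) / 2.0           # blob midpoint, x (pixels)
    my = (p1[1] + p2[1]) / 2.0           # blob midpoint, y (pixels)
    return (z * mx / focal_px, z * my / focal_px, z)

# Head facing the camera, LEDs 64.8 px apart at the image centre:
x, y, z = head_position((-32.4, 0.0), (32.4, 0.0), yaw_deg=0.0)
print(round(z, 2))  # 540 * 0.12 / 64.8 = 1.0 m
```

Without the IMU you would need a third point to disambiguate orientation; with it, two points pin down all six degrees of freedom.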
OLD:
I recently implemented support for the PSEye.
http://www.youtube.com/watch?v=XIbaxnpYUsM
OLD:
I've invented a new technique for 6-DoF positional tracking that uses a Wii remote, a wireless sensor bar, and an Oculus Rift.
This technique is unique because it requires only two tracked points, filling in the rest of the information with the Rift's IMU.
The goals of this system are affordability and flexibility. Using a single Wii remote and a wireless sensor bar (attached to the Oculus Rift), I am able to obtain absolute 6-DoF positioning.
Soon I'll be able to replace the Wii remote with a Leap, because they will be implementing a blob-tracking API compatible with Unity (increasing the resolution enormously).
Here's a picture of the current prototype:
http://i34.photobucket.com/albums/d144/Zalo10/photo_zps9a7f1cc3.jpg
The Wii remote tracks the points on top of the Rift. This perspective allows for a full 360º of yaw, a little more than 180º of pitch, and a little less than 180º of roll (tilt).
Future prototypes using the Leap will be mounted on a flexible arm attached to the monitor. The Wii remote requires that height because of its limited 33ºx23º FoV.
Here's a video of it in action:
http://www.youtube.com/watch?v=RFrCf-O5Bck
Notice how the user is able to look in all directions.
The jerkiness is due to the low native resolution (128x96) of the Wii remote's IR camera and will disappear with a higher-resolution camera.
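To put a rough number on that jerkiness: spreading the Wii remote's 33º horizontal FoV over only 128 real pixels means each pixel step corresponds to a noticeable jump in tracked position at typical desk distances. A quick back-of-the-envelope calculation (distance is an assumed example value):

```python
import math

def mm_per_pixel(fov_deg=33.0, pixels=128, distance_m=1.0):
    """Lateral distance (mm) covered by one pixel step at a given range,
    using the Wii remote's 33-degree FoV over 128 native pixels."""
    return math.tan(math.radians(fov_deg / pixels)) * distance_m * 1000.0

print(round(mm_per_pixel(), 1))  # ~4.5 mm jumps at 1 m -- visible jerkiness
```

A higher-resolution camera shrinks that step size proportionally, which is why the jerkiness should disappear.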
I realize that now that Sixense has announced their solution, my solution will be competing with theirs. Here are the Pros and Cons comparing the two systems:
- Pros:
- Cost (<$3 for the wireless sensor bar, $30 for a Wiimote, or $80 for a Leap)
- Latency (Leap has ridiculously low latency)
- Weight (LEDs can weigh next to nothing)
- No Magnetic Interference
- Cons:
- Requires Line of Sight
- Conical Capture Volume
Once the Leap is integrated, I suspect the resolution of the two systems will be comparable.
Any questions or suggestions?