Rift Accelerometer to Position Visualizer Demo

zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Rift Accelerometer to Position Visualizer Demo

Post by zalo »

Rift Accelerometer to Position Visualizer

What?
Using the Rift accelerometer, which was just exposed in the most recent SDK update, and a little math, you can integrate the acceleration data to approximate position.

Why?
The accelerometer can output data to Unity at 100 Hz, significantly reducing latency in other motion-based systems. For example, fusing high-speed relative accelerometer data with low-speed absolute Hydra data will give you the best of both systems.

How?
Just using the new accelerometer data and subtracting gravity. A few more details on that here.
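For the curious, the core of it fits in a few lines. Here is a minimal Python sketch of the idea (the demo itself runs in Unity, so this is purely illustrative and not the demo's code; `samples` and its world-frame rotation are assumptions):

```python
# Minimal sketch of double integration with gravity subtraction.
# `samples` is assumed to yield (ax, ay, az, dt) tuples already rotated
# into the world frame; at rest the accelerometer reads ~+9.81 on Y.

REST_READING = (0.0, 9.81, 0.0)  # what the sensor reports when stationary

def integrate_position(samples):
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    for ax, ay, az, dt in samples:
        accel = (ax, ay, az)
        for i in range(3):
            linear = accel[i] - REST_READING[i]  # remove gravity
            vel[i] += linear * dt                # acceleration -> velocity
            pos[i] += vel[i] * dt                # velocity -> position
        yield tuple(pos)  # note: integration error (drift) grows quickly
```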

Where?
The demo is attached to this post.
cybereality
3D Angel Eyes (Moderator)
Posts: 11406
Joined: Sat Apr 12, 2008 8:18 pm

Re: Rift Accelerometer to Position Visualizer

Post by cybereality »

Interesting work.

Still, it looks like this is not a very robust method (as many assumed), but kudos for giving it a try.
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: Rift Accelerometer to Position Visualizer Demo

Post by Fredz »

Nice that you gave it a try.

You could also try to use the magnetometer to give some sort of reference for better positional tracking. Brantlew mentioned this some months ago (see: http://www.mtbs3d.com/phpBB/viewtopic.p ... 48#p107048) and a quick test showed that it could be an interesting approach. I think geekmaster was working on this as well; no idea if he came up with something usable, though.

It could also be useful for detecting crouching or jumping, as was done in V^Bert.
cegli
One Eyed Hopeful
Posts: 36
Joined: Thu May 16, 2013 5:35 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by cegli »

In university I tried to use accelerometers to calculate the 2D position of a box. It did not work very well, and the coding was brutal. I had so many hacks and so much weird math in place to get it working at all. It drifted and was very inaccurate, and it needed heavy smoothing to keep it from being jittery. Maybe accelerometers have progressed a lot since then, but for me it was a lesson in what not to attempt with accelerometers.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by geekmaster »

Fredz wrote:Nice that you gave it a try.

You could also try to use the magnetometer to give some sort of reference for better positional tracking. Brantlew mentioned this some months ago (see: http://www.mtbs3d.com/phpBB/viewtopic.p ... 48#p107048) and a quick test showed that it could be an interesting approach. I think geekmaster was working on this as well; no idea if he came up with something usable, though.

It could also be useful for detecting crouching or jumping, as was done in V^Bert.
I put a lot of thought into it, but I'm still developing a code base. I have a number of different methods that I plan to fuse, each of which constrains the position in various ways. Think "simulated friction" and "quadrature amplitude modulation decoding", coupled with a skeletal model that uses your feet or posterior as a fixed reference point to anchor you into the virtual world model.

For a point of reference, watch the video here, then think about what you could do with a single Rift sensor and a good skeletal model while sitting in a chair:
http://www.kickstarter.com/projects/yei ... eas-moving

I wish I had more free time (or a full time job in VR)...

I am using a Razer Hydra as a feedback reference while "training" my positional tracking methods. Regarding usefulness, I did figure out how to eliminate the large noise and large latency from the hydra, and now have noise-free, almost zero-latency output. I will publish something about how I analyzed the HID packet noise looking for a pattern. I found a clear signal buried in the noise, so I can detect and subtract out the noise. It is the noise that drags the filter into latency, and no noise means no filter latency.

I still need to implement a software phase detector to select the correct leading phase, though. For now, when my filter latches onto the wrong phase, I manually sequence to the next phase using the 'p' key-press in my program. There is about a 20 msec lag between the leading edge and the trailing edge of the hydra sample point cloud, and I want to stay latched onto the leading edge. It works quite well, and I have some nice data plots to demonstrate it. This shows some data points in a sequence of hydra HID packets:
hydra-lag2a.png
Red = data samples
Green = average of most recent four samples
I think you can see one of the four phase patterns my code recognizes here: one early sample at the leading edge of the motion profile curve, and three samples much later (perhaps output from some high-latency filter?). There is also a reflected pattern, with one sample very late instead of three samples very late...

I only use the points from the leading phase to anchor my oppressively filtered snap-vector hyper-prediction points, which replace the three sample-point phases I discard, giving a zero-lag motion profile curve that works extremely well no matter how I shake and jerk my hydra.
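In rough Python, the phase latching amounts to something like this (a simplified sketch of the idea, not my actual code):

```python
# Sketch: hydra packets repeat in a fixed four-phase cycle, so keep only
# the samples whose index matches the latched "leading edge" phase, and
# re-latch manually when the filter grabs a wrong phase.

class PhaseSelector:
    def __init__(self, n_phases=4):
        self.n_phases = n_phases
        self.count = 0
        self.latched = 0  # which phase we treat as the leading edge

    def advance_phase(self):
        """Manually re-latch (the 'p' key-press in my program)."""
        self.latched = (self.latched + 1) % self.n_phases

    def accept(self, sample):
        """Return the sample if it belongs to the latched phase, else None."""
        keep = (self.count % self.n_phases) == self.latched
        self.count += 1
        return sample if keep else None
```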

Anyway, this is the beginning of my Rift-only position tracking. More to follow...

HTH :D
FingerFlinger
Sharp Eyed Eagle!
Posts: 429
Joined: Tue Feb 21, 2012 11:57 pm
Location: Irvine, CA

Re: Rift Accelerometer to Position Visualizer Demo

Post by FingerFlinger »

This will only be tangentially related, but there is a technique for measuring neutral winds in the atmosphere using accelerometers. A team at my old company wrote a paper about it here.

Essentially, they house a bunch of seismic and MEMS sensors in a sphere, which they then drop from a rocket in the upper atmosphere. The accelerometers actually track the flight path of the sphere as it falls into the ocean, and from that data one can deduce the strength and direction of the winds.

Obviously, they are using supremely expensive (and heavy) sensors that aren't practical for VR, but some of the mathematical techniques and sensor arrangements may be useful. I've only skimmed the paper myself, but had a nice conversation with the project lead a few months ago.

@geekmaster

Cool. I've been wondering if anybody was working on this. I think training against a ground truth is the way to go for skeletal/postural/gestural recognition.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by geekmaster »

Here is another data plot from the geekmaster hydra zero-latency filter:
hydra-hyperpred2.png
To make this curve, I jerked my hydra quickly toward and away from me. To a distant observer, this "jerking the hydra" motion might look obscene. :o

The red samples are discarded hydra data packets from delayed phases that are of no interest. In fact, the clustered data points tend to travel PERPENDICULAR to the actual direction of travel, so they can derail motion prediction attempts and distort any filter results. The 20 msec delayed outlier distorts any "all points" filter even further, adding more latency.

The pink samples are the data points from the leading edge that I use as input to my "hyper prediction" filter. The output of my prediction filter (taken from a test point preceding my "oppressive filter" step) is the set of blue points. The yellow points are the average of the most recent four predictions, just there as a reference to show the trend of the prediction curve. You can see that the blue prediction point cloud precedes the motion reported by the hydra data leading edge by a wide margin. This is true even for seemingly random jerks while swinging my arms forward and back. Human muscles tend to filter real physical motion into a low-order sum of sine waves, which is probably why my motion prediction works so well.

The blue points are predicted by snap vectors, where snap = delta jerk, jerk = delta acceleration, acceleration = delta velocity, and velocity = the difference between consecutive input sample points (pink on this plot). I actually multiply my snap-vector prediction velocity by 32 before plotting it, which is what gives such a large blue point spread, but it also greatly enhances early detection of "sum of sine waves" motion profile trends. Note that 32x is about a half second into the future (based on only the curvature of the most recent small handful of samples).

Then I feed that blue point cloud into my "oppressive filter", which would add HUGE latency for "normal" data. I use: new point = (input point + previous output point * 63) / 64. Note that 64 samples is a full second here:
hydra-filt-hyperpred.png
Notice how BADLY the red hydra motion sample point cloud is spread out and delayed when I repetitively shake my hydra as fast as I can! A quarter cycle or more of delay on the trailing edge of the point cloud. And yet, look at my filter output even at that rate of herky-jerky change! Surprisingly good, and at 250 Hz too. Brutal coolness, eh?

Now, you would think that such wild prediction followed by such heavy filtering would be useless, but remember that I am only using it to predict virtual points until we receive the next "good" data point from a "leading phase" packet.
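For the curious, the whole predict-then-filter pipeline boils down to something like this 1-D Python sketch (my simplified reconstruction of what I described above, not my actual code; the 32x boost and the 63/64 filter weights come straight from the description):

```python
class SnapPredictor:
    """1-D sketch: velocity = delta position, acceleration = delta velocity,
    jerk = delta acceleration, snap = delta jerk."""

    def __init__(self, boost=32.0, weight=63.0):
        self.boost = boost      # 32x prediction velocity
        self.weight = weight    # oppressive filter: (in + prev * 63) / 64
        self.hist = []          # recent good (leading-phase) samples
        self.out = None         # filter state

    def feed(self, x):
        """Accept one good sample; return the filtered output."""
        self.hist = (self.hist + [x])[-5:]
        return self._filter(x)

    def predict(self):
        """Extrapolate one step ahead from finite differences."""
        h = self.hist
        if len(h) < 5:
            return self.out if self.out is not None else 0.0
        v = [h[i + 1] - h[i] for i in range(4)]   # velocities
        a = [v[i + 1] - v[i] for i in range(3)]   # accelerations
        j = [a[i + 1] - a[i] for i in range(2)]   # jerks
        s = j[1] - j[0]                           # snap
        pred_v = v[-1] + a[-1] + j[-1] + s        # next-step velocity estimate
        return self._filter(h[-1] + self.boost * pred_v)

    def _filter(self, x):
        """The 'oppressive' filter: new = (input + previous * 63) / 64."""
        if self.out is None:
            self.out = x
        else:
            self.out = (x + self.out * self.weight) / (self.weight + 1.0)
        return self.out
```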

Note that I shook my hydra quickly with my wrist while swinging it rapidly forward and back with my arm. The points are sampled at 250 Hz, using the feature report data commonly used in projects like VRPN (and others). The red points are discarded hydra data from the three "late phases". The yellow points are the leading phase of input data that I feed to my "snap vector hyper-prediction" filter. The blue points are the output of my "oppressively filtered" "wild predictions".

Notice that the blue points closely follow the yellow points (actually using them when available). Anyway, 250 Hz virtually zero-latency head tracking using a hydra isn't bad at all, and I plan to use this to make my Rift-only tracking code do as well as I can make it. I know I can do it. It is only a matter of time (a precious commodity)...

Oppressively filtered hyper-prediction sounds wacky, but it actually works, and works well IMHO. That heavy filter makes it much less noise-sensitive, and the hyper-prediction prevents the prediction point clustering I was seeing when using only 1x prediction instead of 32x. Even 16x did not spread the prediction points as evenly as 32x, and 32x yielded great results, so I did not want to go farther into the future on my predictions just to stomp them back to the present with a heavy filter.

And another sample, showing that my oppressively-filtered hyper-prediction points (blue) can actually precede the leading phase of the hydra sample data (yellow). Negative latency?
hydra-rev.png
Impressive results, eh? I think so... ;)

Opinions? Comments? Am I wasting my time doing this kind of research and development? Should I be working with Unity 3D instead?

Now, what deep magic can I extract from analyzing my Rift tracker data? ;)

EDIT: I am reading raw USB HID packets for my Rift Tracker DK and for my Razer Hydra using the signal11 hidapi library:
http://www.signal11.us/oss/hidapi/
https://github.com/signal11/hidapi
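A minimal read loop with the Python bindings to the same hidapi looks roughly like this (a sketch, not my actual code; the VID/PID pair is an assumption you should verify with hid.enumerate()):

```python
# Sketch: dump raw HID input reports using the hidapi Python bindings
# (pip install hidapi). VID/PID below is assumed for the Rift tracker DK.

import hid

RIFT_TRACKER = (0x2833, 0x0001)   # assumed VID/PID; check hid.enumerate()

dev = hid.device()
dev.open(*RIFT_TRACKER)
dev.set_nonblocking(False)
try:
    while True:
        packet = dev.read(64)     # one raw HID input report
        if packet:
            print(' '.join('%02x' % b for b in packet))
finally:
    dev.close()
```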
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by geekmaster »

zalo wrote:Rift Accelerometer to Position Visualizer

What?
Using the Rift accelerometer, which was just exposed in the most recent SDK update, and a little math, you can integrate the acceleration data to approximate position.

Why?
The accelerometer can output data to Unity at 100 Hz, significantly reducing latency in other motion-based systems. For example, fusing high-speed relative accelerometer data with low-speed absolute Hydra data will give you the best of both systems.

How?
Just using the new accelerometer data and subtracting gravity. A few more details on that here.

Where?
Okay, I tried it. As soon as I picked up my Rift, I got an almost immediate "Elvis has left the building"... I was outside floating in the void with the building way off in the distance!

With a modicum of constraint, there is no reason to allow your position to move beyond a short "tether" to your current anchor point, like where you are standing or sitting in a chair, for example. No need for a trip to Mars just because I leaned forward a bit too fast, eh?
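Something like this Python sketch would do it (the half-meter radius is just my guess at a sane tether length):

```python
# Sketch of the "short tether" constraint: clamp the integrated position
# to within a fixed radius of an anchor point (where you stand or sit).

import math

def tether(position, anchor, max_radius=0.5):
    """Pull `position` back onto a sphere of `max_radius` around `anchor`."""
    offset = [p - a for p, a in zip(position, anchor)]
    dist = math.sqrt(sum(o * o for o in offset))
    if dist <= max_radius:
        return position
    scale = max_radius / dist
    return [a + o * scale for a, o in zip(anchor, offset)]
```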

But it is a nice start. Now, where do we go from here?
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by zalo »

geekmaster wrote:As soon as I picked up my Rift, I got an almost immediate "Elvis has left the building"... I was outside floating in the void with the building way off in the distance!
Heheh, your rift might have PTSD from all that shaking. I notice that after some intensive accelerations, the accelerometer reports faulty accelerations for a couple of minutes. Or perhaps that's gyro drift affecting the gyro-based gravity compensation I'm doing. I only set the drag on the object to be just enough to keep it from moving when it is sitting on a table, so you can get a feel for how responsive it is.
geekmaster wrote:Brutal coolness, eh?
Your Hydra stuff looks AWESOME. Sometimes the most audacious ideas have the biggest payoff. You say you want a full-time job in VR? Sixense would most definitely like to have you: http://sixense.com/company/jobs (check out the last two jobs on the list).

How do you expect the Rift-only tracking code to work with only the ambient magnetic fields? I feel like you'd suffer the same kinds of issues that a stereo camera tracker would, i.e. poor performance in magnetically uninteresting environments. If you could get it to work... skip Sixense and go straight for Oculus!

EDIT: After reading more carefully, I'm impressed with the chain of reasoning behind your algorithm. But I'm a little worried that emphasizing snap vectors will make it good primarily at predicting oscillatory motions, like the ones you were plotting. Can you plot a graph where the motion is more jerky and random?
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by geekmaster »

zalo wrote:... I'm a little worried that emphasizing snap vectors will make it good primarily at predicting oscillatory motions, like the ones you were plotting. Can you plot a graph where the motion is more jerky and random?
The snap vectors are heavily filtered, and reset every fourth (real) sample. They are just for extrapolating a new estimated position for each of the next three 4 ms steps until a new good (leading edge) value comes in (every 16 msec). My testing showed that snap vectors worked better than jerk vectors, jerk better than acceleration, etc. I also tried using 1x extrapolation with no filter, but the predicted points did not accelerate fast enough to evenly fill the gap between real data points.
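In outline, the schedule looks like this (a sketch, reusing the SnapPredictor sketch from my earlier post):

```python
# Sketch of the timing: one good (leading-phase) sample every 16 ms, with
# three predicted points filling the 4 ms steps in between, for an
# effective 250 Hz output stream.

def upsample_250hz(good_samples, predictor):
    for x in good_samples:            # arrives every 16 ms
        yield predictor.feed(x)       # the real sample
        for _ in range(3):            # three 4 ms extrapolation steps
            yield predictor.predict()
```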

Also, in real life, the nature of your muscles and limbs (and head mass) limits you to a summation of only a few (approximate) sine waves. The only way to add some randomness in there would be without your head mass (just the Rift), perhaps by tapping it against a table or something (not likely to happen in VR). Those upward bumps at the bottom of the "sine" waves in one plot were from me vigorously shaking my hydra while swinging my arm (an approximate sum of sine waves). In any case, during rapid sudden motions and direction changes, your perception will likely overlook minor deviations from expectation that only last a few milliseconds before getting reset to a "real" measured value. They just give a little extra smoothness (i.e. a 250 Hz framerate).

Again, the predictions are just to allow faster sampling than the "normal" 60-62 Hz rate (depending on where you read that spec; I should actually MEASURE it one of these days)... Nothing to worry about. Just one of my MANY experiments to satisfy my own curiosity, and I decided to publish it now that I've read your thread related to this arm of my R&D.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by geekmaster »

Hmm... I just experimented with plotting the output from Hydra HID packets again, but this time while moving slowly diagonally along two axes (back and up, then forward and down, then repeat), using the same code as above. Here is a mysterious result:
HydraRedux.png
All red and yellow dots plotted above are raw data for ONE axis in the Hydra HID packets. The blue dots are just some prediction points in my code, as mentioned in previous posts.

It appears that BOTH axes that I am moving along are being plotted, even though they are being read from the SAME axis in the HID packets. In this case, two out of every four packets appear to have swapped axes. This could certainly be part of the "noise" problem people complain about, which filtering hopefully mitigates.

Also, there are complaints of positioning errors (up to 20 cm), which could CERTAINLY be caused by this problem of sometimes reporting data from the WRONG AXIS!

I wonder if the Sixense drivers detect and correct for this problem, or if they only partially (and not completely successfully) filter out most of the obvious wrong-axis errors. I do not have source code for the Sixense drivers, so who knows for sure? Or does this problem go away if I send extra (unknown) configuration feature reports to the Hydra controller?

Clearly a firmware bug (IMHO), but unless we get new Hydra firmware, I need to continue my investigation and add code to detect and eliminate the errant phases (every fourth HID packet, or in this case two out of four packets) that contain WRONG data... Or is it three out of four packets that are wrong (in some cases, as shown in previous posts)? Whatever; it is more than a little annoying, but I have had to code software workarounds for unfixable firmware bugs many times in my career, so not much of a surprise here...
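A starting point for that detection could be as simple as de-interleaving one reported axis by packet phase and comparing the waveforms (a sketch, not my actual code):

```python
# Sketch: group samples of a single HID-reported axis by packet index
# modulo 4, then plot each phase separately to see which phases carry
# which physical axis.

from collections import defaultdict

def split_phases(axis_values, n_phases=4):
    phases = defaultdict(list)
    for i, value in enumerate(axis_values):
        phases[i % n_phases].append(value)
    return phases  # inspect phases[0]..phases[3] independently
```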

EDIT: When I move the controller diagonally along two axes, tangential to the base instead of toward it, those two waveform phases plotted above converge, so they are clearly behaving as belonging to two different axes even though the hydra reports them as being the same axis.
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by zalo »

Perhaps it outputs it like that to make the math easier for the software when accounting for rotation.

It's clear they didn't design the HID output for use by anyone but themselves.

Maybe you accidentally connected the two coils in the midst of hardware hacking (shorting two pins on the connector?) and it reports two different voltages as it goes coil-by-coil each cycle?
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Rift Accelerometer to Position Visualizer Demo

Post by geekmaster »

zalo wrote:Perhaps it outputs it like that to make the math easier for the software when accounting for rotation.

It's clear they didn't design the HID output for use by anyone but themselves.

Maybe you accidentally connected the two coils in the midst of hardware hacking (shorting two pins on the connector?) and it reports two different voltages as it goes coil-by-coil each cycle?
I did not open my Hydra. Only hacking HID packets.

After posting my previous message, I guessed that the four-phase variation is probably caused by sampling at four times the normal rate, while the field coils output four phases (each axis independently, then a quiescent phase for synchronization).

Perhaps there is a cycle-phase flag in the HID packets. I do not have official HID packet record layouts, so I had to see what changed in HID packet hex dumps while manipulating all the controls (i.e. reverse engineering). BTW, even though the HID config packet is small, you can grab a full 4096-byte Hydra controller memory snapshot with it (buffer overflow?) just by requesting a 4K report size. "Interesting" data patterns in there...
;)
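With the Python hidapi bindings, that memory-snapshot trick looks roughly like this (a sketch; the VID/PID pair and the report number are assumptions, and the behavior depends entirely on the Hydra firmware):

```python
# Sketch of the oversized feature-report request described above.

import hid

RAZER_HYDRA = (0x1532, 0x0300)   # assumed VID/PID; verify with hid.enumerate()

dev = hid.device()
dev.open(*RAZER_HYDRA)
try:
    # Report number 0 is an assumption; ask for far more than the nominal
    # report size and the firmware reportedly returns a 4096-byte snapshot.
    snapshot = dev.get_feature_report(0, 4096)
    print('got %d bytes' % len(snapshot))
finally:
    dev.close()
```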

EDIT: I do plan to mod one of my Hydras (which is why I have multiple of them). I plan to use larger base coils (perhaps 8-inch diameter) to see how far I can extend the range, and I may also add amplifier drivers to the base coils for more current, to see how far that extends the range. I will experiment with longer controller cables, and also with the mod PalmerTech did of separating out the controller coils for minimal weight.

Another plan is to multiplex (with analog switches) a bunch of controller coil sets to measure the position and orientation of many points (perhaps each body joint) at the expense of time-sliced access. This is why I was interested in the faster 250 Hz sample rate I am using. I would like to try modulating RF with the receiver coils too, to see how simply a wireless conversion can be done. Only the coils need to be wireless for position tracking, right? The RF receiver would feed decoded analog to the connections where the coils used to be in the controller hand units.

The only thing holding me back is time. I need apprentices to help me realize my ideas...
davyfirst
One Eyed Hopeful
Posts: 6
Joined: Tue Jun 17, 2014 1:37 am

Re: Rift Accelerometer to Position Visualizer Demo

Post by davyfirst »

Thanks zalo, I'm learning this and will try it tonight. Best wishes :woot
