Positional Tracker Requirements?

ElectroPulse
Cross Eyed!
Posts: 121
Joined: Mon Oct 08, 2012 3:50 pm
Location: East Coast, U.S.A.

Positional Tracker Requirements?

Post by ElectroPulse »

Hello, all!

So I was just looking over the documentation that comes with the OR SDK, and I see this:
"2.2 Tracker Specifications
-Up to 1000Hz sampling rate
-Three-axis gyroscope, which senses angular velocity
-Three-axis magnetometer, which senses magnetic fields
-Three-axis accelerometer, which senses accelerations, including gravitational"

So this got me thinking... Why can't it do positional tracking? Based on what I've heard about the Razer Hydra, it seems to use magnetic fields to tell where it is located in relation to the base station. Couldn't someone construct a base station that puts out a magnetic field, and use that to do motion tracking?

Even if that didn't work, what about the accelerometer? Based on what I had heard of them in the past, I had thought that accelerometers would allow for motion tracking, rather than just rotation.

I'm guessing I'm completely off with this, so could someone explain why it isn't possible with the current tracker?

Thanks!
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: Positional Tracker Requirements?

Post by zalo »

Magnetometers aren't reliable enough to do what the Hydra does. The Hydra-motes have 3 coils (one for each axis) coupled to 3 transmitting coils in the base station. Both sets of coils are MUCH larger than whatever the magnetometer uses. Plus, magnetometers are prone to being affected by environmental magnetic interference (Earth's magnetic field, speakers, etc.). Plus plus, the Hydra outputs an oscillating magnetic field, and I don't think the sample rate on the Rift's magnetometer can handle that.

Accelerometers can do a reasonable amount of positional tracking for a very short time, but the noise in the sensor accumulates over time. It's not a good feeling having your head gradually displaced 10 feet to the right of the character while playing for a couple minutes!
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

The problem is that we do not measure position directly (the way it can be obtained from GPS). Instead, we calculate position by adding the current velocity vector to it on each iteration, making the position move according to its velocity. Any error in the velocity estimate causes the position to drift away from where we expect it to be, with larger velocity errors causing faster positional drift. Even the tiniest velocity error causes the position to drift over time, losing your frame of reference between virtual objects and physical props. Even if we stop moving in the real world, our velocity error makes us appear to be moving in the virtual world.

The second problem is that we do not measure velocity directly either (which can indirectly be obtained from differences in GPS coordinates). Instead, we calculate velocity by adding the current acceleration vector to it on each iteration, making the velocity change according to its acceleration. Any error in the acceleration measurement causes the velocity to drift away from how fast we expect to be moving, with larger acceleration errors causing larger velocity errors. Even if we stop accelerating in the real world, our acceleration error makes us appear to be speeding up or slowing down in the virtual world.

Because computed velocity is the integral of acceleration, and computed position is the integral of velocity, positional accuracy depends on a double integration of the measured acceleration. Double integration is unstable, and bound to drift with even the absolutely tiniest amount of error in the physical measurement.
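
To put a number on it (a quick illustrative sketch, not part of the original post): integrating even a tiny constant acceleration error twice makes the position error grow with the square of time.

Code: Select all

#include <cstdio>

// Dead-reckoning drift sketch: a constant 0.01 m/s^2 acceleration error,
// integrated twice at 1000 Hz, gives roughly 18 m of position error after
// one minute, even though the velocity error is still under 1 m/s.
int main() {
    const double dt = 0.001;         // 1000 Hz sample interval
    const double accelError = 0.01;  // tiny constant bias (m/s^2)
    double velocity = 0.0;
    double position = 0.0;
    for (int i = 0; i < 60 * 1000; ++i) {
        velocity += accelError * dt;  // first integration: error grows linearly
        position += velocity * dt;    // second integration: error grows quadratically
    }
    std::printf("velocity error %.2f m/s, position error %.1f m\n", velocity, position);
    return 0;
}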

Although acceleration-based position measurement can still be useful over short periods of time, and it can be "normalized" to a constrained workspace (such as sitting or standing), it gets a little complicated trying to determine when the body is at rest so you can zero the computed velocity to remove that major source of error. In fact, such continuous recalibration during periodic rest positions (for example, measuring foot acceleration and determining when the foot is resting on the floor via human locomotion gait analysis so you can zero the velocity) has been shown to cut positional drift down to a fraction of one percent, but only in specially constrained motion analysis. Any error in determining when the acceleration sensor is at rest will cause significant drift.
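
A minimal sketch of the rest-detection idea (the thresholds and sample layout here are invented, not from any SDK): treat the sensor as stationary when the accelerometer magnitude sits near 1 g and the gyro is quiet, and zero the integrated velocity at that moment (in practice only after the condition holds over a short window of samples, since any single quiet sample can be a coincidence).

Code: Select all

#include <cmath>

// Hypothetical raw sample; field names and units are assumptions.
struct ImuSample {
    double ax, ay, az;   // accelerometer, in g
    double gx, gy, gz;   // gyro, in rad/s
};

// Zero-velocity update (ZUPT) check: if the measured acceleration magnitude
// stays close to 1 g and the gyro rates are near zero, assume the sensor is
// at rest and reset the integrated velocity to discard accumulated drift.
bool probablyAtRest(const ImuSample& s, double accelTol = 0.05, double gyroTol = 0.02) {
    double accelMag = std::sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az);
    double gyroMag  = std::sqrt(s.gx * s.gx + s.gy * s.gy + s.gz * s.gz);
    return std::fabs(accelMag - 1.0) < accelTol && gyroMag < gyroTol;
}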

That is why direct position measurement using magnetic or optical tracking is desirable, to eliminate positional drift altogether. However, it may still be useful to incorporate acceleration-based position tracking into the fusion algorithm along with real positional tracking, to help increase its accuracy (especially during optical occlusions, magnetic signal dropouts, field anomalies, or when at the limits of accurate range).

HTH. :)

EDIT: I have been playing a bit with doubly-integrated head position based on the acceleration vector obtained from my Rift, but I am having a bit of trouble at the moment accurately removing the gravity vector from the independent acceleration sensor axes. The slightest calculated error on each independent acceleration axis is amplified by the velocity computation, making it difficult to determine when each axis is "at rest" so you can periodically zero out the velocity for that axis to remove drift. This is very difficult because while the Rift is worn on the face it is not level, and the slightest changes from level cause rapid drift in velocity, which causes HUGE drift in position. A foot-mounted acceleration sensor is FAR EASIER to track than a head that keeps tilting in continuous but subtle ways. My goal, if (or when) I can achieve such acceleration-based constrained position tracking, is to recreate the Johnny Lee "wiimote head tracker" video inside my Rift, using only its built-in head tracker.
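
One common way to attack the gravity removal (a sketch with plain made-up types, not the actual Rift SDK API) is to rotate the body-frame accelerometer vector into the world frame using the orientation quaternion the tracker already provides, and subtract gravity there, where it is a constant. Even then, the slightest orientation error leaks a slice of gravity back into the "linear" acceleration, which is exactly the drift source described above.

Code: Select all

#include <cmath>

// Plain stand-in types for illustration; the real SDK has its own math classes.
struct Quat { double w, x, y, z; };
struct Vec3 { double x, y, z; };

// Rotate a body-frame vector into the world frame: v' = q * v * q^-1,
// assuming q is the unit orientation quaternion reported by the head tracker.
Vec3 rotate(const Quat& q, const Vec3& v) {
    const double s = q.w;
    const Vec3 u{q.x, q.y, q.z};
    const double uv = u.x * v.x + u.y * v.y + u.z * v.z;
    const double uu = u.x * u.x + u.y * u.y + u.z * u.z;
    const Vec3 c{u.y * v.z - u.z * v.y,   // u cross v
                 u.z * v.x - u.x * v.z,
                 u.x * v.y - u.y * v.x};
    return Vec3{2.0 * uv * u.x + (s * s - uu) * v.x + 2.0 * s * c.x,
                2.0 * uv * u.y + (s * s - uu) * v.y + 2.0 * s * c.y,
                2.0 * uv * u.z + (s * s - uu) * v.z + 2.0 * s * c.z};
}

// Remove gravity in the world frame. Assumes Y is up, units are m/s^2, and a
// resting accelerometer reads about +9.81 in world Y; check the SDK's own
// axis and sign conventions before trusting this.
Vec3 linearAcceleration(const Quat& orientation, const Vec3& bodyAccel) {
    Vec3 world = rotate(orientation, bodyAccel);
    return Vec3{world.x, world.y - 9.81, world.z};
}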

[youtube]http://www.youtube.com/watch?v=Jd3-eiid-Uw[/youtube]

That video shows just how HUGE an addition to our VR immersion positional head tracking will be, once the Rift gets it.
nateight
Sharp Eyed Eagle!
Posts: 404
Joined: Wed Feb 27, 2013 10:33 pm
Location: Youngstown, OH

Re: Positional Tracker Requirements?

Post by nateight »

geekmaster wrote:My goal if (or when) I can achieve such acceleration-based constrained position tracking is to recreate the Johnny Lee "wiimote head tracker" video inside my Rift, using only its built in head tracker.
One wonders how the Xsens guys do it without an optical component. Is there a major RF element to their sensing rigs, or is there some secret sauce in their algorithms? :geek:

I'd naturally love to see some kind of positional tracking solution entirely built into the dev Rift v2 and the consumer model, but I'm still unsure if this is even possible. One thing to pin hopes on is an Oculus/Sixense business alliance; a base station, two wands, and a third coil sensor built into the HMD would appear to kill a whole flock of birds with one stone. Such a solution could run into patent problems if Sixense wasn't involved, however.

Frankly, for seated tracking, I still don't see why we don't just implement Johnny Lee's exact solution, particularly while we're waiting for the "official" positional tracker that may not arrive for a year or more. The very first mod I'm doing to my Rift is attaching two IR LEDs and pointing a Wiimote at it. Is there a reason this hasn't already been widely adopted? Sure, it's a stopgap solution, but don't pretty much all the available solutions fit that description? :lol:
Shameless plug of the day - Read my witty comments on Reddit, in which I argue with the ignorant, over things that don't matter, for reasons I never fully understood!
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: Positional Tracker Requirements?

Post by zalo »

I'm pretty sure the Xsens guys just take the orientation data from IMUs mounted at every joint from your legs up to your head. Since the human skeleton can only be arranged in a limited number of ways, constraining each bone to the orientation of its sensor while keeping all the bones attached leads to only one possible head position. But that takes too many IMUs.

geekmaster, have you tried using the SDK's accelerometer-compensated orientation to subtract 0.00981 downward from the raw accelerometer data? It takes some trig, but it should be a pretty reliable way to compensate for gravity. Maybe in the next SDK Oculus can release a function that takes care of that. I've never coded in C++, but I imagine it might look a little like this:

Code: Select all

" X=" << ((acc.x / STD_GRAV)*cos(RadToDegree(pitch))*sin(RadToDegree(roll))) << 
" Y=" << ((acc.y / STD_GRAV)*sin(RadToDegree(pitch))*cos(RadToDegree(roll))) << 
" Z=" << ((acc.z / STD_GRAV)*cos(RadToDegree(pitch))*cos(RadToDegree(roll))) << endl;
Last edited by zalo on Sat Apr 13, 2013 6:41 pm, edited 3 times in total.
ElectroPulse
Cross Eyed!
Posts: 121
Joined: Mon Oct 08, 2012 3:50 pm
Location: East Coast, U.S.A.

Re: Positional Tracker Requirements?

Post by ElectroPulse »

Thanks for the replies!

Ok, so it sounds like the biggest issue is drift? Hmm... Would something like a Kinect work for real-time recalibration? I don't like the idea of using it for positional tracking due to the latency (I've only used one once for a short amount of time in a store, but it seemed nearly unplayable), but for the purpose of eliminating drift, would this work? Pair it with the accelerometer (for low latency), and that sounds like it could work, perhaps? (might also be nice for crouching and different body motions)

So, what is currently the cheapest solution for positional tracking? I'm wanting to order a Razer Hydra in the next day or two, and have been thinking about messing around with using that for positional tracking while using an Xbox controller for actual control (however, this would tether you to the base station... I think it'd be neat to set up some sort of self-contained vest with a backpack that'd hold the laptop and a battery for the Rift/Hydra (unless the Hydra is powered by USB), and mount the Hydra base station on the front of the vest for the hand tracking. Then have something else for positional tracking...).

I remember going on a field trip for a day to another university with the engineering department (I'm a freshman taking Computer Science, and the engineering department and the computer department are closely related here), and they had a VR demo there that used some sort of small grey box mounted above the user, and something mounted on the HMD (I'm pretty sure it was an eMagin Z800 3DVisor... Wasn't very impressed with it), which I believe used magnetic fields for the tracking. They used it so they could walk around 3D models of molecules to look at them from all angles... Was pretty neat, but as mentioned I wasn't impressed with the HMD.

@nateight: I saw a demo a little while back (here: https://www.youtube.com/watch?v=MSTge5IDxF4) of something similar to it, and had been under the impression that it just used accelerometers. On their website they say "Each YEI 3-Space Sensor uses triaxial gyroscope, accelerometer, and compass sensors in conjunction with advanced processing and on-board quaternion-based Kalman filtering algorithms to determine orientation relative to an absolute reference in real-time." It sounds like these are the same types of sensors on the Rift? I'm not sure how they figure out what the "orientation relative to an absolute reference" is, though (I'm guessing that basically means having no drift, correct?).
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Positional Tracker Requirements?

Post by MSat »

@geekmaster

Your best bet at removing the gravity vector from your position calculations is to compare it with gyro data. Say your accelerometer reports 1 G in the Z axis (vertical) and 0.2 G in the X axis (side to side) while your gyro doesn't report any rotation along the X axis (roll); you can then deduce that the 1 G in the Z axis is the gravity vector, and the 0.2 G in the X axis is a translational acceleration. Of course, you'll always have the issue of random noise, which will make it practically useless.

That said, I'm very much interested in an almost entirely IMU-based positional tracking approach that uses multiple identical sensors to create a virtual sensor, which I have discussed in another thread before. Considering that MEMS accelerometers now cost less than $2 per unit, an array of 8 chips would still be cheap enough for a consumer device. A weighted average of the values reduces the noise floor substantially compared to a single device.
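
A rough sketch of the virtual-sensor idea (the weights and layout are placeholders): averaging nominally identical channels knocks uncorrelated noise down roughly with the square root of the number of chips, so 8 devices buy close to a 3x lower noise floor.

Code: Select all

#include <cstddef>

// Weighted average across an array of nominally identical accelerometer
// channels; uncorrelated noise shrinks roughly as 1/sqrt(N), so an array of
// 8 cheap chips behaves like one chip with about a third of the noise.
double fuseAxis(const double* readings, const double* weights, std::size_t count) {
    double sum = 0.0, weightSum = 0.0;
    for (std::size_t i = 0; i < count; ++i) {
        sum       += weights[i] * readings[i];
        weightSum += weights[i];
    }
    return sum / weightSum;
}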

There are other interesting aspects to using multiple sensors. For instance, with accelerometers, resolution per G is dependent on the range setting of the device. So for a given bit depth, +/-2 G will yield double the resolution per G (neglecting noise) compared to a +/-4 G setting. It's a similar scenario with gyros, where you specify the angular rate range. What I'm wondering is whether there would be any benefit to having some of the devices in an array set to higher sensitivity with a reduced ceiling, while keeping others at a lower sensitivity in case the lower ceiling is exceeded during use.
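
To make the trade-off concrete (a back-of-the-envelope sketch assuming a 16-bit output word): halving the full-scale range doubles the counts available per G.

Code: Select all

#include <cstdio>

// Counts per g for a 16-bit accelerometer at different full-scale ranges:
// counts per g = 2^16 / (2 * range). +/-2 g gives 16384 counts/g, +/-4 g half that.
int main() {
    const double ranges[] = {2.0, 4.0, 8.0};
    for (double range : ranges) {
        std::printf("+/-%g g range: %.0f counts per g\n", range, 65536.0 / (2.0 * range));
    }
    return 0;
}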

Another potentially useful aspect is that some sensors (the IMU chip in the Rift's tracker, for example) can have their MEMS components locked to an external clock source. So besides being able to assume that all samples across an array are taken at the same time, you can also clock them separately to increase the sampling rate. Let's say you have two sensors operating at 1000 Hz each. If you drive them with two clocks operating 180 degrees out of phase with one another, you effectively double the sampling rate of the entire system to 2000 Hz, which should not only help reduce latency but also increase spatial resolution (which should be useful for accelerometers at the very least).
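
A toy sketch of the interleaving idea (assuming the two streams really are offset by exactly half a sample period): alternating their samples gives one stream at twice the rate.

Code: Select all

#include <cstddef>
#include <vector>

// Two sensors clocked 180 degrees out of phase: sensor B samples half a period
// after sensor A, so alternating their samples yields one 2000 Hz stream from
// two 1000 Hz devices.
std::vector<double> interleave(const std::vector<double>& a, const std::vector<double>& b) {
    std::vector<double> merged;
    merged.reserve(a.size() + b.size());
    for (std::size_t i = 0; i < a.size() && i < b.size(); ++i) {
        merged.push_back(a[i]);   // taken at t = i * T
        merged.push_back(b[i]);   // taken at t = i * T + T/2
    }
    return merged;
}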

I have some more thoughts on this, but I guess I'll save it for another time.

@op

The workings of a magnetic tracker using coils (like the Hydra) and one using IC-based magnetometers (don't know of any commercial product using this) would be quite different from one another. While an IC-based system may be trickier to implement, I think it has benefits in terms of accuracy, and might even be inherently immune to positional ambiguity (haven't thought this one through completely). That said, you wouldn't simply be able to tie in a magnetometer to a Hydra base station, as the requirements would be quite a bit different from one another.

At any rate, I'm not sure what the ideal positional tracking solution would be. Maybe it's magnetic, optical, inertial, a combination of some of those, or something else.
MSat
Golden Eyed Wiseman! (or woman!)
Posts: 1329
Joined: Fri Jun 08, 2012 8:18 pm

Re: Positional Tracker Requirements?

Post by MSat »

ElectroPulse wrote:Thanks for the replies!

Ok, so it sounds like the biggest issue is drift? Hmm... Would something like a Kinect work for real-time recalibration? I don't like the idea of using it for positional tracking due to the latency (I've only used one once for a short amount of time in a store, but it seemed nearly unplayable), but for the purpose of eliminating drift, would this work? Pair it with the accelerometer (for low latency), and that sounds like it could work, perhaps? (might also be nice for crouching and different body motions)

I tend to side this way as well. I'm not sure a depth camera system like the Kinect is even necessary; you could probably get by with a relatively low frame rate camera (which tends to have the benefit of higher resolution). I think a two-point sensing solution - one sensor on the head and one on the upper chest/back - will go a long way toward limiting translational drift. Positional errors over time are also not a big deal as long as you implement a robust tracking model. The biggest errors would probably come from vertical drift (are you crouching or standing?), which is where a camera would come in handy.
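
Something along these lines (a one-axis sketch with invented names) is all the fusion has to be: the IMU supplies fast relative motion between camera frames, and each camera fix gently pulls the estimate back toward the absolute position, so drift stays bounded.

Code: Select all

// One-axis complementary blend of a fast, drifting IMU estimate with a slow,
// absolute camera fix. The IMU term gives low latency between frames; the
// camera term bleeds accumulated drift away whenever a new frame arrives.
struct BlendedPosition {
    double estimate = 0.0;
    double cameraGain = 0.05;   // how hard each camera fix pulls, 0..1

    void imuStep(double displacementSinceLastSample) {
        estimate += displacementSinceLastSample;                 // runs at IMU rate
    }
    void cameraFix(double absolutePosition) {
        estimate += cameraGain * (absolutePosition - estimate);  // runs at camera frame rate
    }
};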
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

zalo wrote:geekmaster, have you tried using the SDK's accelerometer-compensated orientation to subtract 0.00981 downward from the raw accelerometer data? It takes some trig, but it should be a pretty reliable way to compensate for gravity. Maybe in the next SDK Oculus can release a function that takes care of that. I've never coded in C++, but I imagine it might look a little like this:

Code: Select all

" X=" << ((acc.x / STD_GRAV)*cos(RadToDegree(pitch))*sin(RadToDegree(roll))) << 
" Y=" << ((acc.y / STD_GRAV)*sin(RadToDegree(pitch))*cos(RadToDegree(roll))) << 
" Z=" << ((acc.z / STD_GRAV)*cos(RadToDegree(pitch))*cos(RadToDegree(roll))) << endl;
Not yet. I have other things occupying my time right now. But thanks for the code so I do not have to figure it out myself. I will start with what you provided and go from there... I want to play with Johnny Lee's demo in my Rift (some day RSN)...
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

MSat wrote:@geekmaster

Your best bet at removing the gravity vector from your position calculations is to compare it with gyro data. Say your accelerometer reports 1 G in the Z axis (vertical) and 0.2 G in the X axis (side to side) while your gyro doesn't report any rotation along the X axis (roll); you can then deduce that the 1 G in the Z axis is the gravity vector, and the 0.2 G in the X axis is a translational acceleration. Of course, you'll always have the issue of random noise, which will make it practically useless.
Thanks, I will try that out too.

To use IMUs for free-walking position, you need an IMU mounted on your foot, so that it is completely stopped during a portion of every stride while walking. During that rest period, the calculated velocity is forced to zero so that there is no accumulated drift. The only positional drift is the small amount accumulated while the foot is in motion. Experimental studies showed that positional drift for constrained static-state gait analysis as described is only about 0.3 percent. To compensate for that slow position drift, you add GPS to the fusion algorithm to maintain an accurate average position. Another source of outdoor positioning information is triangulation of the relative signal strength of various WiFi SSIDs in the neighborhood, similar to what Google uses. With mobile WiFi, perhaps you could use the WiGLE wardriving database:
http://www.irongeek.com/i.php?page=secu ... ve-mapping

I want to try using a similar method without a foot IMU, by making simple assumptions about being seated in a chair or standing in one position, which limits how far you can move the Rift (not much beyond your hips, to maintain balance). And I plan to assume you spend most of your time in a fully upright centered position. That should allow us to look around objects naturally, and may help reduce motion sickness.
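
A crude sketch of that constraint (the numbers are guesses): clamp the integrated head offset to a small workspace around the assumed rest pose, and let it bleed slowly back toward center so drift cannot accumulate.

Code: Select all

#include <cmath>

// Constrain a dead-reckoned head offset for a seated/standing user: hard-limit
// the offset to a plausible lean radius and gently pull it back toward the
// assumed rest position so slow drift decays instead of accumulating.
struct HeadOffset { double x = 0.0, y = 0.0, z = 0.0; };

void constrainToWorkspace(HeadOffset& p, double maxRadius = 0.5,  // metres of lean allowed
                          double pullPerSecond = 0.1, double dt = 0.001) {
    double r = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
    if (r > maxRadius) {                        // you can't lean much past your hips
        double scale = maxRadius / r;
        p.x *= scale; p.y *= scale; p.z *= scale;
    }
    double decay = 1.0 - pullPerSecond * dt;    // soft recentering toward the rest pose
    p.x *= decay; p.y *= decay; p.z *= decay;
}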

And the Rift SDK does set the range to +/- 2G, which is clearly visible in the output data from my modified version of the "Minimal Oculus" demo from the OculusVR wiki. And the SDK headers set the default range to 4G, so that matches.

What is not clear is that the SDK headers say the acceleration data should be 1000 times larger than what is actually being returned, which is why I had to set the standard gravity constant so small to compensate. Perhaps that will be fixed in the next SDK version.
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: Positional Tracker Requirements?

Post by zalo »

geekmaster wrote:thanks for the code so I do not have to figure it out myself. I will start with what you provided and go from there...
Disclaimer: There's practically no chance the code I posted will work, or is even on the right track.
Your best bet is to look up sensor fusion tutorials, and just take the bit where they do gravity compensation.

...or bug Oculus.
LukePoga
Two Eyed Hopeful
Posts: 89
Joined: Mon Aug 20, 2012 3:49 am

Re: Positional Tracker Requirements?

Post by LukePoga »

You get a logarithmic benefit to accuracy as the number of sensors increases. Is adding 20 sensors and reducing the error by a factor of 5 enough to work? Just how long does it take to drift?

Has anyone actually enabled positioning on their Rift and measured it? If not, hurry up, for those of us who are still waiting for ours :/
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

If the head tracker position drift has an offset that varies over time or between units, the software could have a "trim" control to compensate for such drift, similar to trim controls on radio control aircraft transmitters.
LukePoga
Two Eyed Hopeful
Posts: 89
Joined: Mon Aug 20, 2012 3:49 am

Re: Positional Tracker Requirements?

Post by LukePoga »

Each sensor is a random walk around the true value, but any particular sample group will not average to exactly the true value, so drift is reduced but it still occurs.
doktor3d
One Eyed Hopeful
Posts: 4
Joined: Wed Feb 13, 2013 4:35 am

Re: Positional Tracker Requirements?

Post by doktor3d »

Hi all, I've also been thinking about how we might be able to use accelerometer data to obtain position data. As geekmaster suggests, for now it is reasonable to restrict the problem space to just the typical movements that players in the Rift will make. These are sideways and forwards leaning motions and vertical crouching motions. Also we can assume that players are more-or-less rooted to the spot: either sitting or standing.

The approach that I've been contemplating is developing a gesture recognition model, rather than integrating the IMU data to give the positional displacement. The first step would be to record people performing movements while wearing the Rift *and* also tracking their absolute position using a Razer Hydra or other low-latency positional tracking device. We log both the Rift's tracking data and the (ground truth) positional data to a file, then synchronize the data in the file to account for the differing latencies between the Rift and Hydra tracking.

So then we have ground-truth position and corresponding accelerometer data. Once we have that, I'm not exactly sure how to analyze these two data sets, but what we would want is to generate a series of gestures that we can associate with accelerometer data in real-time. In-engine, we would also probably want to make a kinematic model of the spine for more natural movements.

Any thoughts? :geek:
Mystify
Certif-Eyed!
Posts: 645
Joined: Fri Jan 11, 2013 5:10 pm

Re: Positional Tracker Requirements?

Post by Mystify »

Make a machine learning model, using the acceleration data within a timespan as the input and the ground truth as your gold value.

If you do assume a fixed central position, you could detect what "recentering" motions look like (I was leaning over and stood back up, or otherwise returned to my rest stance). Smooth the motion so that it ends in the rest position and it should correct drift that has accumulated; at the very least, this should constrain the drift to a reasonable distance.
One potential hitch is that the rest position is not constant. If someone is sitting up in their chair, and then ends up leaning back, their rest position has shifted.
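
A bare-bones sketch of that pipeline (everything here is a placeholder, including the choice of a plain linear map): keep a sliding window of recent accelerometer samples and run it through a model whose weights were fit offline against the Hydra ground truth.

Code: Select all

#include <cstddef>
#include <deque>
#include <vector>

// Sliding-window inference for a learned accel-window -> head-offset model.
// The weights would be fit offline against ground-truth positions (e.g. from
// a Hydra); a real system would likely use something stronger than a linear map.
struct WindowModel {
    std::vector<double> weights;    // one weight per sample in the window
    double bias = 0.0;
    std::size_t windowSize = 100;   // e.g. 100 ms of data at 1000 Hz
    std::deque<double> window;      // most recent samples for one axis

    double predictOffset(double newSample) {
        window.push_back(newSample);
        if (window.size() > windowSize) window.pop_front();
        double offset = bias;
        for (std::size_t i = 0; i < window.size() && i < weights.size(); ++i)
            offset += weights[i] * window[i];
        return offset;              // predicted displacement along this axis
    }
};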
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

LukePoga wrote:Each sensor is a random walk around the true value, but any particular sample group will not average to exactly the true value, so drift is reduced but it still occurs.
I typically use IIR filters instead of FIR, so there is no problem with "sample group" size. An IIR filter is simple (adding input and output feedback), similar to an analog R/C filter. Of course, IIR filters only asymptotically approach the true value, never quite exactly reaching it (which does not matter as long as it is closer than other methods). One cool use of IIR filtering is for framebuffer video feedback effects (like the classic "Dr.Who" opening sequence).
:D
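
For reference, the whole filter is only a few lines (a generic sketch, not the exact code used in the demo):

Code: Select all

// One-pole IIR (exponential) low-pass: each output mixes the new input with the
// previous output, so there is no sample buffer and no "sample group" size to
// choose; the output approaches the true value asymptotically.
struct IirLowpass {
    double state = 0.0;
    double alpha = 0.05;   // 0..1: smaller values smooth harder but respond slower
    double filter(double input) {
        state += alpha * (input - state);   // state = (1-alpha)*state + alpha*input
        return state;
    }
};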

For a quick little demo of a framebuffer IIR filter effect, check out my "bump" demo:
http://www.mtbs3d.com/phpBB/viewtopic.p ... 65#p104470
nateight had this to say about that video feedback demo:
At http://www.mtbs3d.com/phpBB/viewtopic.php?f=140&t=16565&start=40#p104484, nateight wrote:
geekmaster wrote:Let me know what you think...
I've never tried peyote but it somehow just kicked in, that's what I think...
:D

I have thought a bit about sensor fusion lately, and it would seem that the ultimate (so far) in low-latency high-accuracy filters is RLS (Recursive Least Squares), but the math behind that is mind-bendingly complex (for me, these days)... RLS is said to be vastly superior to Kalman filters for sensor fusion applications, and is so fast it is used for real-time audio dynamic echo-cancellation in telco applications. There are some nice GPL solutions that surpass Kalman as well, but I try to avoid getting a GPL infection in my code.
:mrgreen:
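
In its simplest (scalar) form, RLS is less scary than the literature makes it look; here is a minimal sketch of the textbook update, tracking a single coefficient w in the model d = w * x:

Code: Select all

// Scalar recursive least squares (RLS): tracks one coefficient w in the model
// d = w * x, keeping a running "covariance" P so early samples stop dominating,
// while the forgetting factor lets the estimate follow slow changes.
struct ScalarRls {
    double w = 0.0;          // current estimate
    double P = 1000.0;       // large initial uncertainty
    double lambda = 0.99;    // forgetting factor, slightly below 1

    void update(double x, double d) {
        double gain = (P * x) / (lambda + x * P * x);
        w += gain * (d - w * x);          // correct by the prediction error
        P = (P - gain * x * P) / lambda;  // update uncertainty
    }
};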

To learn more about RLS filters (the ultimate in low-latency adaptive sensor fusion), here is a good place to start:
http://www.ehu.es/ccwintco/uploads/5/50 ... 10_800.pdf
And this one shows some sample (FORTRAN) RLS source code:
http://iaac.technion.ac.il/workshops/20 ... ectKF3.pdf
And then go here for more in-depth information:
http://kdpu.edu.ua/download/library/Cou ... tering.pdf
KBK
Terrif-eying the Ladies!
Posts: 910
Joined: Tue Jan 29, 2013 2:05 am

Re: Positional Tracker Requirements?

Post by KBK »

Part of my job description (one might say), is signal quality and noise floor, as a combination. And, I have worked at it down to the molecular level, in the world of 'end use practical solutions'. I'm better at this than anyone or any company or group that I'm aware of. I've spent a lifetime concentrating in these areas. Optically, mechanically, and electrically.

I believe I can tame this problem, enough to get it to work. I'm as sure as I can be that I can stabilize this better and increase the quality of the signal and drop the noise floor. How well I can do this, remains to be seen. The improvement will be there, yes I can do that - but to what effect, is the deal.

No Rift yet, so I can't experiment.
Intelligence... is not inherent - it is a point in understanding. Q: When does a fire become self sustaining?
3dRat
Two Eyed Hopeful
Posts: 88
Joined: Sun Jan 13, 2013 4:26 pm

Re: Positional Tracker Requirements?

Post by 3dRat »

KBK wrote:Part of my job description (one might say), is signal quality and noise floor, as a combination. And, I have worked at it down to the molecular level, in the world of 'end use practical solutions'. I'm better at this than anyone or any company or group that I'm aware of. I've spent a lifetime concentrating in these areas. Optically, mechanically, and electrically.

I believe I can tame this problem, enough to get it to work. I'm as sure as I can be that I can stabilize this better and increase the quality of the signal and drop the noise floor. How well I can do this, remains to be seen. The improvement will be there, yes I can do that - but to what effect, is the deal.

No Rift yet, so I can't experiment.
If you get it before OVR does, then you can become the new hardware design manager!!! (and be a millionaire...)
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

@KBK: perhaps that could be worded in a better way...
Last edited by geekmaster on Tue May 14, 2013 2:07 pm, edited 1 time in total.
KBK
Terrif-eying the Ladies!
Posts: 910
Joined: Tue Jan 29, 2013 2:05 am

Re: Positional Tracker Requirements?

Post by KBK »

Would you like me to change the wording?

I'm sure there are some things you are very good at yourself, Geek. Perhaps the best you know of.

Is it ego to say so? I don't think so - it depends on the context. It's an age-old problem, where self-assurance in one's self comes off as ego to those whose viewpoint is without that self-assurance.

Full circle here, Geek. It's very simple. Don't 're-act'.
Intelligence... is not inherent - it is a point in understanding. Q: When does a fire become self sustaining?
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: Positional Tracker Requirements?

Post by geekmaster »

I reworded my post. I often go back and edit posts to remove stuff that might sound boastful (or potentially deprecating to others), despite how true it may be. But I try to avoid speaking in absolutes. You have been doing a MUCH better job of toning down your posts to sound more "realistic" recently. This one just seemed to contain a claim of "self-worth" that could be interpreted as a bit "overboard" for a public forum.

I am sure we have both gained our skills from devoting an insane amount of research and investigation to our pursuits of self-interest, combined with "above-normal" intellect. But we need to try to avoid boastful statements that DETRACT from what we are trying to say (in the eyes of the general public), despite how true those claims may be.

As far as being "the best", I doubt it. Things that I "invent" and that others find "brilliant", I later discover were previously used by others (documented using different terminology from another field), sometimes before I was even born. Most inventions are "discovered" because their time has come, and others are working on the same things. But "the best that I know of", yes, for a while, until I discover others who preceded me...
LukePoga
Two Eyed Hopeful
Posts: 89
Joined: Mon Aug 20, 2012 3:49 am

Re: Positional Tracker Requirements?

Post by LukePoga »

must be good being the greatest living mathematician in the world.
Mystify
Certif-Eyed!
Posts: 645
Joined: Fri Jan 11, 2013 5:10 pm

Re: Positional Tracker Requirements?

Post by Mystify »

geekmaster comes off as brilliant because he doesn't talk about how he is brilliant, he just demonstrates it with well-thought out posts, practical theories, demonstrations, and basically acting in a highly competent manner
KBK comes off as merely arrogant since he talks about how awesome he is at everything, but doesn't seem to actually produce much to support this while making assertions that everyone else is close minded and can't see why his unconventional thinking is so superior to what actually works.
If you talk about how awesome you are, everyone will just think you are a blowhard. If you do awesome things, people will realize you are awesome.
KBK
Terrif-eying the Ladies!
Posts: 910
Joined: Tue Jan 29, 2013 2:05 am

Re: Positional Tracker Requirements?

Post by KBK »

LukePoga wrote:must be good being the greatest living mathematician in the world.
I suck at math. Not my best subject. Not everyone knows everything. kinda autistic, one might say. Everybody's got something going. Cane one play baseball, or write papers on it?

What if I'm not interested in 'doing', in order to prove... whatever? All I said was something that I came to, as a point of realization, a while back (years). I never said it made me happy or petted my ego. It did neither. Nor does it mean that one gets to do funky things or make oodles of cash or whatever the hell. Nor do I have to prove fecal to anyone.

You are free to interpret and say as you will. As am I. Simple enough. Your reactions are yours, not mine. :)

I'd like to play with a Rift, and I don't have one yet. Nor am I interested in competing with Oculus or any HMD maker. I don't need the karma of that device hanging over me.
Intelligence... is not inherent - it is a point in understanding. Q: When does a fire become self sustaining?
Mystify
Certif-Eyed!
Posts: 645
Joined: Fri Jan 11, 2013 5:10 pm

Re: Positional Tracker Requirements?

Post by Mystify »

edit:thought better about what I said