IR Tracking Prototype (with proof/image!)

MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I have found the time to put together a prototype infra-red tracking application. The gist is:

I have used a structure from motion toolkit (VisualSFM) to calculate the position of 3 webcams.
I have then used the camera positions, rotations and focal lengths exported from this application to back-project a tracked IR Led (x,y) in each image to a ray in 3D space.

Where the rays generated from each camera intersect gives the application the position (x,y,z) of the IR LED.
I have removed the IR filters from the webcams (£4.00 from Argos) and put a visible light filter over them instead (photographic negative).
This allows me to track the IR LED more precisely. I am using a very simple threshold on the image pixels, and then finding the average x,y of the remaining pixels.
The results are good; I haven't checked, but I think I'm getting about 15-20 fps.
I had been warned that because common webcams do not time-stamp frames, point localisation would be erratic, but I don't see this. I didn't think it would be an issue for relatively slow-moving objects.
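The back-projection step can be sketched like this (a Python sketch rather than the project's C#; the pinhole model and the function/parameter names here are my assumptions, not the actual code):

```python
import numpy as np

def pixel_to_ray(cam_pos, cam_rot, focal_px, px, py, cx, cy):
    """Back-project a pixel (px, py) to a ray in world space.

    cam_pos:  3-vector, camera centre in world coordinates
    cam_rot:  3x3 camera-to-world rotation matrix
    focal_px: focal length in pixels
    (cx, cy): principal point (usually the image centre)
    Returns (ray origin, unit direction).
    """
    # Direction in camera coordinates under the pinhole model.
    d_cam = np.array([px - cx, py - cy, focal_px], dtype=float)
    # Rotate into world coordinates and normalise.
    d_world = cam_rot @ d_cam
    return cam_pos, d_world / np.linalg.norm(d_world)
```

The tracked (x,y) of the same LED in each camera then yields one such ray per camera.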

The rays intersect fairly precisely, so the calibration looks good.
The next step will be to find the closest point to all the rays, to get the actual x,y,z. I'm expecting this to be a relatively straightforward bit of linear algebra (I hope!).
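The closest point to a set of rays does indeed come down to a small bit of linear algebra: each ray contributes a projector onto the plane perpendicular to its direction, and solving the resulting 3×3 normal equations gives the point minimising the summed squared distances to all rays. A hedged Python sketch (not the project's code):

```python
import numpy as np

def closest_point_to_rays(origins, directions):
    """Least-squares point closest to a set of 3D rays.

    origins:    (N,3) array of ray origins (camera centres)
    directions: (N,3) array of ray directions
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        # Projector onto the plane perpendicular to this ray.
        M = np.eye(3) - np.outer(d, d)
        A += M
        b += M @ p
    return np.linalg.solve(A, b)
```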

The system could theoretically scale to tens of webcams, but I will have to sort out adding more USB buses to my PC.
The cameras would be placed further apart, but I still need to order some USB extension cables from the internet, the high street is way too expensive!

The code is in C#, using AForge for webcam access and XNA for visualisation.
The aim of all this is to get cheap, accurate and fast absolute position tracking out to the world (if indeed such a thing is currently possible), and also because I want to play with the Oculus in a real VR space.

I can supply further details and will be setting up a github project shortly, I'd like to put it on my CV :)

Bring on the Oculus!!! :) :) :)

There is an image along with this post, a quick explanation:
The three windows at the top are the webcam feeds. The little red boxes are the tracked locations of the IR Led I am holding in each image.
The bottom window is an XNA game window. The big guns (The only mesh I had to hand) near the camera are the webcams, the way they are facing reflects the way the webcams are facing.
The lines coming out of the guns are the rays projected from each camera centre through the tracked IR LED pixel coordinates. Where the lines intersect is the estimated location of the IR LED. The little guns in the background are the image points used in the camera calibration step from VisualSFM; they are mostly on the cushiony side of the sofa (more texture there). The really big gun is attached to me (actually the third webcam position is obscured by the really big gun, but ho hum).
User avatar
cybereality
3D Angel Eyes (Moderator)
Posts: 11407
Joined: Sat Apr 12, 2008 8:18 pm

Re: IR Tracking Prototype (with proof/image!)

Post by cybereality »

Very cool.
User avatar
Callezetter
Two Eyed Hopeful
Posts: 71
Joined: Thu Jul 05, 2012 3:09 pm
Location: Stockholm, Sweden

Re: IR Tracking Prototype (with proof/image!)

Post by Callezetter »

Great work, Memebox. Love it!
Krenzo
Binocular Vision CONFIRMED!
Posts: 265
Joined: Tue Sep 07, 2010 10:46 pm

Re: IR Tracking Prototype (with proof/image!)

Post by Krenzo »

Very nice
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: IR Tracking Prototype (with proof/image!)

Post by zalo »

Very cool! I suspect that this will be of some use to you: Distance from Point to Line in 3 Dimensions

From there you can iterate across different points on one of the lines to home in on where the other two get closest!
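The point-to-line distance from that link reduces to a single cross product (a Python sketch):

```python
import numpy as np

def point_line_distance(p, a, d):
    """Distance from point p to the 3D line through a with direction d.

    Uses ||(p - a) x d_hat||, the length of the rejection of (p - a)
    from the line direction.
    """
    d = d / np.linalg.norm(d)
    return np.linalg.norm(np.cross(p - a, d))
```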
User avatar
Chriky
Binocular Vision CONFIRMED!
Posts: 228
Joined: Fri Jan 27, 2012 11:24 am

Re: IR Tracking Prototype (with proof/image!)

Post by Chriky »

I don't think there's necessarily a unique solution for the closest point to 3 lines, but I would find the closest points between each pair of lines (there will be 6 of them) and average them. This is a constant-time calculation that doesn't need any iterations.
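That pairwise approach might look like this in Python (a sketch; the function names are mine, and the parallel-line case is left unhandled):

```python
import numpy as np

def closest_points_between_lines(p1, d1, p2, d2):
    """Closest points (one on each line) between two 3D lines."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b          # zero if the lines are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return p1 + t * d1, p2 + s * d2

def triangulate_pairwise(origins, directions):
    """Average the closest points over every pair of rays (6 points for 3 rays)."""
    pts = []
    n = len(origins)
    for i in range(n):
        for j in range(i + 1, n):
            pts.extend(closest_points_between_lines(origins[i], directions[i],
                                                    origins[j], directions[j]))
    return np.mean(pts, axis=0)
```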
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

Ok, I've been distracted a little bit recently. Been thinking about search engines.

I managed to get the closest point to all rays projected from the cameras.

I was having a few problems with calibration of the cameras over wide baselines.

I'm thinking of tackling this by forming the point correspondences between the images using the IR dot tracking.
This should produce very clean point matching between the images and allow high-angle, wide-baseline camera calibration, with the added bonus of allowing the camera calibration to be done without removing the IR filters and entirely within the application.

This would also allow me to do the whole thing with Wii remotes. Do you think Wii remotes would be a better option than webcams? I think the latency would be lower and there would be no wires.
But I'm not sure about the number of Wii remotes that can be connected at one time...

Any thoughts?
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

I personally love using Wiimotes. They simplify a lot of the details by offloading the basic blob/point detection, and the resolution and 100Hz rate are plenty for most applications. The only downsides are the 4 IR point limit and that the Wiimotes must be Bluetooth paired each session. I think the simultaneous Wiimote limit is 4, and I'm pretty sure I've used 3 at once before without problems.
pierreye
Sharp Eyed Eagle!
Posts: 377
Joined: Sat Apr 12, 2008 9:45 pm

Re: IR Tracking Prototype (with proof/image!)

Post by pierreye »

For webcams, I would prefer the PS3 Eye, which can capture at 120fps @ 320x240, which is considered pretty high resolution. To run it on a PC you need the CL-Eye driver (the free version seems to be limited to one camera). The Wiimote runs at a lower resolution, but with the advantage of built-in image processing.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

The issue with pairing all devices for each session is great to know, a real gotcha, thanks.

If you need to pair each remote each time, you would end up ruining the calibration every time you restarted your computer.

I'll stick with the web cameras then, with the intention of moving to ps3eyes once I have things working, at £20 a pop they are not cheap!

Calibrating the system will involve wandering around the target space with the IR LED for a while and then clicking a calibrate button. This would use the point correspondences from all of the different IR point locations across the images to generate the camera locations and rotations. Quite neat, I think. Setup could then just be a matter of placing the cameras around a space and starting to use the system, since the calibration can be done on the fly: after a sufficient number of point correspondences from the IR LED have been gathered across the images, they can be fired off to the calibration code. Possibly even periodically, with points gathered from a rolling time window, updating the calibration whenever more accurate estimates have been generated...

Looking ahead, I'm thinking that the system is not going to deal with rotational drift from the Oculus sensors in its current form. Absolute rotation readings from my tracking solution will not be easy to build in: I'd need multiple markers and would have to track those markers from one frame to the next, dealing with occlusions gracefully. Not an inviting prospect. I've got some experience with applicable AI techniques (neural networks/SVMs), but I suspect this would get hairy, quickly.

Any thoughts on that last point?
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

MemeBox wrote:The issue with pairing all devices for each session is great to know, a real gotcha, thanks.

If you need to pair each remote each time, you would end up ruining the calibration every time you restarted your computer.
Well you can reuse the calibration data between sessions as long as the Wiimotes don't move, but getting all the bluetooth pairings can be tedious. On the other hand, the alternative of dealing with 3 or 4 full video streams is going to be a resource hog - especially if you intend to scale up to several PSEyes. So it depends on the number of cameras you want to run. One or two cameras might be better with video streams, but with 3 or 4 I think there is an advantage to having the built-in processing of the Wiimotes (and of course being wireless is advantageous as well).

MemeBox wrote:Calibrating the system will involve wandering around the target space with the IR Led for a while and then clicking on a calibrate button. This would use the point correspondences from all of the different IR Point locations across the images to generate the camera locations and translations.
It's going to be difficult to calibrate scale that way without some known real-world measurements. The IR path you trace out could be 2 meters wide or 4 meters wide but you can't tell without some type of measurement data added. An easy solution is to create a calibration marker - like a specially crafted IR triangle with known edge lengths. Then you can just snap one shot of it and calibrate your orientations and distances in one fell swoop.


MemeBox wrote:Any thoughts on that last point?
Palmer has repeatedly mentioned that drift is not going to be a big problem for the Rift tracker.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

Well you can reuse the calibration data between sessions as long as the Wiimotes don't move, but getting all the bluetooth pairings can be tedious. On the other hand, the alternative of dealing with 3 or 4 full video streams is going to be a resource hog - especially if you intend to scale up to several PSEyes. So it depends on the number of cameras you want to run. One or two cameras might be better with video streams, but with 3 or 4 I think there is an advantage to having the built-in processing of the Wiimotes (and of course being wireless is advantageous as well).
This is kind of what I was saying: if you need to touch the Wii remotes to pair them (which I think you do), you will ruin the calibration. Small movements in the cameras produce a noticeable increase in error. Video streams are proving tricky to get running smoothly; I've got 4 running at VGA, but lag is becoming an issue and I'm down to 10 fps. Having done some timings, I believe I'm maxing out the USB controller bandwidth; I think adding extra ports on a PCI card should help. I've considered offloading the IR tracking to some other processing device for each camera, a Raspberry Pi for instance, but the cost is a little too much for me.
It's going to be difficult to calibrate scale that way without some known real-world measurements. The IR path you trace out could be 2 meters wide or 4 meters wide but you can't tell without some type of measurement data added. An easy solution is to create a calibration marker - like a specially crafted IR triangle with known edge lengths. Then you can just snap one shot of it and calibrate your orientations and distances in one fell swoop.
Using the technique you describe for camera calibration is likely to give poor results. With only a few points across the cameras, there will be ambiguity in the camera extrinsics and slight errors in the estimated IR locations. Taking many measurements and then optimising over those measurements will produce better results. I've seen camera calibration done with a checkerboard pattern, and that was effective. I am intending to use the height of the user for absolute scale calibration: the user could place the IR LED on the floor and then at the top of their head and enter their height, giving the required conversion from model units to real-world units.
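The height-based scale recovery is a one-line calculation (a hedged Python sketch; the function name is mine):

```python
import numpy as np

def metric_scale(model_floor_pt, model_head_pt, user_height_m):
    """Scale factor converting model units to metres, given the tracked
    LED position on the floor and at the top of the user's head."""
    span = np.linalg.norm(np.asarray(model_head_pt, float) -
                          np.asarray(model_floor_pt, float))
    return user_height_m / span
```

Multiplying any tracked model-space coordinate by the returned factor then yields real-world metres.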
Palmer has repeatedly mentioned that drift is not going to be a big problem for the Rift tracker.
I know, but I am a little sceptical. Does he mean not a big problem for the kinds of usage he envisions, or not a big problem for this kind of usage? In this scenario rotational drift could see you walking into a wall. I'm prepared to take the hit, but I don't think my gf would.
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

MemeBox wrote:This is kind of what I was saying, if you need to touch the wii remotes to pair them (which I think you do), you will ruin the calibration. Small movements in the cameras produce a noticeable increase in error. Video streams are proving tricky to get running smoothly, I've got 4 running at vga, but lag is becoming an issue. I'm down to 10fps.
If you are only concerned about camera disturbance, surely you can manage that mechanically. There are several commercial mounting brackets plus many DIY Wiimote mounting solutions that allow you to secure a Wiimote. I would hate to see you battle with bandwidth and latency just to avoid touching the Wiimote (you just need to press the 1 and 2 buttons simultaneously). I set up a dual-Wiimote whiteboard at my office and it maintains calibration like this.

MemeBox wrote:Using the technique you describe for camera calibration is likely to give poor results. With only a few points across the cameras, there will be ambiguity about the camera extrinsics and slight errors in the estimated IR locations. Taking many measurements and then optimising over those measurements will produce better results.
A large enough marker should solve that. But either way will work.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

It's a combination of factors really; I just don't fancy Wii remotes anymore. Having to go around and register 5, 6, 7 Wii remotes every time you fire up the system is a big turn-off. On the other hand, I think I can get this number of webcams running; we shall see if I am successful. Bear in mind that compression takes place on the webcam and most of the scene will be static, reducing bandwidth.

I'm really not a fan of the calibration jig idea. I just can't see how it's going to be accurate enough and provide enough points for unambiguous camera position estimation (please prove me wrong!). Also, anyone who wants to use the system is going to have to create their own calibration jig, a big turn-off for most, I would think.
bobv5
Certif-Eyed!
Posts: 529
Joined: Tue Jan 19, 2010 6:38 pm

Re: IR Tracking Prototype (with proof/image!)

Post by bobv5 »

You could combine something like the brackets brantlew showed with physically removing the 1+2 buttons from the controller with an extension. With any type of camera, they are going to move some amount anyway; even the walls of your house expand and contract with temperature, humidity etc. Enough to be a problem? That depends how accurate you want the tracking to be. Probably not enough to care about with such low-res cameras, but I can't give numbers as I don't have first-hand experience with it.

It would be possible for somebody to make a big lot of jigs and sell them, I would be willing to do it if nobody else wants to.

"I know, but I am a little sceptical, does he mean, not a big problem for the kinds of usages he envisions, or not a big problem for this kind of usage? In this scenario, rotational drift could see you walking into a wall. I'm prepared to take the hit, but I don't think my gf would."

He knows the sort of stuff we want to use it for, and was talking about Hawken, so I think it will be good. I will also be skeptical until it is strapped to my face. The real question is whether he meant dev kit or consumer.

PS, is it skeptical or sceptical? Google says both are right, but now I looked at it too much and both seem wrong :(
"If you have a diabolical mind, the first thing that probably came to mind is that it will make an excellent trap: how do you get off a functional omni-directional treadmill?"
User avatar
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: IR Tracking Prototype (with proof/image!)

Post by Fredz »

bobv5 wrote:PS, is it skeptical or sceptical? Google says both are right, but now I looked at it too much and both seem wrong :(
UK : sceptic (-al, -ism) ; US : skeptic (-al, -ism)

"The American spelling, akin to Greek, is the earliest known spelling in English.[187] It was preferred by Fowler, and is used by many Canadians, where it is the earlier form.[188] Sceptic also pre-dates the European settlement of the US, and it follows the French sceptique and Latin scepticus. In the mid-18th century, Dr Johnson's dictionary listed skeptic without comment or alternative, but this form has never been popular in the UK;[189] sceptic, an equal variant in the old Webster's Third (1961), has now become "chiefly British". Australians generally follow the British usage (with the notable exception of the Australian Skeptics). All of these versions are pronounced with a hard "c", though in French that letter is silent and the word is pronounced like septique."

Source : http://en.wikipedia.org/wiki/American_a ... ifferences
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

Hmm, yes, I suppose you could take out the buttons and wire them up to a central location. While you're at it, you could plumb the Wii devices into a power source.

But it is still making things more complicated to setup, I'd like to keep things as simple as possible.
Of course if webcameras prove too laggy, I may have no choice. I'm hoping the bill of materials required will be something like:

Webcam * 7 * £5 = £35
USB Leads * 7 * £3 = £21
USB PCI Card * 1 * £20 = £20
Total ≈ £76

You can get very accurate calibration using http://phototour.cs.washington.edu/bundler/, without a calibration jig. This is part of the workings of packages like PTAM; it produces good results without much faff.

I'm in the UK, so I guess sceptical is appropriate, although I feel that spelling is usually given too much consideration.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I've managed to get the auto calibration step working.
So now it is possible to simply set up the webcams in the room (with the IR filters in place) and calibrate them all by moving around with the IR Led.
All that is required is to start the cameras, wave the led around for a while and then click calibrate.

Worked nicely wherever I placed the 3 web cams last night.
You can then start the tracking and get the xyz coordinates of the Led, job done :)

I still have to do the part where we work out the scaling to apply to go from model coordinates to world coordinates, but I think the height of the user idea should work well.

I suppose the last stage will be fusing the Oculus sensor readings with the xyz from my tracker.

If I can also work out how to do multiple blob tracking (it doesn't need to track from frame to frame), you could do rotation estimation by having an asymmetrical distribution of LEDs on the Oculus and then fitting a model of their distribution to the points read from the tracking system. For instance, if the tracker sees 4 points with xyz coordinates and you have a store of the relative positions of the LEDs (formed in a modelling stage), then you could do ICP (iterative closest point) of the model against the visible points at each frame. It should be fast enough, and I think it would give a reliable reading of the head orientation. While you're at it, you could do the same for a handheld model gun: within each frame of data, cluster the points into two groups (maybe with k-means), perform ICP on the two clouds with the two different models (one for the gun, one for the Oculus), and then use the best fit of the Oculus model for the head position and orientation, and the best fit of the gun model for the gun. I think it might work well.
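Once correspondences between model LEDs and observed points are fixed, the fitting step inside each ICP iteration is the classic Kabsch/SVD rigid alignment. A hedged Python sketch of that step (not the project's code):

```python
import numpy as np

def fit_rigid(model_pts, observed_pts):
    """Best-fit rotation R and translation t mapping model_pts onto
    observed_pts (Kabsch/SVD), assuming corresponding row order."""
    P = np.asarray(model_pts, float)
    Q = np.asarray(observed_pts, float)
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (P - pc).T @ (Q - qc)
    U, _, Vt = np.linalg.svd(H)
    # Correct the sign so we get a proper rotation, not a reflection.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = qc - R @ pc
    return R, t
```

A full ICP loop would alternate this fit with re-matching each observed point to its nearest model point.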

Things are looking rosy. I have some time off from the day job coming up, god I'm such a nerd...
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

Great progress! What's the load on the CPU for dealing with 3 cameras?
User avatar
Namielus
Certif-Eyable!
Posts: 957
Joined: Thu Aug 02, 2012 8:49 am
Location: Norway
Contact:

Re: IR Tracking Prototype (with proof/image!)

Post by Namielus »

I am curious how many identical webcams you can run on one computer.
With them all having the same driver etc., I've read that you can't run too many webcams at the same time.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

With 3 webcams, CPU load is at ~13% (CPU = i3 @ 3.1 GHz, with 8 GB RAM).
The CPU load doesn't change much from when the app isn't running.
Two of the webcams are running at 14-15 fps, occasionally dropping to 13 fps.
That's a 76 ms response time; a little laggy, but this is probably the best I can do until I am prepared to shell out for better hardware and see whether higher-fps video streams can be processed effectively and higher-fps webcams can be connected en masse to the USB ports.

The third webcam is having a little trouble: it's running at 8-9 fps.
I suspect this is because it was very cheap, or because this model has trouble running alongside others, or because it happens to be the last to connect and gets too small a slice of the USB bandwidth.
The two webcams which are running smoothly are of the same type.
I hope adding another USB controller and buying more of the webcams which work (£5 from Argos) will resolve this and allow me to add more.
Time will tell.

@Namielus I had trouble connecting two of the first £3 webcams to certain combinations of ports on the computer, including getting a blue screen of death (this is the third webcam connected in the current setup). I suspect that in some cases you will get problems and in others not; it's probably going to be a try-it-and-see thing. In building my setup I can let others know which ones are workable, if they feel like using this or their own system.

Two pics attached. The first is the current calibration app, very prototype at the mo: the 3 top text boxes are the x,y tracking from each camera, and the 3 labels below that are the frame rates in frames per second. There are 3 images right next to each other at the bottom, with a red square around the tracked IR LED. The rest is mostly debug buttons; the number in the big box is the number of points captured so far for the calibration routine. The second image shows a visualisation of the points captured (black dots) and the camera positions (coloured dots, two for each camera, showing the direction in which they point). You can see the path that I moved the IR LED along in the tracks of black dots :)
druidsbane
Binocular Vision CONFIRMED!
Posts: 237
Joined: Thu Jun 07, 2012 8:40 am
Location: New York
Contact:

Re: IR Tracking Prototype (with proof/image!)

Post by druidsbane »

Looks great :) Quick question: so basically you're autocalibrating from a single LED and your plan is to get the scale based on user height not based on some absolute measurement like using more than one LED stuck together at a small fixed distance from each other?
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

That is correct: I am automatically calibrating the cameras from a series of LED measurements, and I'm planning on using the user's height to get the scale to convert distances in the model to distances in the real world.

Whether this will work well enough is anyone's guess!
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

Does anyone have any experience of tracking light points? I am currently just taking a threshold over the image, clustering the points based on proximity, and then simply taking the centre of the largest cluster to be the tracked point location. I am thinking about doing some further processing over these clusters to try to discount clusters which might just be bright points in the room (I am trying to get things working in a bright-ish room). I am considering taking the cluster centres and feeding an image patch around each point to a neural network, or SVM. Any thoughts (should I do a DoG-type approach before feeding to the NN/SVM)?
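The current approach as described (threshold, proximity clustering, centroid of the largest cluster) might be sketched like this in Python; the greedy clustering here is a crude stand-in for proper connected-component labelling:

```python
import numpy as np

def track_led(gray, threshold=200, max_gap=3):
    """Threshold an image and return the (x, y) centroid of the largest
    bright cluster, or None if no pixel passes the threshold.

    gray: 2D array of pixel intensities. Bright pixels are greedily
    grouped with a cluster whose last member is within max_gap pixels.
    """
    ys, xs = np.nonzero(gray >= threshold)
    pts = np.column_stack([xs, ys]).astype(float)
    if len(pts) == 0:
        return None
    clusters = []
    for p in pts:
        for c in clusters:
            if np.linalg.norm(c[-1] - p) <= max_gap:
                c.append(p)
                break
        else:
            clusters.append([p])
    largest = max(clusters, key=len)
    return np.mean(largest, axis=0)   # (x, y) centroid
```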

Cheers...
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

I think a neural net is overkill and unnecessarily slow.

Could you just take a calibration image of the room before the player and use this as a sort of "subtraction mask" to be applied before blob detection? Also I would consider down-sampling the image to a lower res with a simple averaging filter (maybe 8x8) to subtract out a lot of the noise before attempting blob detection.
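The two suggestions combined might look like this (a hedged Python sketch; the function name and block size are assumptions):

```python
import numpy as np

def preprocess(frame, background, block=8):
    """Subtract a static background frame, then downsample by block
    averaging to suppress noise before blob detection."""
    # Clip so regions darker than the background don't wrap around.
    diff = np.clip(frame.astype(int) - background.astype(int), 0, 255)
    h, w = diff.shape
    h, w = h - h % block, w - w % block      # crop to a multiple of block
    small = (diff[:h, :w]
             .reshape(h // block, block, w // block, block)
             .mean(axis=(1, 3)))
    return small
```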
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

Thanks, I had thought that a NN risked being a little slow. I've tried the background subtraction, but if large bright patches move around it is possible to pick these up rather than the blob. It's fine at dusk or at night, or with thick curtains, but in bright daylight it fails completely. Perhaps this is just the way it is...
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

Perhaps a contrast filter might help. It's pretty easy to generate a map of high contrast changes in the image. That might help filter out some of the natural lighting effects which tend to change more gradually. So you would be looking for areas of high intensity plus high radial contrast. Of course reflections are still going to be problematic. Glass = false positives.
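A crude radial-contrast map along these lines could be (a hedged Python sketch; a real implementation would vectorise the loops):

```python
import numpy as np

def radial_contrast(gray, r=4):
    """Centre intensity minus the mean of a square ring r pixels out.

    High values mark small bright blobs; broad bright areas (like a
    sunlit wall) score near zero because the ring is bright too.
    """
    g = gray.astype(float)
    out = np.zeros_like(g)
    for y in range(r, g.shape[0] - r):
        for x in range(r, g.shape[1] - r):
            ring = np.concatenate([g[y - r, x - r:x + r + 1],
                                   g[y + r, x - r:x + r + 1],
                                   g[y - r:y + r + 1, x - r],
                                   g[y - r:y + r + 1, x + r]])
            out[y, x] = g[y, x] - ring.mean()
    return out
```

Combining this with the intensity threshold would keep only pixels that are both bright and locally peaked.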
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I have uploaded a video to YouTube of the demo app I have built. It takes you from webcam calibration by moving the LED around to writing "hello" in mid air (twice).

[youtube-hd]http://www.youtube.com/watch?v=IBRGOyzjMfA[/youtube-hd]

I am really, really sorry about the music; I like a good ol' jig.

I look to be getting accuracy in the range of millimetres, perhaps 2-3 mm.

I'm quite happy with it. I'm getting a solid 60fps from my ps3 eye and 30fps from each of two other webcams. I have 3 more PS3 eyes arriving soon :)

Tracking is smooth and accurate in artificial light (daylight makes it throw a total wobbly).
User avatar
FingerFlinger
Sharp Eyed Eagle!
Posts: 429
Joined: Tue Feb 21, 2012 11:57 pm
Location: Irvine, CA

Re: IR Tracking Prototype (with proof/image!)

Post by FingerFlinger »

Great job, MemeBox
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

Cool. Good work.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

Things are going great. I think I'm getting millimeter accuracy, in the sweet spot, over maybe 3 meters cubed.

I'm trying to do rotation readings as well as position, using multiple LED tracking.

I am able to track multiple LEDs. I have put 4 LEDs onto a plastic helmet and am tracking each one.

My plan is to get the readings of the LED placements and then fit a model of the LED layout to the readings, yielding a 3-axis reading for the head orientation. I have some code I wrote to do point cloud matching; it should do the job, but I am worried that the symmetry of the circular helmet will throw it off.
Does anyone have any thoughts on this, or knowledge of prior work?

I will also be needing to integrate this with the tracker on the oculus, does anyone have any thoughts on the sensor fusion techniques which will be required?

Thanks for any help...
User avatar
cadcoke5
Binocular Vision CONFIRMED!
Posts: 210
Joined: Mon May 24, 2010 8:43 pm
Location: near Lancaster, PA USA

Re: IR Tracking Prototype (with proof/image!)

Post by cadcoke5 »

If two of the LEDs are mounted much closer together than the others, then that becomes a way to orient the direction the helmet is pointing. Since, theoretically, the helmet might be upside down and give the same orientation from the visual processing, you just have to assume in code that the person is not upside down.

Another way is for one of the LEDs to be a distinct color from the rest.

Joe Dunfee
User avatar
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: IR Tracking Prototype (with proof/image!)

Post by Fredz »

cadcoke5 wrote:Another way is for one of LED's to be a distinct color from the rest.
That's going to be hard to do with IR, I guess; isn't it basically B&W for the camera?

Another option could be to place the LEDs inside balls of varying diameters, like what is done on the PS Move. You can differentiate between the different sizes to know which ball the camera is looking at, and the diameter also gives distance information because of the varying apparent size. That's the technique used by this project: http://moveonpc.blogspot.fr/
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I've been sidetracked by the idea of using the scattered IR light from a Kinect to roughly map out my lounge...
Since I have multiple point tracking, I should be able to spray the IR points around my lounge to build up a 3D map...

This way I can get a feel for whether I have the accuracy necessary to render views for the Oculus. Then when the oculus comes I could potentially walk around my virtual lounge, with added virtual elements.

I'm just refactoring the code to make this possible. This should get my code ready for the rotational head tracking.

Thanks for the input on the IR LEDs. I think I'm going to try putting two close together on one side and just one on the other side. If this doesn't work well, I will go for the different-sized blobs; that sounds like it might work nicely. Thanks guys :)
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I'm thinking about releasing the source onto GitHub, following the maxim: release early and release often.

It currently only works with PS3 Eyes, although I will re-enable support for other webcams shortly. The code is also a mess.

If people are interested I can upload it in the next few days, or do you think I should wait until I get to the plug and play stage?
User avatar
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: IR Tracking Prototype (with proof/image!)

Post by brantlew »

I don't think you need to make it plug and play, but I would do a clean-up (or at least a documentation) pass before you release it or you might be involved with a lot of Q/A.
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

You can now find the source code for the tracker at
https://github.com/MarcusRobbins/Free3DTrack

It's not pretty, but it does the job.
Feel very free to tinker and amend as you will.

Works with PS3 Eyes only at the moment...
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I have a new video:

[youtube-hd]http://www.youtube.com/watch?v=3IGdW2Vx4VI[/youtube-hd]

I have roughly mapped out my lounge using a laser pointer (the bright dot can be triangulated with the cameras) and then set the render camera's location to the tracked IR location. There is a model imported into the space, to give us something interesting to look at :) This, with some tweaking, would be the viewpoint rendered for the Oculus.

The frame rate in the video is poor; this is due to the video encoding. Capture is at 60 fps and, honestly, looks as smooth as silk, except when the batteries fall out of my IR jig (near the end of the vid), lol...
MemeBox
Two Eyed Hopeful
Posts: 55
Joined: Sat Sep 01, 2012 9:23 am

Re: IR Tracking Prototype (with proof/image!)

Post by MemeBox »

I've just thought of an excellent way to identify individual LEDs in each camera :)

If you vary the intensity of each of the LEDs in a different way, with an Arduino or something, you can then match the Fourier transform of each tracked light source to the known Fourier transform of each drive signal :)

You can then track multiple objects and know which is which, i.e.:

This 3D point is the location of the front of the head, this is the location of the back of the head, this is the front of the gun, this is the back of the gun.

Absolute rotation of the head then becomes easy!!!

Soz, quite excited. Am going to get myself an Arduino and an LED kit...
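The frequency-matching idea could be sketched like this (hedged Python; the frame rate, frequency assignments and function name are my assumptions):

```python
import numpy as np

def identify_led(intensity_trace, known_freqs, fps=60):
    """Match a blob's intensity-over-time trace to the LED whose known
    modulation frequency is nearest the trace's dominant FFT bin.

    intensity_trace: per-frame brightness samples for one tracked blob
    known_freqs:     modulation frequency assigned to each LED (Hz)
    Returns the index of the best-matching LED.
    """
    trace = np.asarray(intensity_trace, float)
    # Remove the DC offset so the constant brightness doesn't dominate.
    spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    dominant = freqs[np.argmax(spectrum)]
    return int(np.argmin([abs(dominant - f) for f in known_freqs]))
```

Each modulation frequency would need to stay well below half the camera frame rate to be measurable at all.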
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: IR Tracking Prototype (with proof/image!)

Post by zalo »

That's a great idea! I just hope you can reliably measure LED intensity over time, since it looks like Chriky and some others a couple of threads over are having trouble measuring something as "simple" and static as colour.

Speaking of color, I don't suppose there are any filters out there that can alter the hue of an IR LED (or even cameras that can detect a difference in IR hue).

As always, I'm rooting for you and for cheap absolute optical position tracking. Good luck.