[DIY] homemade NVIDIA 3dVision interface code
-
- One Eyed Hopeful
- Posts: 18
- Joined: Wed Apr 21, 2010 8:31 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Could you please post the compiled tool?
-
- One Eyed Hopeful
- Posts: 5
- Joined: Sat Apr 10, 2010 1:58 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Hi, I'm new to this forum, but I thought I might share this.
I have hacked together a small Linux library for the NVIDIA 3d stereo controller, using the libusb library to access the device. As libusb is also available for Windows, I assume it might work as well for OpenGL under Windows (if it compiles). You will need the nvstusb.sys from the Windows installation (currently only 190.62 is supported, I'll fix that). If you want to give it a try, you can find it on my blog: http://codesklave.de/2010/04/26#nvstusb. Use at your own risk!
I might also have an explanation for the timer values that are sent when setting up the device, as well as any time the eyes are toggled. Looking at the data, it seems the upper word is always 0xFFFF, and the lower word is the cycle count of a 23.592 MHz or 23.592960 MHz clock. I came to that conclusion after noticing that at a 120Hz refresh rate the four timer values from the setup code and the eye toggle command always seemed to add up to around 0x30000. It might actually be 0x2FFF8, because that would mean a 23.592 MHz clock, for which there are crystals that are cheaper than the 23.592960 MHz ones. I don't know yet which timer value does what, but I suspect they correlate to the durations of on/off states.
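A quick sanity check of that theory. This is just a sketch of the arithmetic: the 23.592960 MHz clock is the guess from this post, not a confirmed spec, and `cycles_per_frame` is a name I made up for illustration.

```c
#include <stdint.h>

/* Guessed emitter clock from the post (23.592960 MHz); not a confirmed spec. */
#define NVSTUSB_CLOCK_HZ 23592960u

/* Emitter clock cycles in one display frame at a given refresh rate. */
static uint32_t cycles_per_frame(uint32_t refresh_hz) {
    return NVSTUSB_CLOCK_HZ / refresh_hz;
}

/* cycles_per_frame(120) == 196608 == 0x30000, which is exactly the sum
   observed above -- 23592960 divides evenly by 120. */
```

The fact that 23592960 / 120 comes out to exactly 0x30000 (no remainder) is a point in favour of the 23.592960 MHz reading over plain 23.592 MHz.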
I also had a look at the firmware and it seems to me like there is an 80c51/80c32 microcontroller in the device. I'll have a closer look at that to confirm my findings.
All in all a very hackable device, in my opinion.
-
- One Eyed Hopeful
- Posts: 6
- Joined: Tue Mar 02, 2010 4:09 pm
- Location: Northern California, USA
- Contact:
Re: [DIY] homemade NVIDIA 3dVision interface code
mickeyjaw wrote:
bwheaton: Hey if you can get this ported to linux that would be amazing, as there are projects ongoing for 3d picture and movie players on Linux e.g http://www.mtbs3d.com/phpBB/viewtopic.php?f=21&t=5723&. At the moment we are only supporting Interlaced and anaglyph output. Do you have any idea how you intend to implement this i.e will you make it a kernel module and device node, or do you intend to do everything from userspace using libusb or similar?
Thanks for the links - I had just found some of them a minute before. Great work going on!
I was thinking of something like libUSB, since installs should be easier, and hopefully I can use the same code for Mac too.
Let me dig around a bit, and I'll report back. It'd be helpful if anyone using the Windows version who gets a better idea of what the parameters do would let us know. I'm also interested in whether the emitter will run its own clock, instead of getting explicit swap commands.
Bruce
-
- One Eyed Hopeful
- Posts: 6
- Joined: Tue Mar 02, 2010 4:09 pm
- Location: Northern California, USA
- Contact:
Re: [DIY] homemade NVIDIA 3dVision interface code
Oh, nice job kolrabi. I was just starting to hack around in libusb. I guess, without the firmware, I wouldn't have got too far anyway. Is it one of those devices that needs to get its whole firmware every boot?
Can I double-check some things with you? When I queried the device, I came up with no endpoints. Just a mistake here?
Then there's a lot of stuff about rate, and yet switching seems to be manual. What do you think about that?
Also, when you look at timing, some of what you're seeing that's odd may be related to dark periods - blanking when neither eye is on. It's also possible that they have built in a delay from when you trigger the swap to when it starts - there's definitely a delay somewhere to correct for all the different sorts of monitors, and it may be in the box, the driver, or back in the OpenGL driver. Not sure.
Anyway, would you mind sharing any snooped information also? I ran into a block trying to convert the windows Activator version when it starts using an array of ints instead of bytes, and working out which bytes to swap.
Bruce
-
- One Eyed Hopeful
- Posts: 12
- Joined: Fri Apr 02, 2010 3:30 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Compiled version of the activator:
http://www.mediafire.com/?jqw3ncxktmo
-
- One Eyed Hopeful
- Posts: 5
- Joined: Sat Apr 10, 2010 1:58 am
Re: [DIY] homemade NVIDIA 3dVision interface code
bwheaton wrote:
Oh, nice job kolrabi. I was just starting to hack around in libusb. I guess, without the firmware, I wouldn't have got too far anyway. Is it one of those devices that needs to get its whole firmware every boot?
Yes. The Windows driver uploads it at boot time if the emitter is connected. If it's connected later, the firmware is uploaded when 3d mode is entered.
bwheaton wrote:
Can I double-check some things with you? When I queried the device, I came up with no endpoints. Just a mistake here?
The endpoints become available as soon as the firmware is loaded.
bwheaton wrote:
Then there's a lot of stuff about rate, and yet switching seems to be manual. What do you think about that?
I think this is to prevent the device and the monitor from getting out of sync due to clock drift. Clock rates may change over time.
bwheaton wrote:
Also, when you look at timing, some of what you're seeing that's odd may be related to dark periods - blanking when neither eye is on. It's also possible that they have built in a delay from when you trigger the swap to when it starts - there's definitely a delay somewhere to correct for all the different sorts of monitors, and it may be in the box, the driver, or back in the OpenGL driver. Not sure.
When switching eyes, something that looks like a timer value is transmitted in addition to the selection of the active eye. I didn't implement it in my code because I'm not yet sure how to calculate that value correctly. I suspect it's related to the time the device should wait before changing the eye.
bwheaton wrote:
Anyway, would you mind sharing any snooped information also? I ran into a block trying to convert the windows Activator version when it starts using an array of ints instead of bytes, and working out which bytes to swap.
I used SnoopyPro and you should be able to use it to open this file: http://codesklave.de/files/usblog.zip
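If the value sent with the eye toggle follows the pattern described earlier in the thread (upper word always 0xFFFF, lower word a cycle count), serializing it might look roughly like this. This is a hypothetical sketch: the little-endian byte order and the idea that the field goes on the wire as four raw bytes are my assumptions, not something confirmed from the sniffed traffic, and `pack_timer_field` is an invented name.

```c
#include <stdint.h>

/* Hypothetical: build the 32-bit timer field with the upper word fixed
   at 0xFFFF and the cycle count in the lower word, serialized
   little-endian (the usual byte order for USB payloads). */
static void pack_timer_field(uint16_t cycles, uint8_t out[4]) {
    uint32_t v = 0xFFFF0000u | cycles;
    out[0] = (uint8_t)(v & 0xFF);
    out[1] = (uint8_t)((v >> 8) & 0xFF);
    out[2] = (uint8_t)((v >> 16) & 0xFF);
    out[3] = (uint8_t)((v >> 24) & 0xFF);
}
```

Comparing the bytes this produces against the SnoopyPro dump would be one way to confirm or reject the byte-order guess.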
-
- One Eyed Hopeful
- Posts: 6
- Joined: Tue Mar 02, 2010 4:09 pm
- Location: Northern California, USA
- Contact:
Re: [DIY] homemade NVIDIA 3dVision interface code
kolrabi wrote:
When switching eyes something that looks like a timer value is transmitted in addition to the selection of the active eye. I didn't implement it in my code because I'm not yet sure how to calculate that value correctly. I suspect it's related to the time that the device should wait before changing the eye.
That makes sense. They need a specific time to trigger the swap, and I suspect the best time would actually be at the vertical interval, but to trigger the next swap, if that makes sense. So if they're about to issue the GL commands to draw left, they would trigger left, even though it's going to be displayed at the next swap. Then they would/do also build in some sort of delay for the latency of display devices.
I suppose, if you assume there's always some latency, you could trigger at the VBI and the glasses would swap 'in time' for the actual video to start arriving. Hmm. My point really is that finding accurate times in between vertical intervals is tricky. I think in your version, you tell the glasses to swap when you issue the swap command. You probably know that it won't actually swap right then, you're just queuing the command, right? That's especially true when v-sync is enabled, which it pretty much has to be for 3D.
In GLUT (which is the best way to code this example, for sure - it's actually cross-platform), I'm not sure you really get any kind of accurate, frame-referenced timer - just a render timer.
Thanks for the snoop, I'll check it out.
Bruce
-
- One Eyed Hopeful
- Posts: 6
- Joined: Tue Mar 02, 2010 4:09 pm
- Location: Northern California, USA
- Contact:
Re: [DIY] homemade NVIDIA 3dVision interface code
Oh, and Kolrabi - did you try putting a Vesa sync signal into the emitter? That's the way it's meant to work with Quadro.
Bruce
-
- One Eyed Hopeful
- Posts: 18
- Joined: Wed Apr 21, 2010 8:31 am
Re: [DIY] homemade NVIDIA 3dVision interface code
asdfqwer wrote:
Compiled version of the activator:
http://www.mediafire.com/?jqw3ncxktmo
Can't run it =/
-
- One Eyed Hopeful
- Posts: 5
- Joined: Sat Apr 10, 2010 1:58 am
Re: [DIY] homemade NVIDIA 3dVision interface code
bwheaton wrote:
That makes sense. They need a specific time to trigger the swap, and I suspect the best time would actually be at the vertical interval, but to trigger the next swap, if that makes sense. So if they're about to issue the GL commands to draw left, they would trigger left, even though it's going to be displayed at the next swap. Then they would/do also build in some sort of delay for the latency of display devices.
That's easy for NVIDIA to do, as they also developed the video driver. They know exactly what to do, when, and how.
bwheaton wrote:
I suppose, if you assume there's always some latency, you could trigger at the VBI and the glasses would swap 'in time' for the actual video to start arriving. Hmm. My point really is that finding accurate times in between vertical intervals is tricky. I think in your version, you tell the glasses to swap when you issue the swap command. You probably know that it won't actually swap right then, you're just queuing the command, right? That's especially true when v-sync is enabled, which it pretty much has to be for 3D.
Yes, you're right, I didn't think of that. Unfortunately there is no OpenGL command to explicitly wait for the v-sync. But there should be a way to do it. Maybe one could use glFinish() after issuing the swap to wait for it to happen, and then either swap the eyes (assuming the swap itself is quick enough and the monitor is still in vertical blank at that time) or take some time measurements that could be used to swap the eyes in time. Hmm... tricky...
I also have the problem on my machine that the swap doesn't happen exactly at the time of the vertical blank. At the top of the screen I get some flickering where a seemingly random number of lines are swapped too late. But I think that's a configuration issue. (I'd also like to get my 2233rz into the same mode it is in when I use 3d on Windows; so far no luck.)
bwheaton wrote:
In glut (which is the best way to code this example, for sure - it's actually cross-platform), I'm not sure you really get any kind of accurate, frame referenced timer - just a render timer.
Time measurements would ideally be in units of raster position. But that would require access to the video adapter. On Linux one could read the CLOCK_MONOTONIC clock for a high resolution timer, or QueryPerformanceCounter on Windows. Or write a real driver that can use timer interrupts.
bwheaton wrote:
Oh, and Kolrabi - did you try putting a Vesa sync signal into the emitter? That's the way it's meant to work with Quadro.
No, I don't have a Quadro. I also don't know the pinout or voltage levels to create my own signal.
- Fredz
- Petrif-Eyed
- Posts: 2255
- Joined: Sat Jan 09, 2010 2:06 pm
- Location: Perpignan, France
- Contact:
Re: [DIY] homemade NVIDIA 3dVision interface code
kolrabi wrote:
Time measurements would ideally be in units of raster position. But that would require access to the video adapter. On Linux one could read the CLOCK_MONOTONIC clock for a high resolution timer, or QueryPerformanceCounter on Windows. Or write a real driver that can use timer interrupts.
On Linux there is a new function in the kernel since version 2.6.32 or 2.6.33 called the KMS page flipping ioctl. As the name suggests, it should be able to give a user interrupt when a page flip is occurring. I don't know if it's supported by graphics card drivers for now, though.
-
- One Eyed Hopeful
- Posts: 1
- Joined: Fri Mar 26, 2010 8:55 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Nice find kolrabi !
kolrabi wrote:
I also had a look at the firmware and it seems to me like there is an 80c51/80c32 microcontroller in the device. I'll have a closer look at that to confirm my findings.
Indeed, it seems to me that the uC is an EZ-USB FX2. Check out the datasheet here:
http://www.keil.com/dd/docs/datashts/cy ... x2_trm.pdf
I believe the controller is based on this uC because:
- it supports USB firmware uploading directly to RAM (without any eeprom), and it is 8051 based.
- has 8KB RAM for both code and data.
- clock is 24MHz.
- reset and booting sequences match exactly those Kolrabi found.
I will later check out the firmware in more detail. Maybe we can figure out the exact clock info sent along with the flipping command, and even write our own firmware.
-
- One Eyed Hopeful
- Posts: 6
- Joined: Tue Mar 02, 2010 4:09 pm
- Location: Northern California, USA
- Contact:
Re: [DIY] homemade NVIDIA 3dVision interface code
kolrabi wrote:
Yes you're right, I didn't think of that. Unfortunately there is no OpenGL command to explicitly wait for the v-sync. But there should be a way to do it. Maybe one could use glFinish() after issuing the swap to wait for it to happen and then either swap the eyes (assuming the swap itself is quick enough and the monitor is still in vertical blank at that time) or take some time measurements that could be used to swap the eyes in time. Hmm... tricky...
I use this:
Code: Select all
// Since the retrace count is apparently unreliable, get the current count first
glXGetVideoSyncSGI (&retraceCount);
// This is the actual sleep/wait until the next retrace
glXWaitVideoSyncSGI (2, (retraceCount + 1) % 2, &retraceCount);
kolrabi wrote:
No, I don't have a Quadro. I also don't know the pinout or voltage levels to create my own signal.
VESA sync is a super simple signal - a +5V square wave at eye frequency.
Bruce
-
- One Eyed Hopeful
- Posts: 5
- Joined: Tue Dec 15, 2009 10:18 pm
Re: [DIY] homemade NVIDIA 3dVision interface code
Last night, drawing inspiration from Kolrabi's code, and some quality time with a USB sniffer, I got our 3d vision glasses working on a Mac.
Rock solid 120Hz on a recent Viewsonic projector (PDJ6251), very little ghosting, no glitches, HDMI. This Viewsonic has two 120Hz 3d modes — the contrast-ratio-trashing "DLP-link" mode and an almost transparent "3d vision" mode. But the projector itself has not yet been "whitelisted" by Nvidia's drivers which, for me, was just the last straw.
I'll try to write this up properly soon, but some key facts for likeminded hackers:
1) I know hardly anything about the current state of OpenGL on Linux, but I suspect that the route being pursued here in this forum to sync the buffer swap to the glasses might be a bit of a dead end. I don't think there's any contemporary hardware on which you can say anything about how long it's going to take between a GLUT or a window system swap call and when your back buffer actually ends up as photons.
2) On the Mac the solution is straightforward — we can busy loop on CGDisplayBeamPosition to pick the spot to swap (note: CGDisplayWaitForBeamPositionOutsideLines seems to be a complete decoy, it returns instantly). The brute force approach of course takes most of a core, but it does give you a completely independent program that you can run in the background, sending sync out while you start and stop various OpenGL programs (you have a 50% chance of having the eyes flipped at the start). Of course, being a little smarter about sleeping at the right time will reduce the CPU usage of this approach.
3) Like Kolrabi's code I'm currently using '0' as the offset in the eye switching packet since I can't compute any better value and I don't seem to need to. Staring closely at the USB dump it appears that both the left and right eye signals are being sent with no delay in between them, and then, around 16ms later the same thing happens. Here the offset is clearly required. In my code, I'm sending left and right with an 8.333ms delay between each.
4) It appears that all of the recent "real" graphics cards on Macs support genuine quad-buffered stereo that would appear to work with this technique without inserting additional FBOs into your code's swap routines. All in all, it seems like OS X + 3d vision is an ideal home-brew stereo hacking platform.
5) But I was unable to use Kolrabi's firmware extractor to extract the firmware from nvstusb.sys — either the most recent version or 190.62. It doesn't locate the right spot in the .sys file to make the extraction. Right now I'm booting into Boot Camp once (this gets the "green light" on the unit) and then rebooting into OS X. As long as the device isn't unplugged it stays usable. Of course, this means that you need to be able to install 3d vision on a Windows partition in the first place, so short of illegally distributing the firmware file, ATI users remain out of luck. If it wasn't for that, I imagine it would work just fine on those cards as well.
6) And I needed a different set of magic timing numbers from Kolrabi (different from both sets that they included). The original set inside that code causes both shutters of my glasses to turn almost completely opaque (I can see a little color from the color wheel of my projector phasing in and out). The "correct" set is easily sniffed from the USB exchange that happens when you enter the "medical test image" from the 3d vision control panel. But I'm no closer to understanding the timing formulae — I have it working at 120Hz (actually 119.96Hz) and that's that.
7) Multi-screen stereo is a no-go since the clocks out of both ports of the cards we've tested have an arbitrary phase offset. If you are feeling up to it, you can check and uncheck "mirror displays" until you happen to get them close enough (you can call CGDisplayBeamPosition to find out). That can actually be automated, but seems like a ridiculous solution.
We're going to continue testing the setup here — including purchasing a new transmitter, just to make sure that there aren't multiple versions of this hardware out there (ours is from December).
In any case, it's working and working well — comparable, albeit on a far smaller scale, to our experiences working with high end digital cinema projectors and XPand x101's. It has far better contrast ratio than the XPanD x102's with consumer DLP-Link on the projectors I've tested.
It's probably time to set up a group & repository somewhere to formalize this project, no?
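The 8.333 ms figure in point 3 above is just one frame period at 120 Hz (each eye gets one frame, so a full left+right cycle is ~16.7 ms, matching the ~16 ms spacing seen in the USB dump). A sketch of that arithmetic; the refresh rate is the only input and `per_eye_interval_us` is an illustrative name:

```c
/* Per-eye shutter interval in microseconds for an alternating-eye
   display: one displayed frame per eye, so simply 1 s / refresh rate. */
static long per_eye_interval_us(long refresh_hz) {
    return 1000000L / refresh_hz;
}

/* per_eye_interval_us(120) == 8333, i.e. ~8.333 ms between the left and
   right toggles; the full L+R cycle is twice that, ~16.7 ms. */
```

At the actual measured 119.96 Hz the interval stretches to about 8336 us, which is why free-running on a fixed 8.333 ms sleep would slowly drift against the display.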
-
- One Eyed Hopeful
- Posts: 5
- Joined: Sat Apr 10, 2010 1:58 am
Re: [DIY] homemade NVIDIA 3dVision interface code
andrei123 wrote:
Indeed, it seems to me that the uC is EZ-USB FX2. Check out the datasheet here:
http://www.keil.com/dd/docs/datashts/cy ... x2_trm.pdf
I believe the controller is based on this uC because:
- it supports USB firmware uploading directly to RAM (without any eeprom), and it is 8051 based.
- has 8KB RAM for both code and data.
- clock is 24MHz.
- reset and booting sequences match exactly those Kolrabi found.
I will later check out the firmware in more detail. Maybe we can figure out the exact clock info sent along with the flipping command, and even write our own firmware.
Actually it seems the uC runs at 48 MHz from a 24 MHz crystal. But I also think it's an EZ-USB FX2(LP?). In fact most of the routines from the development kit framework (http://www.cypress.com/?rID=14321) can be found in the firmware, as well as some of the library routines from the Small Device C Compiler (http://sdcc.sourceforge.net/).
I like the idea of writing our own firmware, it might be made to be compatible with other brands of infrared shutter glasses.
bwheaton wrote:
VESA sync is a super simple signal - +5V square wave at eye frequency.
The connector looks like a 2.5mm stereo jack. Which contact carries the signal, and which one carries ground? I wouldn't want to damage it by applying the wrong polarity.
mmmmmmmm wrote:
Last night, drawing inspiration from Kolrabi's code, and some quality time with a USB sniffer, I got our 3d vision glasses working on a Mac.
Very nice!
mmmmmmmm wrote:
5) But I was unable to use Kolrabi's firmware extractor to extract the firmware from nvstusb.sys — either the most recent version or 190.62. It doesn't locate the right spot in the .sys file to make the extraction. Right now I'm booting into bootcamp once — this gets the "green light" on the unit and then rebooting into OS X. As long as the device isn't unplugged it stays useable. Of course, this means that you need to be able to install 3d vision on a windows partition in the first place, so short of illegally distributing the firmware file, ATI users remain out of luck. If it wasn't for that, I imagine it would work just fine on those cards as well.
Hmm.. what version is your .sys? I could have a look at it and see why it doesn't work.
mmmmmmmm wrote:
6) And I needed a different set of magic timing numbers from Kolrabi (different from both sets that they included). The original set inside that code cause both shutters of my glasses to turn almost completely opaque (I can see a little color from the color wheel of my projector phasing in and out). The "correct" set is easily sniffed from the USB exchange that happens when you enter the "medical test image" from the 3d vision control panel. But I'm no closer to understanding the timing formulae — I have it working at 120Hz (actually 119.96Hz) and that's that.
Those numbers are still a great mystery. But I found out that two of the bytes sent during initialization of 3d mode are more or less directly loaded into the reload registers of timer 2 of the uC. But I need to read more of the uC manual to really understand those.
mmmmmmmm wrote:
It's probably time to set up a group & repository somewhere to formalize this project, no?
Good idea.. I created a project on sourceforge and moved my files there: https://sourceforge.net/projects/libnvstusb/
I can give you write access to the repository if you want.
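If those bytes really do land in the timer 2 reload registers (RCAP2H/RCAP2L on an 8051 core), the relationship between reload value and period would be the standard one: the counter runs up from the reload value to 0xFFFF and overflows. A sketch under that assumption only — which register the bytes map to, and the timer's input divisor (on an FX2, timer 2 can tick at CLKOUT/4 or CLKOUT/12), are still unconfirmed:

```c
#include <stdint.h>

/* 8051-style timer 2 auto-reload: the 16-bit counter runs up from the
   reload value to 0xFFFF, so a period of N input ticks needs a reload
   of 0x10000 - N. Caller must keep ticks in 1..0x10000. */
static uint16_t reload_for_ticks(uint32_t ticks) {
    return (uint16_t)(0x10000u - ticks);
}

static uint32_t ticks_for_reload(uint16_t reload) {
    return 0x10000u - reload;
}
```

Plugging the two sniffed initialization bytes into `ticks_for_reload`, and multiplying by the guessed input-clock period, would be one way to check whether they correspond to a plausible shutter interval.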
-
- One Eyed Hopeful
- Posts: 47
- Joined: Wed May 02, 2007 12:17 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Hey Kolrabi and CMS,
This is some fantastic work you guys are both doing. I'm going to set up my development environment some time soon, but I wonder how complicated it would be to get your library working under Windows, Kolrabi. Assuming I have libusb installed, any chance of me getting it to run? I noticed on Sunday you committed some changes that said you were removing win32 support as USB was different on Windows.
"- removed code for win32 from nvstusb.c, usb support on windows is more complicated"
Thanks for any updates you can give! You guys have done some great work!!
-
- One Eyed Hopeful
- Posts: 5
- Joined: Sat Apr 10, 2010 1:58 am
Re: [DIY] homemade NVIDIA 3dVision interface code
peter64 wrote:
Hey Kolrabi and CMS,
This is some fantastic work you guys are both doing. I'm going to setup my development environment some time soon, but i wonder how complicated it would be to get your library Kolrabi working under windows. Assuming I have libusb installed any chance of me getting it to run? I noticed on Sunday you commited some changes that said you were removing win32 support as USB was different on windows.
"- removed code for win32 from nvstusb.c, usb support on windows is more complicated"
Indeed. It's not as easy as I thought. Just using the Windows version of libusb didn't work. That version is based on an older release of libusb which has a different API and, if I understand the documentation correctly, requires the installation of a special driver. I think it's possible to get it running using cms's code somehow, but I'm not yet sure how to integrate Windows support in a clean manner (without using #ifdefs every other line). It's something I plan to do, but first I'd like to get it working right. At the moment it is very experimental at best, and everyone who wants to try it might need to hack the timings they need directly into the source code to make it work.
peter64 wrote:
Thanks for any updates you can give!
I think I made some progress analyzing the firmware and added my findings as comments in the source code.
peter64 wrote:
You guys have done some great work!!
Thanks!
-
- One Eyed Hopeful
- Posts: 3
- Joined: Wed May 19, 2010 1:38 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Guys, I can't get the activator to run in Win7. I'm getting "The application has failed to start because its side-by-side configuration is incorrect."
I've got a Samsung HL67A750 67" LED DLP which has only 1 3D enabled HDMI port so I need to either force the emitter on or lock it on once activated. I want to play Monsters Vs. Aliens with my new Panasonic 3D Blu-Ray player, but need the emitter on... Please help!!!
-
- One Eyed Hopeful
- Posts: 47
- Joined: Wed May 02, 2007 12:17 am
Re: [DIY] homemade NVIDIA 3dVision interface code
flyguyjake:
I got the same error as you. Upon investigating, it said you might need to install the VC2008 runtimes, but that didn't fix the issue. I think someone may have linked debug DLLs into the pre-compiled version? If you get it figured out, do report back. Thanks.
-
- One Eyed Hopeful
- Posts: 47
- Joined: Wed May 02, 2007 12:17 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Hey, I rebuilt it in release mode so now it runs in Vista. I can see the glasses flickering! Good stuff!
-
- One Eyed Hopeful
- Posts: 18
- Joined: Wed Apr 21, 2010 8:31 am
Re: [DIY] homemade NVIDIA 3dVision interface code
peter64 wrote: Hey I rebuilt it in release mode so now it runs in vista. I see the glasses flickering! good stuff!
Thank you. Good job!
-
- One Eyed Hopeful
- Posts: 3
- Joined: Wed May 19, 2010 1:38 am
Re: [DIY] homemade NVIDIA 3dVision interface code
peter64 wrote: Hey I rebuilt it in release mode so now it runs in vista. I see the glasses flickering! good stuff!
Hi, thanks for that! I got it to run. The only problem is that when it turns on, the glasses aren't flickering properly or in sync. I'm using a Samsung DLP TV with a Panasonic 3D Blu-ray player, trying to watch Monsters vs. Aliens. I've got 120 and 60 in the .ini for the Hz. Also, I don't think the F1/F2/F3 keys are working.
Help!
Thanks!
- Likay
- Petrif-Eyed
- Posts: 2913
- Joined: Sat Apr 07, 2007 4:34 pm
- Location: Sweden
Re: [DIY] homemade NVIDIA 3dVision interface code
Just a noob suggestion, but have you tried turning on v-sync?
-
- One Eyed Hopeful
- Posts: 3
- Joined: Wed May 19, 2010 1:38 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Likay wrote: Just a noobsuggestion but have you tried turning on v-sync?
Tried that and it's still not syncing up. Interestingly enough, with my HTPC plugged into HDMI port 2 on my Samsung LED DLP and my Panasonic 3D Blu-ray player plugged into HDMI port 3, I was able to activate the 3D Vision emitter!
I'm assuming this works because of activator.exe, but I'm not 100% sure; I don't think it worked before. Here's what I did to get things working.
1) Enable 3D Vision in the control panel.
2) Start the 3D functionality test. You will get a "VESA unplugged" error message across your screen.
3) Switch the DLP TV to HDMI 3 and turn 3D mode on. Turn on the 3D Vision glasses; the 3D Vision emitter should turn bright green and the glasses should flicker.
Monsters vs. Aliens on 3D Blu-ray looks amazing! Images really pop out and the depth is terrific.
- Petr
- One Eyed Hopeful
- Posts: 6
- Joined: Sat Jun 05, 2010 12:56 am
- Location: Russia
Re: [DIY] homemade NVIDIA 3dVision interface code
int a = (int)(0.1748910*(rate*rate*rate) - 54.5533*(rate*rate) + 6300.40*(rate) - 319395.0);
int b = (int)(0.0582808*(rate*rate*rate) - 18.1804*(rate*rate) + 2099.82*(rate) - 101257.0);
int c = (int)(0.3495840*(rate*rate*rate) - 109.060*(rate*rate) + 12597.3*(rate) - 638705.0);
int sequence[] = { 0x00031842,
0x00180001, a, b, 0xfffff830, 0x22302824, 0x040a0805, c,
0x00021c01, 0x00000002, //note only 6 bytes are actually sent here
0x00021e01, rate*2, //note only 6 bytes are actually sent here
0x00011b01, 0x00000007, //note only 5 bytes are actually sent here
0x00031840 };
Does anyone know the meaning of the fields in this sequence?
cms, please tell us how you got those values. Can you publish more info and your ideas about these commands? I would like to join the experiments, but I have never used any USB monitors. Please help me get started. My goal is to drive the emitter from my own engine (I can't use the native driver because it only provides a horizontal shift by a specified value, while I'm trying to create a renderer with head tracking).
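One way to start experimenting without any hardware is to evaluate cms's three cubic fits and compare them against the 23.592960 MHz clock theory from earlier in the thread (one frame at 120 Hz would be 23,592,960 / 120 = 0x30000 clock cycles exactly). The sketch below is only a sanity check: the names timer_a/b/c and the CLOCK_HZ constant are my guesses, not anything taken from the driver.

```python
# Sketch: evaluate cms's fitted polynomials for the three unknown
# timer fields. timer_a/b/c and CLOCK_HZ are hypothetical names.

CLOCK_HZ = 23_592_960  # suspected emitter clock (see earlier posts)

def timer_a(rate):
    # int() truncates toward zero, matching the C (int) cast
    return int(0.1748910 * rate**3 - 54.5533 * rate**2 + 6300.40 * rate - 319395.0)

def timer_b(rate):
    return int(0.0582808 * rate**3 - 18.1804 * rate**2 + 2099.82 * rate - 101257.0)

def timer_c(rate):
    return int(0.3495840 * rate**3 - 109.060 * rate**2 + 12597.3 * rate - 638705.0)

if __name__ == "__main__":
    rate = 120
    for name, fn in (("a", timer_a), ("b", timer_b), ("c", timer_c)):
        v = fn(rate)
        # negative values wrap to the 0xFFF...-prefixed words seen in the logs
        print(f"{name} = {v} (0x{v & 0xFFFFFFFF:08X})")
    print(f"one frame at {rate} Hz = 0x{CLOCK_HZ // rate:X} clock cycles")
```

At 120 Hz all three values come out negative, which would fit the observation that the captured timer words carry high 0xFFFF-style upper bits: they may simply be negative cycle counts of that clock.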
-
- One Eyed Hopeful
- Posts: 1
- Joined: Mon Jun 07, 2010 9:48 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Is there any way we could recreate the existing driver for the 3D Vision USB emitter, but with a control for a slight delay/offset in the sync of the glasses? This would help with ghosting issues on current 120 Hz 3D LCD monitors such as the Samsung 2233RZ and Acer GD245HQ.
-
- One Eyed Hopeful
- Posts: 2
- Joined: Thu Dec 31, 2009 2:20 am
Re: [DIY] homemade NVIDIA 3dVision interface code
This is great! Last time I checked this thread, it only had cms' original code. I took that, ran with it a bit, but only made a little progress on my own before I gave up. I'm glad to see people kept moving forward with this in the last year.
My own situation is that I'd like to use the glasses on my Mitsubishi 73" DLP, without the need for an nvidia card at all. So I can use the glasses for watching movies or playing games on the ps3. (and thus avoid shelling out another $200 for other shutter glasses when I already bought these) The DLP has a VESA port. And I always have an HTPC nearby, just not one that can run the nvidia 3d vision software normally.
Originally I snooped the usb codes the driver uses when enabling 3d using a vesa port to sync, and had some limited success tricking the emitter into vesa-based 3d mode. But never got it working perfectly. I hope to do that now with the new info you guys have.
- tritosine5G
- Terrif-eying the Ladies!
- Posts: 894
- Joined: Wed Mar 17, 2010 9:35 am
- Location: As far from Hold Display guys as possible!!! ^2
Re: [DIY] homemade NVIDIA 3dVision interface code
Yup, it would be nice to use the NVIDIA glasses (or possibly the $100 Sony ones!) instead of the Bit Cauldron glasses arriving later this year.
-Biased for 0 Gen HMD's to hell and back must be one hundred percent hell bent bias!
-
- One Eyed Hopeful
- Posts: 27
- Joined: Sat Dec 29, 2007 11:11 am
Re: [DIY] homemade NVIDIA 3dVision interface code
I am in the same situation as flyguy. I have an HL67A750 LED DLP and a Sony 3D Blu-ray player with the 3D firmware update, and 3D will not work with the Blu-ray player. I sent an email to Samsung and they say the Sammy will only work with a PC in 3D mode. I will try flyguy's backdoor fix, but Samsung needs to provide a firmware update for these "older" 3D-ready TVs. They should work with a PC and a 3D Blu-ray player.
I'll be ticked if there is no firmware update. Dang TV cost me $2,300.00.
-
- One Eyed Hopeful
- Posts: 7
- Joined: Wed Jun 23, 2010 11:39 pm
Re: [DIY] homemade NVIDIA 3dVision interface code
Will this work with ATI graphics cards? That would rock my world!
-
- One Eyed Hopeful
- Posts: 2
- Joined: Thu Dec 31, 2009 2:20 am
Re: [DIY] homemade NVIDIA 3dVision interface code
They rarely provide firmware updates for TVs, especially old ones.
For my Mitsubishi (73833) they put out a $100 converter that changes the modern 3D standard signal into the old non-standard one the Mits takes. That's supposed to enable it for use with a PS3 or other hardware.
I too would like to get the glasses working without the NVIDIA software, for use with other 3D output devices. There's another thread on this forum about some fellows working up hardware solutions to replace the proprietary emitter. But I always have an HTPC near the TV, so I don't mind a software solution myself. I'd just prefer one that didn't require a top-of-the-line NVIDIA graphics card and some tricky program running on the computer (in this case, the 3D video player).
I did a little USB snooping when using the VESA connector. This is what I found.
Bits 1 and 2 at 0x2021 apparently have to do with the VESA connector. Bit 1 is on when you have something plugged into it (not sure if a signal is necessary, though). Bit 2 is on when you are actually syncing with the VESA port. Bit 0 is still the front button, of course.
When watching 3D this means 0x2021 reads 0x06, and when the NVIDIA driver disables the 3D (because I hit the button) it drops to 0x02. I'm not sure whether bit 2 means we have a good signal from the TV, or that we're _using_ the VESA signal to sync to. The driver also continues to flip 1B (0x2022) between 0x07 and 0x03 when enabling/disabling the 3D feature.
It still sends timing details like cms's original code pasted here does, and they look identical, even though it's going to use the VESA (3D DLP Link) signal to do the real syncing.
It never sends any flip commands, though, like cms's original code does. Just the initial timing details and a constant 'status' request of location 0x18, which as I said usually reads 0x06 during normal operation. I don't have any screens that work at 100 Hz, so I've never seen the glasses operate in any normal fashion without a VESA connection.
The driver is smart enough to know if you didn't plug the VESA cable in, and displays a warning on screen; obviously it can see that 0x2021 is 0x00. If I plug the VESA in but don't turn 3D mode on the Mits, it probably figures that out through the HDMI negotiation and displays a different error message to that effect when I try to use 3D anyway; 0x2021 is then 0x02, implying the Mits always provides some sort of signal over the VESA port.
I'm running an x64 Windows 7 system.
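The bit layout described above can be captured in a tiny decoder. To be clear, the bit assignments are this post's inference from USB snooping, not a documented NVIDIA interface, so treat this as a sketch:

```python
# Sketch: decode the emitter status byte reported at address 0x2021,
# per the bit assignments inferred above (hypothetical, from snooping).

def decode_status(byte):
    return {
        "button_pressed": bool(byte & 0x01),  # bit 0: front button
        "vesa_plugged":   bool(byte & 0x02),  # bit 1: something on the VESA jack
        "vesa_syncing":   bool(byte & 0x04),  # bit 2: syncing to the VESA signal
    }

# 0x06 = plugged + syncing (watching 3D); 0x02 = plugged but 3D disabled
for status in (0x06, 0x02, 0x00):
    print(hex(status), decode_status(status))
```

The observed values then read naturally: 0x06 while watching 3D, 0x02 after hitting the button, 0x00 with nothing on the VESA jack.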
-
- One Eyed Hopeful
- Posts: 2
- Joined: Wed Jul 07, 2010 3:46 pm
Re: [DIY] homemade NVIDIA 3dVision interface code
I tried compiling the Activator code following the instructions from drowhunter, but I get the following link errors:
fbo.obj : error LNK2001: unresolved external symbol __imp____glewCheckFramebufferStatusEXT
fbo.obj : error LNK2001: unresolved external symbol __imp____glewFramebufferRenderbufferEXT
fbo.obj : error LNK2001: unresolved external symbol __imp____glewRenderbufferStorageEXT
...and 10 other similar ones. It seems it can't find the GLEW libraries even though I specify glew32.lib to be included. Any suggestions? Thanks in advance!
-
- One Eyed Hopeful
- Posts: 2
- Joined: Wed Jul 07, 2010 3:46 pm
Re: [DIY] homemade NVIDIA 3dVision interface code
Never mind, figured it out. I had to compile as release and add a couple of extra libraries. Thanks anyway.
-
- One Eyed Hopeful
- Posts: 2
- Joined: Wed Jul 21, 2010 7:48 pm
Re: [DIY] homemade NVIDIA 3dVision interface code
Thank you a lot!
It's a good program.
-
- Cross Eyed!
- Posts: 133
- Joined: Mon May 28, 2007 5:39 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Great tool! It's the program that made me buy an ATI 5870 2GB now.
I found that my Samsung DLP had ghosting when I used the tool, but I got rid of it by putting 110 Hz in the tool's config file.
It's perfectly synced and there is no ghosting. I don't know why, but it works.
That's with 3D Vision deactivated, the iZ3D checkerboard output, and this great tool on my 2x GTX 280.
-
- One Eyed Hopeful
- Posts: 10
- Joined: Fri Aug 06, 2010 11:29 pm
Re: [DIY] homemade NVIDIA 3dVision interface code
The demo executable almost works, but vsync is set to false so there is a lot of tearing.
I'm having some problems building it (VS2008, Win7 32- and 64-bit, SFML 1.6, GLEW 1.5.5):
1>app.obj : error LNK2001: unresolved external symbol @__security_check_cookie@4
1>nvidiaShutterGlasses.obj : error LNK2001: unresolved external symbol __imp__memmove_s
1>nvidiaShutterGlasses.obj : error LNK2001: unresolved external symbol __imp___invalid_parameter_noinfo
1>nvidiaShutterGlasses.obj = a whole bunch of linker errors about std::string ...
1>main.obj : error LNK2001: unresolved external symbol "public: virtual __thiscall sf::Thread::~Thread(void)" (??1Thread@sf@@UAE@XZ) ... lots more mutex and threading linker errors
Does the code need a particular SVN version of sfml?
-
- One Eyed Hopeful
- Posts: 3
- Joined: Mon Aug 23, 2010 11:09 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Just to clarify, is it possible that the 3D Vision glasses could be used with the iZ3D driver via this activator? I.e.:
-Use Iz3d shutter glasses mode
-fire up 3d vision activator
-launch game using iz3d driver
Anyone tried it?
-
- Two Eyed Hopeful
- Posts: 79
- Joined: Tue Dec 22, 2009 11:48 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Great work! There are still some problems, but it's really good work!
I managed to extract the firmware directly from Linux:
1) Download the NVIDIA 3D Vision v258.96 driver (or a newer version).
2) Extract the files from the .exe: "cabextract NVIDIA*" (cabextract is available in most Linux repositories).
3) Run extractfw with nvstusb.sys in the same directory (extractfw is compiled with a simple "make").
-
- One Eyed Hopeful
- Posts: 3
- Joined: Fri Aug 13, 2010 4:46 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Dear all,
I made a translation of the previous code from C to Python. Currently, the code is located at:
http://code.google.com/p/pyusbir/
It is already able to control the shutter glasses. I believe having such a module in Python will help bring the algorithm to different platforms such as Windows, Linux and Mac OS X (currently only tried on Linux).
Soon I will finalize it and come up with an installer and a demo script.
Best regards,
Kaan
-
- Two Eyed Hopeful
- Posts: 79
- Joined: Tue Dec 22, 2009 11:48 am
Re: [DIY] homemade NVIDIA 3dVision interface code
Me too... I've been working on this for a long time ^^
http://github.com/magestik/TuxStereoVie ... ters_nv.py
But I changed the code: I added some small fixes thanks to the USB log file!
Maybe we can merge our projects? My goal is to make a Linux daemon to control many kinds of glasses (NVIDIA, eDimensional, ATI, ...).