
How will sync for shutter-glasses be done?

Posted: Sun Aug 10, 2008 8:54 am
by Odysee
Hello!

I'm wondering how the sync for the shutter-glasses will be done in the future. I think there are different ways to get the sync to the glasses:


1) stay with the DDC data pin

PRO: The DDC data pin can be used with a VGA signal AND a DVI signal, so you don't have to feed your digital display (DLP) with VGA.
CON: DDC data is used for communication between the graphics card and the display. IMHO it's not stable enough and can cause "flashes" in the picture if some device is sending data.
If you want to connect your display via DVI/HDMI, you have to make your own break-out cable to get the sync from the DDC data pin.

2) use another port to output the sync

PRO: Independent of the graphics card and its display ports (you can use VGA, DVI, HDMI and future ports without having to change a special break-out cable). Stable solution because it's ONLY used for the sync signal.
CON: external hardware must be developed (COM device or USB device)

3) use vsync from the graphics card to generate the stereo sync

PRO: Stable solution
CON: Is vsync active while sending a digital video signal? A special adapter/break-out cable is needed for every connection type (VGA, DVI, HDMI)


If I could choose one solution, I'd take No. 2. iZ3D could develop some kind of USB device which also serves as a copy-protection dongle for the driver/software and outputs the stereo sync, and sell it as a bundle (driver/software, USB sync device, shutter glasses). It would be nice if the driver included an option to delay the sync signal to compensate for the display's lag, so no extra hardware is needed for that. The hardware cost for this solution would IMHO not be that big (a single microprocessor could be used just for outputting the signal). But reading the sync from the graphics card could be a problem.
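
To make that a bit more concrete, here is a rough host-side sketch of what I mean. All the names (waitForVBlank, sendToSyncBox, the 2 ms lag value) are placeholders I made up, not any existing iZ3D API; it just shows the loop that would forward the eye toggle to such a USB sync box with a configurable delay for the display lag.

Code:

#include <chrono>
#include <cstdint>
#include <iostream>
#include <thread>

// Placeholder: a real driver would block here until the next vertical blank.
void waitForVBlank() { std::this_thread::sleep_for(std::chrono::milliseconds(8)); }

// Placeholder: a real dongle would receive this byte over USB and drive the emitter.
void sendToSyncBox(uint8_t eye) { std::cout << "eye " << int(eye) << "\n"; }

int main() {
    // Configurable delay to compensate the display's processing lag (made-up value).
    const auto displayLag = std::chrono::milliseconds(2);
    uint8_t eye = 0;  // 0 = left shutter open, 1 = right shutter open

    for (int frame = 0; frame < 10; ++frame) {
        waitForVBlank();                          // sync source: the GPU's vblank
        std::this_thread::sleep_for(displayLag);  // shift the toggle by the lag
        sendToSyncBox(eye);                       // dongle toggles the glasses
        eye ^= 1;                                 // alternate eyes every frame
    }
}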

Shutter glasses syncing in the future

Posted: Sun Aug 10, 2008 9:13 am
by budda
There is a simple way to synchronise shutterglasses.

Use Bluetooth radio technology to receive the synchronising signal wirelessly.

Not only that, but microphone, stereo headphone or other sensor support can be included.

Look at the versatility of the Nintendo Wii remote as an example of what this technology is capable of.

:)

Posted: Sun Aug 10, 2008 2:19 pm
by Xerion
The question is not how to transmit the sync signal, but rather where to get it from in the first place.

Posted: Sun Aug 10, 2008 3:08 pm
by android78
I believe the only viable 'future' solution is to have it included in the DVI spec's frame data. If the flag is included in the data for the frame, there should be no way for the frames to get out of sync. I do doubt that this will ever actually be implemented; I just dream of it. It will probably end up being the DDC pin, the same as before, which is really no good for modern digital TVs, which all seem to have some processing delay.
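
Just to illustrate the dream (nothing like this exists in the real DVI/HDMI spec, the structure below is completely made up): if the eye flag travelled with the frame data itself, the glasses would simply obey whatever frame is being shown, so there would be nothing that could drift out of sync.

Code:

#include <cstdint>
#include <vector>

// Purely hypothetical: no such field exists in any real video spec.
enum class Eye : uint8_t { Left, Right };

struct StereoFrame {
    Eye eye;                      // which shutter should be open for this frame
    std::vector<uint8_t> pixels;  // the frame's image data
};

int main() {
    StereoFrame frame{Eye::Left, std::vector<uint8_t>(4)};
    // The glasses emitter would just follow the flag of the frame on screen.
    bool openLeftShutter = (frame.eye == Eye::Left);
    return openLeftShutter ? 0 : 1;
}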

Posted: Mon Aug 11, 2008 3:41 am
by LukePC1
Xerion wrote: The question is not how to transmit the sync signal, but rather where to get it from in the first place.
I think so, too. NV included the sync with the GPU + driver. How can a 3rd-party driver use the DVI or VGA port then?

Just look at it:
ED had to use interlaced mode and put the sync in the dongle.
Vizuix uses the USB port for sync and head tracking.

So the only one able to use the graphics card's ports is the manufacturer itself. And NV just stopped most support for S-3D :(

multithreading

Posted: Wed May 13, 2009 12:37 pm
by 1140
I once used this library in my application to drive the LPT port as a digital output device.
You have 8 bits to set, while you only need one :-)
Running at 60 Hz should not be a problem; latency possibly is.
And then there is the question of how to combine this with your application.
You could wait on vsync and then swap the LPT port bit for the other eye.
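
Roughly like this little sketch; writeLptData and waitForVSync are just placeholders I made up for the LPT library's output call and the vsync wait, not real function names:

Code:

#include <cstdint>
#include <iostream>

// Placeholder for the LPT library's "write data byte" call (e.g. to port 0x378).
void writeLptData(uint8_t value) { std::cout << "LPT data = " << int(value) << "\n"; }

// Placeholder: block until the next vertical sync.
void waitForVSync() {}

int main() {
    uint8_t eye = 0;                 // bit 0: 0 = left eye, 1 = right eye
    for (int frame = 0; frame < 10; ++frame) {
        waitForVSync();              // swap exactly once per displayed frame
        writeLptData(eye);           // the glasses controller only reads this one bit
        eye ^= 0x01;                 // flip the single bit we actually need
    }
}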

A nicer way would be if the iZ3D driver could render the frames in the application thread just to the backbuffer, take a screenshot, and add it to a queue.
A separate thread could then check whether both new eye screenshots are available and take them from the queue.
These two new images would then be rendered continuously to the frontbuffer: wait on the sync, blit one eye, sleep for 1/5 of the refresh period, blit the other eye, and wait on vertical sync again.

This way you would get high refresh rates, and so no ghosting, even if your application only rendered at 5 fps :-)
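
As a very rough sketch of what I mean (simplified to one eye per vsync instead of the 1/5-refresh sleep, and with blitToFront and waitForVSync as made-up placeholders, not iZ3D functions), the presentation thread could look like this:

Code:

#include <atomic>
#include <chrono>
#include <cstdint>
#include <mutex>
#include <optional>
#include <thread>
#include <utility>
#include <vector>

using Image = std::vector<uint8_t>;           // stand-in for a captured screenshot
using StereoPair = std::pair<Image, Image>;   // left eye, right eye

std::mutex pairMutex;
std::optional<StereoPair> latestPair;         // newest complete pair, if any
std::atomic<bool> running{true};

// Placeholders: pretend vsync arrives every 8 ms; blitting does nothing here.
void waitForVSync() { std::this_thread::sleep_for(std::chrono::milliseconds(8)); }
void blitToFront(const Image&) {}

void presentationThread() {
    StereoPair current;                       // pair currently being shown
    while (running) {
        {   // pick up a newer pair from the application thread, if one arrived
            std::lock_guard<std::mutex> lock(pairMutex);
            if (latestPair) { current = std::move(*latestPair); latestPair.reset(); }
        }
        waitForVSync();
        blitToFront(current.first);           // left eye on this refresh
        waitForVSync();
        blitToFront(current.second);          // right eye on the next refresh
    }
}

int main() {
    std::thread presenter(presentationThread);
    for (int frame = 0; frame < 5; ++frame) { // slow application thread (~5 fps)
        std::this_thread::sleep_for(std::chrono::milliseconds(200));
        std::lock_guard<std::mutex> lock(pairMutex);
        latestPair = StereoPair(Image(4), Image(4));  // "screenshots" of both eyes
    }
    running = false;
    presenter.join();
}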

But if it were this simple, maybe someone would have made it already?
I do think nVidia is doing something similar in their driver, since there is still a stereo image even when my application crashes :-)

Maybe this is a nice feature request?