As promised, here are some pics, a video of the action and a more detailed description of the device!
at 2011-07-06: This picture shows what the device looks like (beautiful, isn't it?). On the left, the cable leading to the sensor is plugged in; on top, the shutterglasses (Asus VR100 wired shutterglasses, but any 3.5mm wired ones should work). From the right, a 12V power supply is connected.
at 2011-07-06: This image shows the box up close. Pardon the crappy styling ...
the left side has a 3.5mm jack for sensor input, the right side a DC power jack for 12V input, and at the top (though not visible here) there are 4 connectors for shutterglasses (all receiving the same signal). At the bottom of the image you can see a power switch, an eye switch, and a slider switch with 4 positions that selects the parameter to be adjusted (left eye delay, right eye delay, shutter opening time, and something for debugging that is not worth mentioning here). The two pushbuttons can then be used to increase/decrease that particular parameter.
So what is the job of this box? It receives the signal from the phototransistor and syncs only on the leading edge of this signal (meaning it syncs only once every two frames - so it syncs every time both eyes have received a frame). At first it synced on both edges, but since the duty cycle of the phototransistor signal varied all the time (see my posts in this thread), it was better to let it sync on one frame only and then use a fixed delay for the other eye. Relative to this sync, the delays for each eye and the shutter opening time can be adjusted. Pressing both buttons at the same time saves the current settings to EEPROM, so the device remembers your last settings on the next startup.
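The sync logic is simple enough to sketch in a few lines of C. This is only an illustration of the principle, not my actual firmware (names are made up): the box reacts exclusively to the rising ("leading") edge of the sensor's square wave, which arrives once per left/right frame pair, so variations in the duty cycle cannot disturb the sync.

```c
#include <assert.h>

/* Illustrative edge detector: returns 1 only on a low-to-high
   transition of the sampled sensor level. `prev` carries the
   previously sampled level between calls. */
int rising_edge(int *prev, int level)
{
    int edge = (level == 1 && *prev == 0);
    *prev = level;
    return edge;
}
```

A sampled sequence like 0, 1, 1, 0, 0, 1 triggers exactly once per frame pair, no matter how long the high phase lasts.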
at 2011-07-06: This shows how the sensor is mounted on the monitor. Initially it was just a phototransistor with a cable attached to it, leaving all electronics in the small black box. However, the signal turned out to be too weak to travel over the cable without interference. Although covered in tape, you can see it became a bit bulkier: it is now a phototransistor that sends current through a resistor (generating voltage signals whose shape can be seen in my posts in this thread), which is then fed directly into an operational amplifier (LM741) and compared to a reference voltage that is configurable by a potentiometer. Basically, the voltage signal is compared to this reference and thereby turned into either high or low, meaning the output becomes a square wave which goes to the little box. The duty cycle of this square wave varies all the time in the case of my 2233RZ (again, see my post in the aforementioned thread for an explanation why), but the sync with the monitor is stable.
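In C terms, the comparator stage boils down to a single threshold test per sample. This is just a model of the principle (the millivolt values and names are invented for illustration), not the analog circuit itself:

```c
#include <assert.h>

/* Model of the LM741 comparator stage: each sensor sample is
   compared against the reference voltage set by the potentiometer,
   producing a high (1) or low (0) output level. */
int comparator(int sample_mv, int reference_mv)
{
    return sample_mv > reference_mv ? 1 : 0;
}

/* Turn a buffer of analog samples into a square wave in place. */
void squarify(int *samples, int n, int reference_mv)
{
    for (int i = 0; i < n; i++)
        samples[i] = comparator(samples[i], reference_mv);
}
```

Note that the output only encodes where the signal crosses the reference, which is why a drifting duty cycle survives this stage - and why syncing on one edge only matters downstream.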
at 2011-07-06: These images show the inside of the box. If anyone is interested I can draw up full-blown schematics, but for now I will just give an overview of the circuitry. The long black chip in the lower left corner is an operational amplifier. Basically, it does the same thing as the circuitry embedded in the sensor: compare the input to a fixed reference determined by the potentiometer next to it, and generate a square wave as a result. It's just a precaution: if the signal from the sensor wasn't a square wave yet, it is now.
To the left there is a bunch of wires going to all the connectors, buttons and switches in the top half of the box. The "three-legged" chip in the middle is a 7805 voltage regulator that converts the 12V input into a 5V supply for the logic elements in the circuit. It has the required capacitors around it.
The longest chip is an ATtiny2313 microcontroller. This has my software on it, and controls every aspect of the shutter timing based on the signal it receives from the operational amplifier. It outputs the signals for each eye of the shutterglasses. Next to it is a connector for uploading code into the device.
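As a rough illustration of what the firmware computes (parameter names and numbers here are hypothetical, not my actual code): relative to each sync edge, each eye's shutter is open during a window defined by that eye's delay and the common opening time - exactly the three adjustable parameters mentioned earlier.

```c
#include <assert.h>

/* Hypothetical sketch of the per-eye timing windows.
   Times are in microseconds, measured from the last sync edge,
   which arrives once per left/right frame pair. */
typedef struct {
    long left_delay;   /* delay from sync to left shutter opening  */
    long right_delay;  /* delay from sync to right shutter opening */
    long open_time;    /* how long each shutter stays open         */
} timing_t;

/* Returns 1 if the left shutter should be open at time t. */
int left_open(const timing_t *cfg, long t)
{
    return t >= cfg->left_delay && t < cfg->left_delay + cfg->open_time;
}

/* Returns 1 if the right shutter should be open at time t. */
int right_open(const timing_t *cfg, long t)
{
    return t >= cfg->right_delay && t < cfg->right_delay + cfg->open_time;
}
```

The slider switch selects which of these three fields the pushbuttons adjust; saving to EEPROM just persists the struct.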
Finally, there is a pair of transistors (BC550) and a pair of resistors near the wire connector on the left. These two transistors shift the output signal of the microcontroller from the 5V logic level up to 12V. They also provide the power required to drive the shutterglasses (though I have not tested how much power is actually needed for this, they seem to do well).
...But of course, all you really want to see is the result!
Since taking crappy pictures with my phone only resulted in photos that were smeared all over the place, I chose to make a small video instead:
In the video you can see that while there is no noticeable ghosting in the middle of the screen, the bottom half of the car seems a tad ghosted. I found that no matter how short I make the shutter opening, or how I adjust the timings, there is never a combination without any ghosting anywhere. The best I can do is move the ghosted part around the screen (top or bottom). I am still hoping that activating the "hidden display mode" of the monitor (which, to be fair, I am not certain exists) will solve this issue. Anyway, I can be sure that the ghosting is not "my fault" but the monitor's: after all, I can tweak the glasses any way I want and the sync is stable as hell. This means that it shouldn't be a problem when using this system with a projector, for example.
Also, the image is considerably darker with the glasses on - this is of course due to the shuttering: each eye only sees the screen for part of the time. It can be adjusted to be brighter, but some brightness will always be lost, since the two eyes have to share the light coming from the monitor.
I wish I could compare my results so far with those of 3D Vision users! I have never had the chance to use such a system yet. I am pretty sure, though, that this system should provide equal or better 3D - at least, if the monitor behaves identically in both cases! I still haven't tried using 100Hz or 110Hz instead of 120Hz, which should also go a long way towards reducing ghosting.
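The reasoning behind trying lower refresh rates can be put in numbers: a lower rate means a longer frame period, so the panel's pixels get more time to settle before the shutter opens. A trivial illustration:

```c
#include <assert.h>

/* Frame period in microseconds for a given refresh rate.
   At 120Hz each frame lasts ~8.3ms; dropping to 100Hz buys
   an extra ~1.7ms of settling time per frame. */
int frame_period_us(int refresh_hz)
{
    return 1000000 / refresh_hz;
}
```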
As soon as I get bored playing around on my monitor, I will take the time to build a less sensitive sensor for projector use, and adapt the software to different frequencies. Then I can post some results with my 85Hz DLP projector!