Can the TX2 development kit be connected to a DVI display through an HDMI-to-DVI adapter?
No guarantee. It's better to use a native HDMI device.
I know, but my monitor only has VGA and DVI ports, and I’d like to know whether HDMI-to-VGA or HDMI-to-DVI is reliable.
When my answer is “no guarantee”, it means it is not reliable.
Ok, thank you
Some additional information…
VGA was not “plug-n-play”. One had to have a “driver disk”. This disk was not really a driver; it was instead a standard-format list of the monitor’s capabilities, which the actual driver would read. Once we got to plug-n-play (digital DVI, HDMI, DisplayPort), a wire known as the DDC wire was added. This carries an i2c serial protocol. The i2c circuit is in the monitor, and power for that i2c is provided by the GPU over the cable (which means the GPU can query a monitor even when it is turned off). The data on that i2c wire is the “EDID” data (nowadays it is the EDID2 protocol, but I don’t know of anyone who would normally specify the second revision).
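If you want to see that EDID data yourself: on a Linux system the DRM driver exposes whatever it read over the DDC wire as a binary sysfs file. Here is a rough sketch of decoding a few fields from it (my own illustration, not anything TX2-specific; the connector name `card0-HDMI-A-0` is just an example and varies by board and kernel):

```python
# Decode a few fields from the EDID blob the kernel read over DDC/i2c.
# The connector name below is an assumption; list /sys/class/drm to find yours.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-0/edid")  # assumed connector name

def decode_edid(blob: bytes) -> None:
    if len(blob) < 128:
        print("No EDID received (empty blob) -- the driver is on a default mode")
        return
    # Every EDID base block starts with this fixed 8-byte header.
    if blob[0:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        print("Blob does not look like EDID")
        return
    # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9.
    m = (blob[8] << 8) | blob[9]
    vendor = "".join(chr(((m >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Bytes 18 and 19 of the base block hold the EDID version and revision.
    version, revision = blob[18], blob[19]
    print(f"Vendor {vendor}, EDID version {version}.{revision}, {len(blob)} bytes")

if EDID_PATH.exists():
    decode_edid(EDID_PATH.read_bytes())
else:
    print(f"{EDID_PATH} not found -- adjust the connector name for your system")
```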
When you cut the DDC wire, you cut off all possibility of automatic configuration. What you have left is a default mode. Perhaps that mode works with the monitor, but perhaps not.
Some people will tell you that VGA added a DDC wire. What they usually don’t say is that the older VGA monitors which provided DDC usually also had a digital DVI input. Or else the EDID on that wire was the original EDID, not EDID2. In theory, EDID2 is backwards compatible with the original EDID. However, I don’t know of any modern drivers which actually work with EDID (rev. 1). So I’d say there is zero chance of a VGA adapter (even if it has the DDC pins) allowing automatic configuration. You’re back to whatever the default mode is.
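A quick way to see whether any EDID arrived at all (for example, behind a passive VGA or DVI adapter) is to check whether each connector’s EDID blob is empty. Another rough sketch, assuming a stock Linux DRM stack like the one on the Jetson images:

```python
# Walk every DRM connector and report whether the kernel received an EDID
# over DDC. An empty "edid" file means no DDC data came back, so the driver
# can only fall back to its built-in default mode.
from pathlib import Path

for edid in sorted(Path("/sys/class/drm").glob("card*-*/edid")):
    connector = edid.parent.name
    blob = edid.read_bytes()
    if blob:
        print(f"{connector}: EDID present ({len(blob)} bytes) -- auto configuration possible")
    else:
        print(f"{connector}: no EDID -- stuck with the default mode")
```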
I think some people actually edit their kernel to pick a default mode which is compatible with their monitor, but that’s a lot of work.