Is there any reason why I shouldn't use D-SUB for WUXGA?

Posted: 2007-12-15 07:23pm
by Xisiqomelir
Just bought a new el cheapo widescreen (nice deal @ the egg, btw), mainly because it has lots of connectors and I want to play my consoles on it as a stopgap while I wait for CES and next year's good TVs.

Although I'm quite fond of it so far, there's no DVI connector. On various random hardware forums, people act as though this is a horrible thing, but so far I'm doing all right with the computer over D-SUB, PS3 over HDMI and Wii over composite while I wait for new component cables.

Is there really some amazing advantage to a digital-to-digital connection, so great that I should pick up a male-to-male DVI-to-HDMI cable and use an HDMI switch, or is it really no big deal?

Posted: 2007-12-15 07:38pm
by phongn
If you don't have a dual-link DVI port on your computer, you might not be able to properly drive the monitor at full resolution; otherwise, the HDMI-DVI route is the best bet.

Posted: 2007-12-15 08:17pm
by Xisiqomelir
phongn wrote:If you don't have a dual-link DVI port on your computer, you might not be able to properly drive the monitor at full resolution; otherwise, the HDMI-DVI route is the best bet.
1920x1200 (monitor native) is working fine over D-SUB.

Posted: 2007-12-15 08:20pm
by phongn
Gah, I totally messed that one up. I meant that if you didn't have dual-link DVI, you wouldn't be able to properly drive it over DVI/HDMI. D-SUB is a bit less than ideal, but if it works, it works.
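
For the curious, the back-of-the-envelope numbers (the mode totals below are CVT-derived assumptions; your panel's actual EDID timings may differ):

    # Pixel-clock check for 1920x1200 @ 60 Hz against the single-link
    # DVI TMDS ceiling of 165 MHz. Totals assume CVT timing formulas.
    SINGLE_LINK_LIMIT_MHZ = 165.0

    cvt    = 2592 * 1245 * 60 / 1e6   # standard CVT blanking -> ~193.6 MHz: needs dual link
    cvt_rb = 2080 * 1235 * 60 / 1e6   # CVT reduced blanking  -> ~154.1 MHz: fits single link

    print(f"CVT: {cvt:.1f} MHz, CVT-RB: {cvt_rb:.1f} MHz, limit: {SINGLE_LINK_LIMIT_MHZ} MHz")

So a monitor that accepts reduced-blanking timings can in principle squeak WUXGA through single-link; with standard blanking you need dual-link.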

Posted: 2007-12-16 12:31am
by The Grim Squeaker
Sorry for the thread hijack, but say I also have a nice large widescreen with VGA only, at 1440x1050: is that worth trying to swap out or upgrade? (No, I don't plan to watch HD DVDs on my screen.)

Re: Is there any reason why I shouldn't use D-SUB for WUXGA?

Posted: 2015-01-22 05:17am
by Xisiqomelir
Mmm, dat 7-year bump.

I still have the same Westinghouse monitor, really one of my best tech purchases ever for its reliability and features (16:10, IPS, and even S-Video/component inputs!), but now I have to use HDMI, because my GTX 970 only has HDMI, mini-DP and DVI ports, and of those the L2410NM only supports HDMI.

When I first put the new card in, everything looked like totally smeared, blurry shit, which was strange because my 4670K's on-board graphics had been giving sterling service with no problems. After an entire day of frustration, and more than I ever wanted to know about EDID (Extended Display Identification Data), I learned that the monitor was identifying itself as a television over HDMI, so the NVIDIA driver wasn't outputting actual 1920x1200.

To make a horrid long story short, if you're having trouble running your computer monitor over HDMI, you can force an appropriate EDID.

This is the solution I used.
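
If you want to see what your monitor's EDID actually claims before forcing anything, here's a rough Python sketch (the edid.bin path is just an example; on Linux the block can be read from /sys/class/drm/<connector>/edid, on Windows you'd dump it with a tool):

    def preferred_mode(edid: bytes):
        # Every EDID block starts with a fixed 8-byte header.
        assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "not an EDID block"
        dtd = edid[54:72]  # first detailed timing descriptor = preferred mode
        clock_mhz = int.from_bytes(dtd[0:2], "little") / 100  # stored in 10 kHz units
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        return h_active, v_active, clock_mhz

    with open("edid.bin", "rb") as f:  # example path, not gospel
        w, h, clk = preferred_mode(f.read(128))
    print(f"preferred mode: {w}x{h} @ {clk:.2f} MHz pixel clock")

If that doesn't print 1920x1200, the card is just doing what the monitor told it to.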

Bonus for kinder, gentler, 2015 Xisiqomelir: Windows fix
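
(For reference, the Windows route boils down to the documented EDID_OVERRIDE registry mechanism that the monitor driver reads, which I assume is what the usual tools automate. A rough winreg sketch; the instance path is hypothetical and will differ on your machine. Run elevated, then reinstall the monitor device or reboot.)

    import winreg

    def write_edid_override(device_params_path: str, edid: bytes) -> None:
        # Override blocks live under the monitor's Device Parameters key,
        # in 128-byte REG_BINARY values named "0", "1", ...
        key_path = device_params_path + r"\EDID_OVERRIDE"
        with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            for i in range(0, len(edid), 128):
                winreg.SetValueEx(key, str(i // 128), 0,
                                  winreg.REG_BINARY, edid[i:i + 128])

    # Hypothetical usage; find your monitor's real key under
    # HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY:
    # write_edid_override(
    #     r"SYSTEM\CurrentControlSet\Enum\DISPLAY\<hardware-id>\<instance>\Device Parameters",
    #     open("edid.bin", "rb").read())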