Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 03:59pm
by Davey
I got another monitor, a Samsung SyncMaster, to go with my Viewsonic VA2026, and so far I'm having a bit of a problem with it: Windows 7 doesn't seem to want to realize that my 9800GTX+ has two DVI outputs; it keeps insisting that the second output is VGA. Either monitor works just fine as a primary with a DVI-D cable, but either one will only work as a secondary if I use a DVI-to-VGA adapter and a VGA cable on that second output. I checked in the NVIDIA control panel and it's detected as VGA there too. There's no option to specify the output type manually, at least not through the GUI...
Does anyone know if I can get the computer to realize that the second output on the 9800GTX+ is DVI, not VGA? I've already updated to the latest drivers from NVIDIous, and I don't see a BIOS option to change the output type.
Any help, or am I stuck using VGA, and with a wasted DVI-D cable?
Thanks.
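In case it helps with diagnosing, here's a quick way to dump the connector type Windows has settled on for each active monitor, by querying the standard WmiMonitorConnectionParams WMI class through the stock wmic tool. This is only a rough sketch, but the VideoOutputTechnology codes are documented: 0 means HD15 (analog VGA) and 4 means DVI, so a DVI port being mistaken for VGA should show up as a 0 here.

Code:
import subprocess

# WmiMonitorConnectionParams (root\wmi namespace) reports the connector type
# Windows has decided on for each active monitor.
# VideoOutputTechnology: 0 = HD15 (analog VGA), 4 = DVI, 5 = HDMI.
out = subprocess.check_output([
    "wmic", "/namespace:\\\\root\\wmi", "path", "WmiMonitorConnectionParams",
    "get", "Active,InstanceName,VideoOutputTechnology",
])
# wmic emits UTF-16 text when its output is piped instead of shown in a console.
print(out.decode("utf-16", errors="replace"))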
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:08pm
by Dominus Atheos
You're quite certain it refuses to output as a DVI? You've tried hooking up two digital monitors with two DVI cables, restarted the computer, opened up "screen resolution" and it definitely only had one monitor listed there?
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:10pm
by Davey
Dominus Atheos wrote:You're quite certain it refuses to output as a DVI? You've tried hooking up two digital monitors with two DVI cables, restarted the computer, opened up "screen resolution" and it definitely only had one monitor listed there?
Yup, I rebooted a few times, and it's still detected as VGA in both the NVIDIA control panel and in Screen Resolution.
Seems to keep getting wackier by the second. Either monitor works as the primary over DVI-D, but only VGA works for the second one. The computer isn't picky about whether the Samsung or the Viewsonic is the first or second monitor, only about how they're connected.
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:16pm
by Dominus Atheos
Davey wrote:Dominus Atheos wrote:You're quite certain it refuses to output as a DVI? You've tried hooking up two digital monitors with two DVI cables, restarted the computer, opened up "screen resolution" and it definitely only had one monitor listed there?
Yup, I rebooted a few times, and it's still detected as VGA in both the NVIDIA control panel and in Screen Resolution.
Seems to keep getting wackier by the second. Either monitor works as the primary over DVI-D, but only VGA works for the second one. The computer isn't picky about whether the Samsung or the Viewsonic is the first or second monitor, only about how they're connected.
So that seems completely different from what you were saying before. Before, you said the only way you could get it to work was with a VGA cable, but now you're saying that if you hook it up with a DVI cable, the computer recognizes the DVI connection as a VGA connection. So which is it?
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:19pm
by Stark
The cunt levels in this thread are high enough already, but I had this problem; Nvidia Control Centre insisted my monitor was a VGA CRT and locked me out of heaps of functionality. It went away when I updated a few months ago.
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:24pm
by Davey
Dominus Atheos wrote:So that seems completely different from what you were saying before. Before, you said the only way you could get it to work was with a VGA cable, but now you're saying that if you hook it up with a DVI cable, the computer recognizes the DVI connection as a VGA connection. So which is it?
Weird, but it's a bit of both... I'll restate it in point form so it isn't so ambiguous.
* The Primary output:
- Works with the Samsung in DVI-D.
- Works with the Viewsonic in DVI-D.
- Is detected correctly by Windows as a DVI output.
- Is detected correctly by the NVIDIA control panel as a DVI output.
* The secondary output:
- Doesn't work with the Samsung in DVI-D.
- Doesn't work with the Viewsonic in DVI-D.
- Is detected incorrectly by Windows as a VGA output.
- Is detected incorrectly by the NVIDIA control panel as a VGA output.
- Works with the Samsung if a DVI-to-VGA adapter and a VGA cable are used and the resolution is set in the NVIDIA control panel to 1920x1080.
- Works with the Viewsonic if a DVI-to-VGA adapter and a VGA cable are used and the resolution is set in the NVIDIA control panel to 1680x1050. (A way to script that resolution change is sketched just after this list.)
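For the record, that manual resolution change can also be scripted instead of clicked through, via the Win32 EnumDisplaySettingsW / ChangeDisplaySettingsExW calls. A minimal Python ctypes sketch follows; the \\.\DISPLAY2 device name for the secondary output is a guess, so check what your machine actually calls its outputs first.

Code:
import ctypes
from ctypes import wintypes

# Partial DEVMODEW layout: just the display fields, no printer/ICM tail.
# dmSize tells the API how large a structure we are passing in.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

ENUM_CURRENT_SETTINGS = -1
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
CDS_UPDATEREGISTRY = 0x00000001  # persist the new mode across reboots
user32 = ctypes.windll.user32

def set_mode(device, width, height):
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    # Start from the output's current mode, then override only width/height.
    if not user32.EnumDisplaySettingsW(device, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(dm)):
        raise RuntimeError("EnumDisplaySettingsW failed for " + device)
    dm.dmPelsWidth = width
    dm.dmPelsHeight = height
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT
    ret = user32.ChangeDisplaySettingsExW(device, ctypes.byref(dm), None,
                                          CDS_UPDATEREGISTRY, None)
    if ret != 0:  # DISP_CHANGE_SUCCESSFUL == 0
        raise RuntimeError("ChangeDisplaySettingsExW returned %d" % ret)

# "\\.\DISPLAY2" for the secondary output is a guess; EnumDisplayDevices
# will tell you the real names on your machine.
set_mode("\\\\.\\DISPLAY2", 1680, 1050)

Passing 0 instead of CDS_UPDATEREGISTRY tries the mode without saving it, which is handy when you're not sure the monitor will sync.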
Stark wrote:The cunt levels in this thread are high enough already, but I had this problem; Nvidia Control Centre insisted my monitor was a VGA CRT and locked me out of heaps of functionality. It went away when I updated a few months ago.
Crud. Guess I'll just have to wait.
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:29pm
by Dominus Atheos
Davey wrote:Dominus Atheos wrote:So that seems completely different from what you were saying before. Before, you said the only way you could get it to work was with a VGA cable, but now you're saying that if you hook it up with a DVI cable, the computer recognizes the DVI connection as a VGA connection. So which is it?
I'm sorry to confuse you. It's actually both... I'll restate it in point form so it isn't so ambiguous.
* The Primary output:
- Works with the Samsung in DVI-D.
- Works with the Viewsonic in DVI-D.
- Is detected correctly by Windows as a DVI output.
- Is detected correctly by the NVIDIA control panel as a DVI output.
* The secondary output:
- Doesn't work with the Samsung in DVI-D.
- Doesn't work with the Viewsonic in DVI-D.
- Is detected incorrectly by Windows as a VGA output.
- Is detected incorrectly by the NVIDIA control panel as a VGA output.
- Works with the Samsung if a DVI-to-VGA adapter is used and the resolution is set in the NVIDIA control panel to 1920x1080.
- Works with the Viewsonic if a DVI-to-VGA adapter is used and the resolution is set in the NVIDIA control panel to 1680x1050.
Did the issue just start, or have you always had it? How old is your video card?
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-03 04:42pm
by Davey
I don't know if I've always had this particular issue; the last monitor I used as a secondary was a much older Viewsonic 1600x1200 CRT that only had a VGA input, so I didn't notice any problems using a DVI-to-VGA adapter and a VGA cable with it. It's only now that I've tried switching to DVI-D cables that I've noticed the secondary output acting up and reporting itself as VGA. I tested both sets of cables just to be sure; the cables themselves are fine. I tried a bit of google-fu but didn't see anything besides 'update the device drivers,' which I'd already done.
The computer's still fairly new; the graphics card is an XFX NVIDIA 9800GTX+ OC Edition, bought when I put the PC together, so I'd say it's seen roughly a year and a couple of months of use.
Re: Help! My computer thinks a DVI out is VGA.
Posted: 2010-05-07 03:02pm
by Davey
Problem solved. I reinstalled the driver with the DVI-D cable plugged in, instead of with no cable plugged into the computer at all, like when I installed it before. The second output is now detected as DVI.
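Makes sense in hindsight: the installer apparently only set up the outputs for whatever was physically attached while it ran. For anyone curious later, the EDID block Windows caches for each monitor sits in the registry, and bit 7 of byte 20 of an EDID says whether the monitor declared a digital (DVI/HDMI) or analog (VGA) input. A rough Python sketch, with the caveat that it also lists monitors that were only plugged in at some point in the past:

Code:
import winreg

# Cached monitor EDIDs live under the PnP DISPLAY enumeration tree.
ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    # Yield the names of a registry key's subkeys.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    params = winreg.OpenKey(
                        model_key, instance + r"\Device Parameters")
                except OSError:
                    continue
                with params:
                    try:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue
                # Byte 20 is the Video Input Definition; bit 7 set = digital.
                if len(edid) > 20:
                    kind = "digital" if edid[20] & 0x80 else "analog"
                    print(model, instance, kind)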
Could somebody close this thread, please?