
Posted (edited)

Hi all,

I'm running an eVGA GeForce 8800GTS with two monitors on Windows Vista (Ultimate, if that matters). Up until about two months ago both monitors ran fine; then, for reasons that boil down to personal stupidity, I allowed Windows to install a graphics driver update. Ever since, only one of my monitors will work. If I try to enable the second monitor (either through the nVidia Control Panel or through Display Properties), the one working monitor goes black and then comes back exactly as before, with the standard pop-up asking if I want to keep the new settings.

I've tried just about every permutation of which monitor is in which port, which one is set as primary, cloning instead of extending, etc., and nothing will get the second monitor working as an active display. I've also tried completely removing the nVidia drivers and reinstalling them, which frustratingly causes the problematic monitor to become primary while they're removed, and it promptly dies again as soon as they're reinstalled. To me this seems to imply a driver/Vista interaction problem, but I was hoping for some outside opinions.

Here's the hardware involved:

eVGA GeForce 8800GTS

Samsung SyncMaster 920N (The one that works)

Acer A2216W (The one that doesn't)

and I'm using ForceWare version 169.25

Thanks for any thoughts,

Apollo324

Edited by Apollo324

Posted

Go into Device Manager, uninstall your graphics card, and shut down the machine. Power back up, and when Windows asks to install drivers, don't let it; install your drivers from the latest nVidia driver pack. Same card (kicks a** btw), same issue, and this fixed it for me.
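If Windows keeps silently reinstalling the old driver on its own, the copy it pulls from is the driver store. Here's a rough, untested Python sketch of how you might find and clear the staged NVIDIA packages; it just wraps pnputil.exe (which ships with Vista), and the oem*.inf package names will differ on your machine. Treat it as a starting point, not gospel:

```python
# Rough sketch: list (and optionally delete) staged NVIDIA driver packages
# so Windows can't silently reinstall them. Wraps pnputil.exe, which ships
# with Windows Vista and later. Run from an elevated prompt.
import subprocess

def staged_packages():
    """Parse `pnputil -e` output into a list of {field: value} records."""
    out = subprocess.run(["pnputil", "-e"],
                         capture_output=True, text=True, check=True).stdout
    records, current = [], {}
    for line in out.splitlines() + [""]:   # trailing "" flushes the last record
        if ":" in line:
            key, _, val = line.partition(":")
            current[key.strip().lower()] = val.strip()
        elif current:
            records.append(current)
            current = {}
    return records

if __name__ == "__main__":
    for pkg in staged_packages():
        if "nvidia" in pkg.get("driver package provider", "").lower():
            name = pkg.get("published name", "?")   # e.g. "oem12.inf", varies per machine
            print("staged NVIDIA package:", name)
            # To delete a staged package so Windows can't auto-install it, uncomment:
            # subprocess.run(["pnputil", "-f", "-d", name], check=True)
```

With the staged package gone, Windows should fall back to the generic VGA driver at boot instead of the broken one, and you can then install the fresh pack from nvidia.com.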

Posted

Thanks for the tip! I'm not sure how to stop Vista from installing drivers as soon as it boots, though. All it installs is a generic VGA driver set, so I don't know if that's what you mean for me to stop, but if it is, I don't know how to prevent it.

Also, I realized there are a couple of details missing from my original post. One is that I just formatted the computer today, so the problem occurs from a clean install as well as after a random update. Also, and more strangely IMO, my Acer monitor is in the primary port of the card, so everything during boot shows on that monitor, even the little scrolling Windows load bar, but once that's done everything gets kicked over to the Samsung.

So yeah, if you could tell me how to force Windows not to install the basic drivers, I'll give that a shot, and if the new info changes your recommendation, let me know and hopefully I can try something else.

Thanks again,

Apollo324

Posted

Any more word on this?

Another couple of things to try, just so we get some more data...

- Have you tried starting the system in VGA mode, so that Windows loads the default VGA drivers?

- Have you tried running your monitor with a VGA cable instead of a DVI cable? Perhaps the DVI connection on the Acer monitor isn't properly providing EDID to your graphics card (maybe something broke)? There's a quick way to check this from the registry; see the sketch below.
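If you want to test the EDID theory before swapping cables, Windows keeps a copy of every EDID it has received under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. A rough Python sketch (the script and its output format are my own, nothing here is monitor-specific):

```python
# Rough sketch: walk the registry keys where Windows records attached
# monitors and report whether an EDID block was ever captured for each.
# Windows-only; uses the standard-library winreg module.
import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(path):
    """Yield the names of all subkeys of HKLM\\<path>."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        for i in range(winreg.QueryInfoKey(key)[0]):
            yield winreg.EnumKey(key, i)

for model in subkeys(BASE):
    for instance in subkeys(BASE + "\\" + model):
        params = BASE + "\\" + model + "\\" + instance + "\\Device Parameters"
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, params) as key:
                edid, _ = winreg.QueryValueEx(key, "EDID")
            print(model, instance, "- EDID present,", len(edid), "bytes")
        except OSError:
            print(model, instance, "- no EDID recorded")
```

Note that entries linger for monitors that are no longer attached, so look specifically for the Acer's entries (its IDs usually start with ACR). If it never shows an EDID over DVI but does over VGA, that points at the DVI link.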
