
Cannot get 32-bit color at 1920x1200 from IGP



I am using the HDMI output on a Gigabyte TForce 8200 and I cannot get 32-bit color to work at 1920x1200 - only 16-bit works at that resolution. This is true whether I use the default BIOS setting of Auto (256MB) or force it to 512MB.

I can get 32-bit color at 1920x1080, though, so I am trying to figure out whether this is a limitation of the chipset or of buggy nvidia drivers.

Gigabyte TForce 8200

Windows 7 Server RC

nvidia 185.85_desktop_win7_64bit_english_whql

If this is a limitation of the 8200, does anyone know of an AM2+ compatible motherboard with a chipset that can pull this off?



The AMD 780G chipset would do the trick.

If you can return the board, I would exchange it for something better; if not, then I would say it's a memory problem - it could be the BIOS (any updates?), or indeed the BIOS settings.

You can always throw in an ATI HD4350 or so - cheap and works well ;).

Link to comment
Share on other sites

I'm not sure - nvidia makes it very hard to determine the specs of their chipsets, and I don't have one to test. I do know that the driver determines which resolution and color depth settings you can pick (sometimes the limit comes from the monitor instead, but here it has to be a driver restriction, since it's the color depth that's capped rather than the maximum resolution or refresh rate). My guess is the driver doesn't think there's enough memory bandwidth to the video chipset to handle 1920x1200 at 32-bit color. That could be a driver bug (if the hardware actually can support that depth at that resolution) or a real chipset limitation - it takes quite a bit to push that many colors on that many pixels, and the 8200 isn't exactly a great video chipset.

Nvidia does make a point of saying the chipset supports "HD resolutions" over HDMI, but says nothing about the color depth. That may be on purpose (1920x1080 is still HD, even if it's only 1080i and not 1080p).
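
For what it's worth, here's a rough back-of-the-envelope sketch of the scan-out traffic involved, assuming a 60 Hz refresh and 4 bytes per pixel for 32-bit color (the refresh rate is my assumption, it isn't given in the thread):

# Scan-out bandwidth only: the bytes the display engine has to read out of
# shared system memory once per refresh. Ignores desktop composition, video
# decode, 3D, etc.

def scanout_bandwidth(width, height, bytes_per_pixel=4, refresh_hz=60):
    return width * height * bytes_per_pixel * refresh_hz  # bytes per second

for w, h in [(1920, 1080), (1920, 1200)]:
    bw = scanout_bandwidth(w, h)
    print(f"{w}x{h} @ 32bpp, 60 Hz: {bw / 1e6:.0f} MB/s")

# Prints roughly 498 MB/s for 1920x1080 and 553 MB/s for 1920x1200 - only
# about an 11% difference, which is why a hard cutoff between the two modes
# looks more like a driver or BIOS limit than raw bandwidth running out.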


My guess is the driver doesn't think there's enough memory bandwidth to the video chipset to handle 1920x1200 at 32-bit color. That could be a driver bug (if the hardware actually can support that depth at that resolution) or a real chipset limitation - it takes quite a bit to push that many colors on that many pixels, and the 8200 isn't exactly a great video chipset.
It's a memory problem indeed - the RAMDAC is fast enough - but it could be an early BIOS problem too. The 8200 isn't limited on HDMI, since its HDMI is just linked to DVI without any sound at all (not 100% sure whether the 8200 supports sound over HDMI)...

EDIT: On second thought, nVidia doesn't provide sound from the GPU, at least not on the mGPU/IGP - what was I thinking :P.


Thanks for the thoughts guys. I think this is likely a bug, because they advertise that HD (1080?) works, including HDCP, PureVideo HD, and audio over HDMI, so the chipset should be powerful enough to handle those extra 120 lines, considering I am not even using those features. Here is how the numbers work out (there's a small script after the list that reproduces them):

Working

1920x1080 = 2,073,600 pixel elements

32 bits per pixel

total bits = 66,355,200

Not working

1920x1200 = 2,304,000 pixel elements

32 bits per pixel

total bits = 73,728,000

I found a person with about the same issue

http://forums.nvidia.com/index.php?showtop...286&hl=8200

Not working

1280x1024 = 1,310,720 pixel elements

32 bits per pixel

total bits = 41,943,040 x2 = 83,886,080

Working

1280x1024 = 1,310,720 pixel elements

24 bits per pixel

total bits = 31,457,280 x2 = 62,914,560
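
The same arithmetic as a small script, in case anyone wants to plug in other modes - I'm assuming the x2 in the linked case means two displays:

# Reproduces the frame buffer totals above:
# bits = width * height * bits_per_pixel * number_of_displays

def framebuffer_bits(width, height, bpp, displays=1):
    return width * height * bpp * displays

cases = [
    ("1920x1080 @ 32bpp (working)",        framebuffer_bits(1920, 1080, 32)),
    ("1920x1200 @ 32bpp (not working)",    framebuffer_bits(1920, 1200, 32)),
    ("2x 1280x1024 @ 32bpp (not working)", framebuffer_bits(1280, 1024, 32, 2)),
    ("2x 1280x1024 @ 24bpp (working)",     framebuffer_bits(1280, 1024, 24, 2)),
]

for label, bits in cases:
    print(f"{label}: {bits:,} bits = {bits / 8 / 2**20:.1f} MiB")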


Aw come on - they can't address more than 64Mbit (8MB) of the shared memory for just the GUI when using DVI/HDMI? Well, again, it's better to move on to ATI.

I don't have anything against nVidia, but they REALLY do their best to make blunders like this... :no:
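
Purely as a sanity check on that idea: the working/not-working split in the numbers above does line up with a cutoff of roughly 64Mbit (8MiB) for the primary surface. A minimal sketch, assuming that limit (the limit is a guess from the pattern, not anything nvidia documents):

# Speculative: compare each mode against a hypothetical 64 Mbit (8 MiB)
# primary-surface limit inferred from the working/not-working pattern.

LIMIT_BITS = 64 * 2**20  # 67,108,864 bits = 8 MiB

cases = {
    "1920x1080 @ 32bpp":    1920 * 1080 * 32,
    "1920x1200 @ 32bpp":    1920 * 1200 * 32,
    "2x 1280x1024 @ 32bpp": 1280 * 1024 * 32 * 2,
    "2x 1280x1024 @ 24bpp": 1280 * 1024 * 24 * 2,
}

for label, bits in cases.items():
    verdict = "fits" if bits <= LIMIT_BITS else "exceeds"
    print(f"{label}: {bits:,} bits {verdict} the 64Mbit guess")

The two modes that work stay under that figure and the two that don't exceed it, which at least fits the theory.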


Nvidia does make a point of saying the chipset supports "HD resolutions" over HDMI, but says nothing about the color depth. That may be on purpose (1920x1080 is still HD, even if it's only 1080i and not 1080p).

In fact, 1920x1080 is full HD (1080p), at a 16:9 aspect ratio. 1920x1200 has a 16:10 aspect ratio and can display full HD, but it'll leave a black unused area at the top and bottom of the screen. ;)


If you'd find it useful, I can try it on my passively cooled Sapphire HD 4670 over HDMI (I already know it works perfectly over DVI, I just have to find my HDMI cable) and on the onboard 780G too (ASUS M3A78-EM mobo, AM2+ indeed). I paid the extra $10 not to be stuck with the GeForce 8200 on the cheaper M3N78-VM :yes: (I've finally learned my lesson after buying enough nvidia cards).

