azagahl Posted June 15, 2005

My NVIDIA video card has two outputs - an ordinary VGA connector and a DVI-D connector. My LCD monitor has the same kinds of inputs. However, when I view the DVI-D output on my monitor, the picture looks chunky and very bright, and the number of colors is greatly reduced. Is DVI-D just intended for TVs?
Zxian Posted June 15, 2005

DVI = Digital Visual Interface. DVI is supposed to be better for LCD monitors, since LCDs render images digitally, while CRT monitors use analog signals to draw the picture. If you're having trouble, I'd look into your drivers or any kind of "auto-adjust" setting on the monitor. And to be complete... no, DVI-D is not just meant for TVs.
azagahl Posted June 16, 2005

I gave DVI-D another try and switched my monitor from the User color settings to 5400 K, and it looks nice now. I also used Norman Koren's gamma test patterns to tune for gamma 2.2. I can't tell whether DVI-D is better than VGA - they both look spectacular on this LCD.
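(The gamma-2.2 tuning mentioned above follows a simple power law. Here's a minimal sketch of the idea - my own illustration, not Norman Koren's actual test patterns; the function names are made up for this example.)

```python
# Gamma encoding/decoding with the simple power-law model.
# A display tuned to gamma 2.2 maps stored pixel values back to
# light output as value ** 2.2, so content is stored pre-brightened.

def encode_gamma(linear, gamma=2.2):
    """Convert a linear intensity in [0, 1] to a gamma-encoded value."""
    return linear ** (1.0 / gamma)

def decode_gamma(encoded, gamma=2.2):
    """Convert a gamma-encoded value back to linear intensity."""
    return encoded ** gamma

mid = encode_gamma(0.5)          # 50% linear light is stored brighter
print(round(mid, 3))             # roughly 0.73
print(round(decode_gamma(mid), 3))  # round-trips back to 0.5
```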
Zxian Posted June 16, 2005

> I gave DVI-D another try and switched my monitor from the User color settings to 5400 K, and it looks nice now. I also used Norman Koren's gamma test patterns to tune for gamma 2.2. I can't tell whether DVI-D is better than VGA - they both look spectacular on this LCD.

Is there an option for 9300K colour? It'll give you a much brighter display with more vibrant colours. Technically DVI-D is better than VGA, but unless you're really finicky, you probably won't be able to tell the difference.
azagahl Posted June 16, 2005 Author Share Posted June 16, 2005 (edited) 9300K colourI didn't try that; I thought 5400K or 6500K were "standard"; wouldn't 9300 K look extremely bluish?I'll give it a try.. I think on an LCD its phony anyway because they don't actually heat anything, it's just simulated. On mine you can set R G and B factors directly. Edited June 16, 2005 by azagahl Link to comment Share on other sites More sharing options...
Zxian Posted June 19, 2005

5400K is meant to simulate the light emitted by a standard incandescent light bulb, while 9300K represents the light from the sun (since the surface of the sun is ~9300K). 9300K is a much "whiter" white, while 5400K tends to be a bit more yellow.

If you want to understand a bit more about how this works, read up on blackbody radiation and emission. As things get hotter, they emit more light at higher frequencies (shorter wavelengths = bluer, toward violet). That's why a relatively "cool" electric stove element looks red - it's not hot enough to emit the shorter wavelengths that would mix with the red to give you white light.
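(The blackbody argument above can be made concrete with Wien's displacement law, which gives the wavelength where a blackbody at temperature T emits most strongly. A quick sketch - the function name is my own, and this is the idealized law, not a model of any real monitor preset:)

```python
# Wien's displacement law: peak_wavelength = b / T, where
# b ~= 2.898e-3 m*K is Wien's displacement constant.
# Hotter bodies peak at shorter (bluer) wavelengths.

WIEN_B = 2.898e-3  # Wien's displacement constant, in metre-kelvins

def peak_wavelength_nm(temp_k):
    """Peak blackbody emission wavelength in nanometres for temp_k kelvin."""
    return WIEN_B / temp_k * 1e9

for t in (5400, 6500, 9300):
    print(t, "K ->", round(peak_wavelength_nm(t)), "nm")
# 5400 K peaks in the green (~537 nm), 6500 K toward blue (~446 nm),
# and 9300 K already peaks in the ultraviolet (~312 nm), which is why
# the visible portion of its spectrum skews blue.
```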
azagahl Posted June 19, 2005

> (since the surface of the sun is ~9300K).

It's only 5780 K: http://en.wikipedia.org/wiki/Sun

So 5400K is the common choice for simulating daylight. I don't know what 6500K is for - indoor lighting? It seems 6500K is somewhat of a standard choice for computer graphics. Not sure about photography... 9300K looks far too blue.