QUOTE(lepo @ May 23 2009, 11:34 AM)
guys, is there a significant increase in quality if using an HDMI cable but with an HDMI/DVI adaptor at the LCD end?
btw my graphics card is a Sapphire HD 4830
QUOTE
Because HDMI is electrically compatible with the signals used by Digital Visual Interface (DVI), no signal conversion needs to take place nor is there a loss of video quality when a DVI to HDMI adapter is used.
Source: http://en.wikipedia.org/wiki/HDMI
But HDMI can transfer more video data than DVI.
HDMI video bandwidth
8.16 Gbit/s : 2560×1600 @ 75 Hz
DVI video bandwidth
Single Link : 3.96 Gbit/s : 1920×1200 @ 60 Hz
Dual Link : 7.92 Gbit/s : 2560×1600 @ 60 Hz
Sources:
HDMI Versions and DVI Specifications
Just don't run at 2560×1600 @ 75 Hz and you should be fine.
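If you want to sanity-check those numbers yourself, here's a rough Python sketch. The ~10% blanking overhead is my own assumption (roughly what CVT reduced-blanking timings need), so treat the verdicts as ballpark figures, not gospel:
CODE
# Rough bandwidth check: does a given mode fit within a link's limit?
# The link limits below are total pixel-clock capacity x 24 bpp.
# ASSUMPTION: ~10% extra pixel clock for blanking intervals, roughly
# what CVT reduced-blanking timings need; real modes vary.

LINK_LIMITS_GBPS = {
    "DVI single link": 3.96,  # 165 MHz x 24 bpp
    "DVI dual link": 7.92,    # 2 x 165 MHz x 24 bpp
    "HDMI 1.3": 8.16,         # 340 MHz TMDS, video payload only
}

BLANKING_OVERHEAD = 1.10  # assumed reduced-blanking overhead

def required_gbps(width, height, refresh_hz, bpp=24):
    """Approximate bandwidth a mode needs, in Gbit/s."""
    return width * height * refresh_hz * bpp * BLANKING_OVERHEAD / 1e9

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60), (2560, 1600, 75)]:
    need = required_gbps(w, h, hz)
    print(f"{w}x{h} @ {hz} Hz needs ~{need:.2f} Gbit/s")
    for link, limit in LINK_LIMITS_GBPS.items():
        print(f"  {link}: {'fits' if need <= limit else 'too much'}")

Run it and you get the same conclusions as the specs above: single-link DVI tops out around 1920×1200 @ 60 Hz, dual-link at 2560×1600 @ 60 Hz, and only HDMI 1.3 has room for 2560×1600 @ 75 Hz.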
QUOTE(RegentCid @ May 23 2009, 11:42 AM)
LOL... brother, the human eye can only see up to about 23 FPS... if the FPS is above 25 you won't notice any difference between 25 and 50... to play a game smoothly you just need to keep the FPS above 25... for some people it's 20 FPS, for others 25 FPS.
While we're not likely to consciously notice differences above 30 FPS, our eyes are sensitive to flicker up to around 60 Hz. It is for this reason that staring at a CRT for hours on end is not recommended, and why CRTs can cause eye strain and headaches in people who are sensitive to the flicker.
If you look at a CRT with your peripheral vision and can easily see the screen refreshing or 'moving', then its refresh rate is too low for long-term use. You should then increase the refresh rate from the default 60 Hz to a higher value like 75 or 90 Hz, depending on what the monitor and graphics card support, or get an LCD.
Now, LCD refresh rates are normally fixed at 60 Hz, but this is fine because an LCD's method of refreshing (the pixels stay lit between refreshes) is different from a CRT's (the phosphor glow fades between scans).
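To put numbers on that refresh rate advice, a quick bit of arithmetic (just 1000 ms divided by the rate) shows how much the gap between redraws shrinks as you raise it:
CODE
# Refresh period at common CRT rates: 1000 ms divided by the rate.
for rate_hz in (60, 75, 90):
    print(f"{rate_hz} Hz -> one redraw every {1000 / rate_hz:.1f} ms")

That's 16.7 ms between redraws at 60 Hz, down to 13.3 ms at 75 Hz and 11.1 ms at 90 Hz, which is short enough that most people's peripheral vision stops picking up the flicker.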