QUOTE(fun_feng @ Nov 5 2009, 03:40 PM)
Firstly, each pixel is described by 32 bits (0s and 1s), so a Full HD 1080p frame is roughly 2 million pixels x 32 bits = 64 million bits.
Assume a 60 Hz refresh rate, which means 60 x 64 million bits = 3.84 Gbps, plus additional bits for CRC, audio and god knows what else... still well within the range of HDMI 1.3 (10.2 Gbps).
You must understand that a 0 or 1 is not simply ON or OFF in this case. These 32 bits represent a code that the TV understands, telling it what colour to display at that pixel.
Then you must also understand something called a CRC (Cyclic Redundancy Check). These bits travel through the cable in "packets". Any corrupted bits in a packet get detected by the CRC calculation the TV performs. Once the TV knows the packet for a particular pixel is spoilt, it will most probably display a default colour for it, white if I am not wrong.
You are correct that you will still get a picture, just not the FULL picture. If a few thousand pixels are corrupted, your picture will most probably have white patches on it. What you will not get is a clearer picture where red is redder, the sky is bluer, etc.
Anyway, the probability of getting corrupted bits is probably around 0.0001%.
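To spell out the arithmetic in the quote above, here is a quick back-of-the-envelope sketch. It assumes the 32 bits per pixel figure from the quote and ignores blanking intervals and link encoding overhead, so it is a rough estimate only; using the exact 1920x1080 pixel count gives a little more than the rounded 3.84 Gbps.

# Rough 1080p60 bandwidth estimate (ignores blanking and encoding overhead)
width, height = 1920, 1080          # 1080p active pixels
bits_per_pixel = 32                 # figure used in the quote above
refresh_hz = 60

bits_per_frame = width * height * bits_per_pixel   # ~66.4 million bits
bits_per_second = bits_per_frame * refresh_hz      # ~3.98 Gbps

print(f"{bits_per_frame / 1e6:.1f} Mbit per frame")
print(f"{bits_per_second / 1e9:.2f} Gbps, vs 10.2 Gbps for HDMI 1.3")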
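And for anyone curious how a CRC catches a corrupted packet, here is a toy illustration only, not the actual scheme HDMI uses: the sender attaches a checksum computed over the data, and the receiver recomputes it and compares.

import zlib

# Sender computes a CRC over the payload and sends it along with the packet.
payload = bytes([0x12, 0x34, 0x56, 0x78])   # pretend pixel data
crc_sent = zlib.crc32(payload)

# One bit gets flipped somewhere along the cable.
corrupted = bytes([payload[0] ^ 0x01]) + payload[1:]

# Receiver recomputes the CRC and compares against what was sent.
print(zlib.crc32(payload) == crc_sent)      # True  - packet accepted
print(zlib.crc32(corrupted) == crc_sent)    # False - corruption detected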
The colour bits are a bit more complicated than that, with the subpixels yet to be talked about, but your explanation should be enough for most to get the gist.
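For the curious, the usual way those colour bits are carved up is 8 bits per red/green/blue subpixel, with the remaining bits used for padding or other data in a 32-bit word. A minimal sketch, assuming the common 0xAARRGGBB packing (real links may order the channels differently):

# Unpack one 32-bit pixel value into its 8-bit subpixel channels
# (assumes 0xAARRGGBB ordering; purely illustrative).
pixel = 0xFF3366CC

alpha = (pixel >> 24) & 0xFF    # 0xFF
red   = (pixel >> 16) & 0xFF    # 0x33
green = (pixel >> 8)  & 0xFF    # 0x66
blue  =  pixel        & 0xFF    # 0xCC

print(red, green, blue)         # 51 102 204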
However, moomoos sees something different. Let's leave it at that.
I for one have not seen any differences, and the measurement equipment I have does not show anything different either. That said, picture noise and sharpness are pretty subjective and hard to measure objectively.
Maybe the next time I'm in Ipoh I can take a look and see it on his screen.