Maybe I got the above wrong...
Spent a little time trying to understand this... 2160p, 60Hz, 24Hz, 23.976Hz, 10-bit, YCbCr 4:4:4, 4:2:0, HDMI 2.0, HDCP 2.2...
After playing around with my Q10 Pro, 4K AVR and 4K TV... am I right to say the below? True or false?
.. if a video is 1920x1080, setting the player to 2160p does nothing; at most the TV may upscale it.
.. if a video is 23.976fps, the correct rate to set on the player is 24Hz; 25, 30, 50, 60Hz can work but don't improve the video.
.. RGB is meant for older, lower-grade videos like DVI, AVI; YCbCr is better for newer videos.
.. YCbCr 4:2:0 8-bit 60Hz is a better setting than YCbCr 4:4:4 10-bit 24Hz unless the video is 10-bit.
Maybe some of you can help me out, thanks!
Set the player to auto... the player will output FHD... the TV will upscale. If you set the player to 4K, the player upscales and the TV does nothing.
23.976 if available... 24 is the closest if not.
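To put rough numbers on that (just a sketch using the standard 24000/1001 film rate; nothing here is measured from the Q10 Pro):

```python
# Refresh-rate arithmetic for "23.976" fps film content (24000/1001 exactly).
source_fps = 24000 / 1001

for display_hz in (23.976, 24.0, 50.0, 60.0):
    print(f"{display_hz:>7.3f} Hz -> {display_hz / source_fps:.4f} refreshes per source frame")

# 60 Hz gives ~2.5025, so frames alternate between 3 and 2 refreshes (3:2 pulldown judder).
# A fixed 24.000 Hz output is off by 24/1001 fps, i.e. one repeated frame roughly every:
print(f"{1 / (24.0 - source_fps):.1f} s")   # ~41.7 s
```

So 60Hz plays it, but with an uneven cadence; 24Hz is close enough that you only get a tiny correction about every 42 seconds, and 23.976 avoids even that.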
Not really... RGB vs YCbCr isn't about old versus new videos; they're just two ways of carrying the same colour information.
Not really... it depends more on what the source actually is and on which device handles the conversions better.
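For context on why those particular combinations show up in the menus at all: HDMI 2.0's 18 Gbps link carries roughly 14.4 Gbps of actual pixel data, and a rough active-pixel calculation (a sketch that ignores blanking, so real requirements are a bit higher) shows which modes fit:

```python
# Rough HDMI 2.0 bandwidth check (active pixels only; the real signal also
# includes blanking, so this understates the true requirement a little).
HDMI20_DATA_GBPS = 14.4   # ~18 Gbps link minus TMDS (8b/10b) encoding overhead

def gbps(width, height, hz, bits, chroma):
    # samples per pixel: 4:4:4 carries 3 full components, 4:2:0 averages 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bits * samples / 1e9

for hz, bits, chroma in [(60, 8, "4:4:4"), (60, 10, "4:4:4"),
                         (60, 10, "4:2:0"), (24, 10, "4:4:4")]:
    need = gbps(3840, 2160, hz, bits, chroma)
    verdict = "fits" if need <= HDMI20_DATA_GBPS else "too much for HDMI 2.0"
    print(f"2160p{hz} {chroma} {bits}-bit: ~{need:.1f} Gbps ({verdict})")
```

Which is why 60Hz output gets limited to 8-bit 4:4:4 or to 4:2:0, while 24Hz has headroom for 4:4:4 10-bit.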
Movies are typically encoded as 4:2:0 (not sure about 4K). For you to see it, it's converted back like this...
. YCbCr to RGB.
Your settings decide who gets to convert what. Your picture quality depends on who does a better job...player or TV.
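If it helps, the conversion being described is basically one matrix step; here's a minimal sketch using the BT.709 coefficients (the function name and full-range values are just for illustration; a real player/TV also upsamples the chroma from 4:2:0 to 4:4:4 and handles limited-range 16-235 video around this step):

```python
# Minimal YCbCr -> RGB matrix step (BT.709 coefficients), illustration only.

def ycbcr709_to_rgb(y, cb, cr):
    """y in [0, 1], cb/cr in [-0.5, 0.5] -> (r, g, b) in [0, 1], clipped."""
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return tuple(min(max(v, 0.0), 1.0) for v in (r, g, b))

# Pure grey has no chroma, so R = G = B = Y:
print(ycbcr709_to_rgb(0.5, 0.0, 0.0))   # (0.5, 0.5, 0.5)
```

Whether the player or the TV does that step (and the chroma upsampling before it) is exactly what the output settings decide.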