QUOTE(jamesleetech @ Jan 3 2017, 03:18 PM)
Thanks for your reply. Appreciate it.
You said "The receiver/player needs to be set to passthrough (or anything equivalent)". This is the part I did understand earlier: the metadata is "relayed out" without being touched by the AVR/player, so it is the TV/projector display that processes the HDR10 metadata. Yes, whether HDR can be properly processed or not is an ongoing debate.
I should also add that I have heard the TV/projector must be properly calibrated (ISF?), and if the calibration is not done right, HDR can make the picture worse, not better.
My question was not about the complexity of Dolby Vision or even HDR. It's about why any player must be "compatible" just to pass the HDR signal through to the display. I could not find the answer to this, and you could not answer it either when you replied "can't answer why some players can passthrough HDR and some can't", if player compatibility is not required.
IF, again IF, player compatibility is required, then the same should apply to the HT AVR/pre-amp when it sits between the player and the display. Correct me if I am wrong... I haven't heard of, and could not find, anything that says the AVR/pre-amp needs to be compatible. If compatibility were needed, I would expect an HDR logo printed on the front of the AVR/pre-amp (similar to the Atmos logo), but I have not seen one.
Do correct me if I have misunderstood you... as you have said, non-HDR-ready TVs simply don't do anything with the "additional metadata" they receive, so the picture will not be affected. If that is the case, why did Oppo put an HDR "on/off" setting into the 203 when the HDR metadata does not need to be filtered out for a non-HDR TV/projector? Switching HDR on or off does affect the picture, as someone on Facebook has already tested. What does the Oppo 203 do to the signal when it is set to HDR On or HDR Off? By right, a non-HDR TV/projector just ignores the metadata, so HDR should not affect the picture... or am I wrong here?
Displays that are capable of reproducing proper HDR values (at least 1000 nits of peak brightness on a 10% window) follow a different set of calibration values. As for HDR making things worse, yup - just take a look at all those displays that really only have 400 nits of brightness and pretend that's HDR. Edge-lit TVs are the worst offenders, with haloing and blown-out whites. Unfortunately, the majority of people think it's awesome because... "HDR". Lol.
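
To make the "metadata" part less abstract, here's a rough Python sketch (purely illustrative, not any player's actual firmware) of the static metadata that rides alongside an HDR10 stream, plus a naive check of whether a display's real peak brightness can even cover it. Field names follow SMPTE ST 2086 / CTA-861 terminology; the numbers are made up for the example.

# Rough sketch: HDR10 static metadata and a naive "can this panel cover it?" check.
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    max_cll: int            # Maximum Content Light Level, in nits (cd/m^2)
    max_fall: int           # Maximum Frame-Average Light Level, in nits
    mastering_peak: int     # mastering display max luminance, in nits
    mastering_black: float  # mastering display min luminance, in nits

def display_covers_content(meta: HDR10StaticMetadata, display_peak_nits: float) -> bool:
    """True if the panel can show the content's brightest highlight without clipping or heavy tone mapping."""
    return display_peak_nits >= meta.max_cll

meta = HDR10StaticMetadata(max_cll=1000, max_fall=400, mastering_peak=1000, mastering_black=0.005)
print(display_covers_content(meta, 400))   # a "400-nit HDR" edge-lit TV -> False
print(display_covers_content(meta, 1000))  # a display that actually hits 1000 nits -> True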
Perhaps not entirely on-topic, but hear me out for a minute. I've done some digging into the HDR compatibility thing via the PS4. Yes, I know it's a game console, but it's also a media player at the same time. The thing is, it's relevant because it never had HDR capability until just over 2 months ago.
By checking out some of the design prints and schematics, it seems to me that the HDMI controller on the PS4 (a custom Panasonic MN86471A) needs to be able to pass HDR through in the first place. In this case, because the PS4's southbridge already encrypts all output to HDCP 2.2, all the HDMI controller needed was a firmware update to patch in support (which Sony delivered).
Now to return to the question at hand: is disc player compatibility actually required?
From what I understand, many players lack support for various reasons, such as not having the right HDMI controller or, as is already infamous, the HDMI port not actually being a true HDMI 2.0a port with HDCP 2.2 support.
Remember how in 2015 we had receivers with only 2 or 3 ports that were HDMI 2.0a/HDCP 2.2 capable? In 2016, we have countless displays where only ONE HDMI port is capable of receiving HDR metadata, 4K at 4:4:4 at 8-bit, or 4K at 60 Hz.
All this leads me to believe that the HDMI controllers (and there are so many of them) are the culprit here. To take the PS4 as an example again (since it's about the only mainstream media player that received a firmware update turning it into an HDR-capable machine), manufacturers can indeed patch HDR support into their receivers/players IF the HDMI controller is capable of handling the data to begin with.
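
To put that argument in concrete terms, here's a toy sketch of the gating logic. This is just my reading of it, not any vendor's real qualification process, and the field names are hypothetical.

# Toy sketch: a firmware update can only "add" HDR if the silicon already has the plumbing.
from dataclasses import dataclass

@dataclass
class HdmiController:
    supports_hdcp_2_2: bool         # needed for protected UHD content
    can_signal_hdr_infoframe: bool  # the HDR metadata signalling added in HDMI 2.0a

def hdr_patchable_via_firmware(chip: HdmiController) -> bool:
    # No update can add missing hardware support after the fact.
    return chip.supports_hdcp_2_2 and chip.can_signal_hdr_infoframe

print(hdr_patchable_via_firmware(HdmiController(True, True)))    # patchable, as the PS4 turned out to be
print(hdr_patchable_via_firmware(HdmiController(False, False)))  # stuck, no matter what the marketing says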
Unfortunately, it's impossible to tell which player/receiver has which HDMI controller, since there are so many of them. Also, given that the majority of manufacturers refresh their product cycle every 8-14 months (not everyone is Oppo, you know, telling people it'll be ready when it's ready), it's economically obvious to many that rather than spending time on old products, cashing in on the feature with next year's product is the wiser choice (unless you have something like 45 million units sold on a single SKU).
As for the last question, I don't have an Oppo 203, so I can't comment much on the player giving different results. By right, if the player is connected to a non-HDR-capable TV, the option should simply be greyed out entirely (which is what the PS4 and Xbox One S do).
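
Greying the option out is straightforward for a source device, because the display announces its HDR support in its EDID. Below is a simplified sketch (not production-grade EDID parsing) that scans the CTA-861 extension for the HDR Static Metadata block and checks whether the sink advertises the SMPTE ST 2084 (PQ) transfer function that HDR10 uses; the byte walking is deliberately minimal.

# Simplified sketch: does the attached display advertise HDR10 in its EDID?
def sink_advertises_hdr10(edid: bytes) -> bool:
    if len(edid) < 256 or edid[128] != 0x02:       # no CTA-861 extension block present
        return False
    cta = edid[128:256]
    dtd_offset = cta[2]                            # data blocks sit between byte 4 and this offset
    i = 4
    while i < dtd_offset:
        tag, length = cta[i] >> 5, cta[i] & 0x1F
        if tag == 7 and length >= 2 and cta[i + 1] == 0x06:  # extended tag 6 = HDR Static Metadata
            eotfs = cta[i + 2]
            return bool(eotfs & 0x04)              # bit 2 = SMPTE ST 2084 (HDR10's PQ curve)
        i += 1 + length
    return False

# Player UI logic would then be something like (helpers here are hypothetical):
# hdr_toggle.enabled = sink_advertises_hdr10(read_edid_from_hdmi())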
Can you link me to that FB post where someone tested HDR on a non-HDR-capable display? It sounds to me like the processing is being done on the player itself before the signal is sent out to the display (basically like "bitstreaming" the entire video signal over after post-processing).
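
If that's what the Oppo is doing with "HDR Off", one plausible interpretation is that it tone maps the HDR signal down to SDR before output. Here's a minimal sketch of that idea: the standard PQ (ST 2084) constants plus a crude Reinhard-style roll-off standing in for whatever curve the player actually uses.

# Minimal sketch: decode PQ to absolute nits, then roll highlights off for a ~100-nit SDR display.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(n: float) -> float:
    """ST 2084 EOTF: code value in [0, 1] -> absolute luminance in nits."""
    p = max(n, 0.0) ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def tone_map_to_sdr(n: float, sdr_peak: float = 100.0) -> float:
    """Very rough HDR->SDR: Reinhard roll-off, then a crude gamma re-encode."""
    nits = pq_to_nits(n)
    mapped = nits / (nits + sdr_peak)   # compresses the full range into 0..1
    return mapped ** (1 / 2.4)

print(round(pq_to_nits(0.508), 1))      # roughly 100 nits, i.e. around SDR reference white
# A player's "HDR Off" path would apply something like tone_map_to_sdr() per pixel before output.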
Good discussion though.

QUOTE(teop @ Jan 3 2017, 04:01 PM)
I'm no expert here, just writing my thoughts for discussion.
I too think that anything in the digital domain, when left unprocessed, should come out identical at the end of the signal path. But I have my doubts after trying to rip audio CDs. I found out that what you rip may not be bit-for-bit accurate, due to both the quality of the source CD and the CD reader. And the thing is, there is no way of knowing if there is an error. That tells me this path lacks error detection or correction.
When I think about it, that sounds logical: streamed data is real-time and hence time-sensitive, so the system must be tolerant of errors. Data files are different; accuracy matters more than timeliness, the system can afford to retry, and so it includes more thorough error detection and correction to ensure accuracy.
So in that sense, it may be possible that Blu-ray players are allowed to read audio with errors to a certain extent. The same goes for audio bitstreaming. If this is true, then it is desirable to have as short a signal path as possible...
Which ripping software did you use? Also, what settings? Different rippers use different algorithms, and it has been objectively shown that many ripping programs do not always produce a perfect, bit-matched rip.
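
If you want to take the guesswork out of it, the basic check is simple: rip the same track twice (different drive or different ripper settings) and compare the PCM payloads. A quick sketch, assuming both rips are WAV files (the file names are just placeholders); dedicated tools that verify against AccurateRip-style databases go further, but this answers the basic "same bits or not?" question.

# Quick sketch: hash only the PCM payload so container/header differences are ignored.
import hashlib
import wave

def pcm_sha256(path: str) -> str:
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return hashlib.sha256(frames).hexdigest()

rip_a, rip_b = "track01_drive1.wav", "track01_drive2.wav"   # hypothetical file names
if pcm_sha256(rip_a) == pcm_sha256(rip_b):
    print("Bit-identical PCM - any audible difference comes from downstream of the rip")
else:
    print("The rips differ - drive, ripper settings or disc condition introduced errors")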
It's the same case with remuxing Blu-ray movies. I've come across some remux rips using DTS-HD as the codec where, somehow, the bitrate is only 1.5 Mbps instead of above 3 Mbps (the average for DTS-HD tracks). Sure enough, I took the same disc version, compared them A-B, and there is an absolute difference that is noticeable to my ears.
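
For remuxes, a quick sanity check before trusting the track is to ask ffprobe (assuming it's installed) what the first audio stream's codec and bitrate actually are. Some containers don't expose bit_rate directly, in which case the mkvmerge-style BPS statistics tag is a common fallback; the file name below is just a placeholder.

# Sketch: read the first audio stream's codec, profile and bitrate with ffprobe.
import json
import subprocess

def audio_stream_info(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "a:0",
         "-show_entries", "stream=codec_name,profile,bit_rate:stream_tags=BPS",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

info = audio_stream_info("Movie.Remux.mkv")   # hypothetical file name
bps = info.get("bit_rate") or info.get("tags", {}).get("BPS")
print(info.get("codec_name"), info.get("profile"), bps)
# A "DTS-HD MA" track reporting only ~1.5 Mbps is suspicious - it may just be the lossy DTS core.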
These are what you would call confounding variables.
But with all else being equal,
I still hear no difference.