QUOTE(Lego Warfare @ Mar 30 2025, 12:58 AM)
Except for videophiles who will only focus on PQ, similar to audiophiles who nitpick on details and clarity. They focus on the technicals more than the music/movie experience itself. Yet many will fail blind tests when it comes to differentiating 24-bit vs 16-bit audio, or even 10-bit vs 12-bit video. There is only so much our eyes/ears can perceive for our brain to process. And sitting closer to the screen to envelop your field of vision is so wrong, especially when you have proper speaker placement for accurate sound staging.
You may have your opinion, but sorry, I still stand by mine when I say that immersion = size matters. Growing up with Laser Discs and projectors with 7.1 speakers, through LED and now OLED, I've lived with and experienced them long enough not to get caught up in newer tech hype that detracts from my personal enjoyment. I'd still be more immersed in a 120-inch projector screen with a 1080p Blu-ray than a 65-inch OLED with a 4K Blu-ray.
I don't think there are commercially available 12-bit OLED TVs yet (not sure about some 12-bit phone panels in the market). Current reference-grade displays like the Sony HX3110 or Flanders Scientific XMP650 (QD-OLED) are still 10-bit color depth, with peak luminance increasing each iteration.
While I am not sure how 12-bit HDR vs 10-bit HDR would look, I am sure I can tell an 8-bit SDR video from a 10-bit HDR one (a big step up from 16.7 million colors to 1.07 billion colors; 12-bit is 68.7 billion colors), which translates to much smoother gradients in the color reproduction, more "life-like", and again more noticeable in 4K HDR videos on a D65-calibrated display.
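For anyone curious where those color counts come from, here's a quick back-of-the-envelope sketch (just the standard 2^bits-per-channel arithmetic, nothing display-specific):

[code]
# Rough arithmetic behind the color counts quoted above:
# an N-bit-per-channel RGB signal has 2**N levels per channel,
# and (2**N)**3 total representable colors.
for bits in (8, 10, 12):
    levels = 2 ** bits      # steps per R/G/B channel
    colors = levels ** 3    # total RGB combinations
    print(f"{bits}-bit: {levels} levels/channel, ~{colors:,} colors")

# 8-bit:  256 levels  -> ~16.7 million colors
# 10-bit: 1024 levels -> ~1.07 billion colors
# 12-bit: 4096 levels -> ~68.7 billion colors
[/code]

More levels per channel means smaller steps between adjacent shades, which is why 10-bit gradients (skies, fog, dark scenes) band less than 8-bit ones.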
HDR is a different kind of immersiveness; I think the more proper word would be that it lets you "submerge" into a "life-like" visual experience. Also, HDR is not hype and certainly not "newer tech hype"; it is about expanding the range of luminance (contrast ratio), again to be more "life-like". The wider the range, the finer the display can resolve colors and details. Once you experience an OLED TV hung on the wall in a completely dark room (to the point you can't even see the border of the TV), with fireworks, a starry night or the bright sun floating/glaring right in front of you, you probably wouldn't say the same about HDR.
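To make the "expanding the range of luminance" part concrete: HDR10/Dolby Vision content is encoded with the SMPTE ST 2084 PQ curve, which maps code values to absolute luminance all the way up to 10,000 nits, versus SDR's nominal ~100-nit reference white. A rough Python sketch of the published PQ EOTF, just to show the scale involved (constants are from the ST 2084 spec; treat the output as approximate):

[code]
# SMPTE ST 2084 (PQ) EOTF: normalized code value E' in [0, 1] -> luminance in nits.
m1 = 2610 / 16384        # 0.1593...
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(e):
    """Map a normalized PQ code value (0..1) to absolute luminance in cd/m^2 (nits)."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# Full-range signal tops out at 10,000 nits; real masters are usually graded
# to 1,000 or 4,000 nits, still far beyond SDR's ~100-nit reference white.
for e in (0.25, 0.5, 0.75, 1.0):
    print(f"code {e:.2f} -> {pq_eotf(e):8.1f} nits")
[/code]

That jump in encodable luminance range is what lets small, very bright highlights (fireworks, specular glints) pop against a truly dark background on an OLED.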
I own OLED, QD-OLED, FALD LED, a 700-nit Art OLED laptop (with overall Delta E < 1) and a 1,500-nit RGB OLED phone from a fruit company. I can tell you one thing for sure: the higher the brightness and contrast of the display, the better it renders "life-like" detail and color in HDR specular highlights. It's actually those little technical steps up that make HDR videos more impactful. E.g. between a 700-nit and a 1,000-nit OLED TV, the higher-brightness set will always produce a more "correct" feel, colors and "depth" in a dark scene composition (furniture, lights, surroundings). This also helps post-production/colorists achieve their intended "feel and mood" for the movie/scene.
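A toy illustration of why peak brightness matters for specular highlight detail (this is just a naive hard-clip model made up for the example, not how any real TV's tone mapping works): if the master grades highlights up to 1,000 nits, a 700-nit panel has to clip or compress everything above its peak, so distinct highlight levels collapse together, while a 1,000-nit panel can show them as graded.

[code]
# Hypothetical example: specular highlight values (nits) as graded in the master.
highlights = [650, 750, 850, 950, 1000]

def hard_clip(nits, panel_peak):
    """Naive model: anything above the panel's peak luminance simply clips to the peak."""
    return min(nits, panel_peak)

for peak in (700, 1000):
    shown = [hard_clip(h, peak) for h in highlights]
    print(f"{peak}-nit panel shows: {shown}")

# 700-nit panel:  [650, 700, 700, 700, 700]  -> four distinct levels collapse into one
# 1000-nit panel: [650, 750, 850, 950, 1000] -> highlight gradation preserved
[/code]

Real TVs roll off highlights more gracefully than a hard clip, but the headroom argument is the same: the more nits a panel has, the less of the master's highlight range it has to squeeze.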
I don't want to get into the debate over which is more immersive, as that is completely subjective; just trying to give my opinion on what HDR is.

But I am sure you understand my point since you have a projector, an OLED TV and a photography rig (pretty sure you already have a dark room/environment). I guess if you still watch a lot of SDR 100-nit movies, a projector is perfectly fine.
*I am not an AV guy, videophile, video production person or colorist, or whatever you want to call it, by any means. I just like HDR movies and gaming as a hobby. Peace