Quote:
Originally Posted by chros
Do you still use the TV's own dynamic tonemapping this way? I assume not.
Have you managed to compare the TV's own dynamic tonemapping vs the way you use it now? (I haven't yet, or maybe I did.)
I'm asking because I compared them a while ago in SDR mode (OLED light was at the max of 100), and the produced image was softer than with HDR passthrough; the latter was way sharper.
(I use the TV in PC mode for getting chroma 4:4:4)
|
Quote:
Originally Posted by Warner306
The softer image would come from poorer EOTF tracking. Were you still doing a gamma conversion with a 2.40 3D LUT? That could make things softer.
The HDR output option should only shave the brightest highlights from every source and eliminate most tone mapping by the display, because the source will arrive pre-compressed. That should be very similar to the way the TV already works. The BT.2390 roll-off doesn't compress 0-100 nits (the knee point starts higher) when you set the output to 700 nits.
Some have claimed that the C8 and C9 are boosting the brightness of the image at times with dynamic tone mapping. I don't know if that is true or not. If you compressed the source and sent it to the display with dynamic tone mapping enabled, it could also influence the result, but I don't know by how much.
|
It does increase the brightness at times. I don't particularly mind, even if it's technically not correct. I keep dynamic tone mapping on because, in theory, the TV should have LESS tone mapping to do given that madVR has already tone mapped the image before sending it. I'm quite impressed with the quality of the output.
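For anyone curious, Warner306's point about the 700-nit knee can be checked with a few lines of Python. This is just my reading of the BT.2390 EETF (Hermite-spline roll-off above a knee in PQ space), not madVR's actual code, and it assumes a source normalized to the full 10,000-nit PQ range:

```python
# Sketch of the BT.2390 roll-off (my reading of the spec, not madVR's code).
# PQ constants from SMPTE ST 2084; 10,000-nit source ceiling is an assumption.

def pq_encode(nits: float) -> float:
    """Convert linear luminance (nits) to a normalized PQ signal (0..1)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def bt2390_eetf(e: float, target_nits: float) -> float:
    """BT.2390 roll-off: identity below the knee, Hermite spline above it."""
    max_lum = pq_encode(target_nits)   # target peak in PQ space
    ks = 1.5 * max_lum - 0.5           # knee start per BT.2390
    if e < ks:
        return e                       # untouched below the knee
    t = (e - ks) / (1 - ks)
    return ((2 * t**3 - 3 * t**2 + 1) * ks
            + (t**3 - 2 * t**2 + t) * (1 - ks)
            + (-2 * t**3 + 3 * t**2) * max_lum)

# With a 700-nit target the knee sits well above 100 nits in PQ space,
# so 0-100 nit content passes through uncompressed:
e100 = pq_encode(100)                       # ~0.51 in PQ
print(1.5 * pq_encode(700) - 0.5)           # knee start, ~0.57 (> PQ of 100 nits)
print(bt2390_eetf(e100, 700) == e100)       # 100 nits comes out untouched
```

So with a 700-nit output, anything at or below 100 nits really is left alone, and only the range above the knee gets rolled off toward the target peak.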
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
|