Quote:
Originally Posted by nevcairiel
min/max_display_mastering_luminance describe the display which was used to master this content, not the content itself.
What you probably meant is MaxFALL and MaxCLL?
In any case, the overall luminance of the image should not change from a lossy encoding (and even if it did, it would be miniscule changes at worst), so using the value you calculated on the original frames is fine.
|
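For reference, MaxCLL and MaxFALL as mentioned in the quote are defined per CTA-861.3: a pixel's light level is the maximum of its R, G, B components in nits, MaxCLL is the brightest such pixel in the whole stream, and MaxFALL is the highest per-frame average. A minimal sketch of that calculation (my own illustration, assuming frames are already decoded to linear-light nit values; `max_cll_fall` is a made-up helper name):

```python
import numpy as np

def max_cll_fall(frames):
    """Compute (MaxCLL, MaxFALL) from a sequence of frames.

    Each frame is an (H, W, 3) array of linear-light RGB values in nits
    (i.e. already passed through the PQ EOTF).
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        # Per CTA-861.3, a pixel's light level is max(R, G, B)
        light = frame.max(axis=2)
        max_cll = max(max_cll, float(light.max()))   # brightest pixel overall
        max_fall = max(max_fall, float(light.mean()))  # brightest frame average
    return max_cll, max_fall

# Toy example: two 2x2 frames with known nit values
f1 = np.full((2, 2, 3), 100.0)      # uniform 100-nit frame
f2 = np.full((2, 2, 3), 100.0)
f2[0, 0] = [1000.0, 50.0, 50.0]     # one 1000-nit highlight
cll, fall = max_cll_fall([f1, f2])
print(cll, fall)  # 1000.0 325.0
```

As the quote says, running this on the original frames is fine, since lossy encoding barely moves these values.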
Like this?
https://forum.doom9.org/showpost.php...&postcount=207
Also, some user here showed something similar with x265 output, where the entire lighting seems to have changed perceptually.
Both are 10->8-bit conversion results.
Though I guess it's just that Intel is more efficient with its chroma handling, which makes it look as if more of the pixels responsible for the indirect lighting were captured and better preserved in the end, so overall a higher perceptually visible compression efficiency in the HDR space compared to Nvidia. Or they just give chroma a higher per-frame distribution priority to look closer to the source in HDR.
Or it's simply a better 10->8-bit conversion of unknown origin.
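Even the plain scaling step of a 10->8-bit conversion can differ between implementations. A minimal sketch (my own illustration, not any vendor's actual pipeline) of the difference between truncating the two low bits and rounding before the shift:

```python
import numpy as np

def to8_truncate(x10):
    # Naive conversion: drop the two low bits; biases values slightly darker
    return (x10 >> 2).astype(np.uint8)

def to8_round(x10):
    # Rounded conversion: add half an 8-bit LSB (2 in 10-bit units) before shifting
    return np.clip((x10.astype(np.int32) + 2) >> 2, 0, 255).astype(np.uint8)

x = np.array([0, 2, 3, 511, 512, 1023], dtype=np.uint16)
print(to8_truncate(x))  # [  0   0   0 127 128 255]
print(to8_round(x))     # [  0   1   1 128 128 255]
```

A conversion that also dithers (instead of just rounding) would differ again, which is enough to shift shadow detail perceptually.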
As you can clearly see, the distribution differs, and Nvidia seems to invest its psy optimization in other areas than Intel does. Quite fascinating to see this without needing to measure the difference.
AMD and Nvidia seem pretty identical in their output; only Intel differs in this specific respect.