19th March 2017, 12:31   #5034
CruNcher
Registered User
 
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Quote:
Originally Posted by nevcairiel View Post
min/max_display_mastering_luminance describe the display which was used to master this content, not the content itself.
What you probably meant is MaxFALL and MaxCLL?

In any case, the overall luminance of the image should not change from a lossy encoding (and even if it did, the changes would be minuscule at worst), so using the value you calculated on the original frames is fine.
Like this?

https://forum.doom9.org/showpost.php...&postcount=207
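
For reference, this is roughly how MaxCLL / MaxFALL could be derived from the decoded frames themselves. A minimal Python sketch, assuming the frames are already linearized to nits (PQ EOTF applied beforehand) and handed over as numpy RGB arrays; the function name is made up for illustration:

Code:
import numpy as np

def maxcll_maxfall(frames):
    """frames: iterable of HxWx3 numpy arrays, linear-light RGB in nits."""
    max_cll = 0.0   # brightest pixel value (max of R,G,B) in the whole stream
    max_fall = 0.0  # highest per-frame average of that per-pixel maximum
    for rgb_nits in frames:
        max_rgb = rgb_nits.max(axis=2)            # per-pixel max(R, G, B)
        max_cll = max(max_cll, float(max_rgb.max()))
        max_fall = max(max_fall, float(max_rgb.mean()))
    return max_cll, max_fall

That would give content-derived values, independent of the mastering display metadata nevcairiel mentions above.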

Also, another user here showed something similar with x265 output results, where the entire lighting seems to have changed perceptually.

Both are 10->8 bit conversion results.

Though I guess it is just that Intel is more efficient with its chroma handling, which makes it look as if more of the pixels responsible for the indirect lighting were captured and better preserved in the end, so overall a higher perceptually visible compression efficiency in the HDR space compared to Nvidia. Or they simply give chroma a higher per-frame distribution priority to stay closer to the source in HDR.

Or it is just a better 10->8 bit conversion of unknown origin.
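
For context, the gap between a plain truncating 10->8 bit conversion and a rounded/dithered one could look roughly like this; an illustrative sketch only, not any vendor's actual implementation:

Code:
import numpy as np

def convert_10_to_8bit(plane10, dither=True):
    """plane10: 2D numpy uint16 array holding 10-bit values (0..1023)."""
    if dither:
        # Rounding with a little noise spread over the dropped bits
        # helps hide banding in smooth gradients.
        noise = np.random.uniform(0.0, 1.0, plane10.shape)
        out = np.floor(plane10 / 4.0 + noise)
    else:
        # Plain truncation: just drop the two low bits.
        out = plane10 >> 2
    return np.clip(out, 0, 255).astype(np.uint8)

Which variant a driver picks can already change how gradients in dark HDR areas come out, before the encoder even sees the frame.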

As you can clearly see, the distribution differs, and Nvidia seems to invest psy-wise in other parts than Intel does. Quite fascinating to see this without needing to measure the difference.

AMD and Nvidia seem pretty identical in their output; only Intel differs in this specific respect.
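
If someone did want to measure it, even a crude per-plane mean absolute difference against the source would already show whether an encoder favours luma or chroma; a quick sketch, assuming the decoded planes are available as numpy arrays:

Code:
import numpy as np

def per_plane_mad(src, enc):
    """src, enc: dicts of 2D numpy arrays, e.g. {'Y': ..., 'U': ..., 'V': ...}.
    Returns the mean absolute difference per plane."""
    return {name: float(np.abs(src[name].astype(np.int32)
                               - enc[name].astype(np.int32)).mean())
            for name in src}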

Last edited by CruNcher; 19th March 2017 at 13:35.