Quote:
Originally Posted by WorBry
I'll maybe see how the 2160/50p and 1080/50p series compare when plotted against bits/pixel.
|
The VMAF results:
Perhaps not surprising, given that the vmaf_4k_v0.6.1 model predicts the subjective quality of video displayed on a 4K TV viewed from 1.5 times the screen height, whereas the vmaf_v0.6.1 model predicts the subjective quality of video displayed on a 1080p HDTV viewed from 3 times the screen height.
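For anyone wanting to reproduce the two-model comparison, selecting the model in ffmpeg's libvmaf filter looks roughly like this. Treat it as a sketch: the exact option syntax depends on the ffmpeg/libvmaf build (newer builds take model=version=..., older ones take model_path= pointing at the .json model file), and the file names here are placeholders:

```shell
# Score a 2160p encode against its reference with the 4K model
# (assumes an ffmpeg build linked against libvmaf v2+)
ffmpeg -i encode_2160p.mp4 -i reference_2160p.mp4 \
  -lavfi "libvmaf=model=version=vmaf_4k_v0.6.1:log_path=vmaf_4k.json:log_fmt=json" \
  -f null -

# Score a 1080p encode with the default HDTV model
ffmpeg -i encode_1080p.mp4 -i reference_1080p.mp4 \
  -lavfi "libvmaf=model=version=vmaf_v0.6.1:log_path=vmaf_hd.json:log_fmt=json" \
  -f null -
```

The first input is the distorted clip and the second the reference; the JSON log contains the per-frame scores as well as the pooled aggregate.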
Quote:
Originally Posted by WorBry
In the 1080/50p series an aggregate VMAF=100 score was never attained for precisely the same reason - the VMAF score of the first frame skewed the aggregate score.
|
As seen there, the maximum VMAF score achieved in the 1080/50p series was 99.947, with the lossless (crf0) x264 encode.
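The first-frame skew is just arithmetic: with plain mean pooling, a single outlier frame caps the aggregate below 100 no matter how many perfect frames follow. A minimal sketch with made-up per-frame scores (not the actual measurements):

```python
# Hypothetical per-frame VMAF scores: every frame perfect except the first.
# Shows how mean pooling lets one outlier frame cap the aggregate score.
frame_scores = [92.0] + [100.0] * 499  # 500 frames, low-scoring first frame

aggregate = sum(frame_scores) / len(frame_scores)
print(f"aggregate VMAF: {aggregate:.3f}")   # 99.984 - never reaches 100

trimmed = sum(frame_scores[1:]) / len(frame_scores[1:])
print(f"excluding frame 0: {trimmed:.3f}")  # 100.000
```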
What intrigues me more are the FFMPEG-SSIM results:
It's reasonable to assume that down-scaling the original 2160/50p Crowd Run clip for the 1080/50p tests incurred some loss of fidelity in the 1080/50p source (and reference) clip, making it more 'compressible'. But why is the differential between the bit-matched 1080p and 2160p SSIM scores so much larger at 32-64 bits/pixel than it is down at around 6-8 bits/pixel?
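For what it's worth, here is the normalization I'm assuming behind the bits/pixel axis: bitrate divided by pixel throughput (width x height x fps). The formula and the example bitrate are my assumptions, not figures from the tests, but they show why the axis makes bit-matched encodes at different resolutions comparable:

```python
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Bits spent per displayed pixel per frame (assumed normalization)."""
    return bitrate_bps / (width * height * fps)

# Same bitrate at two resolutions: the 1080p encode gets 4x the bits
# per pixel, since it has a quarter of the pixels to spend them on.
rate = 800_000_000  # 800 Mbps, purely illustrative
print(bits_per_pixel(rate, 3840, 2160, 50))  # ~1.93 for 2160/50p
print(bits_per_pixel(rate, 1920, 1080, 50))  # ~7.72 for 1080/50p
```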