26th August 2020, 18:34   #17
benwaggoner
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
Quote:
Originally Posted by Cary Knoop
Not really, the floats are converted to 8 or 10 bit using a byte stream depending on the destination bit-depth during (or right after) decoding. Practically speaking there are no commercial monitors that can display over 10-bit of accuracy, there is no need to actually render float values to the monitor.
It still requires memory and bandwidth in the decoder itself, even if the output is always normalized to 10-bit.
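To put rough numbers on that, here's a back-of-the-envelope sketch. It assumes float32 intermediates and a 3840x2160 4:2:0 frame; the figures are illustrative, not taken from any particular decoder.

Code:
# Rough comparison: float32 working buffers vs. 10-bit output buffers
# for one 3840x2160 4:2:0 frame. Illustrative numbers only.

WIDTH, HEIGHT = 3840, 2160
LUMA_SAMPLES = WIDTH * HEIGHT
CHROMA_SAMPLES = LUMA_SAMPLES // 2        # 4:2:0: two quarter-size chroma planes
SAMPLES_PER_FRAME = LUMA_SAMPLES + CHROMA_SAMPLES

float_bytes = SAMPLES_PER_FRAME * 4       # 32-bit float per sample
tenbit_bytes = SAMPLES_PER_FRAME * 2      # 10-bit stored in 16-bit words

print(f"float32 frame: {float_bytes / 2**20:.1f} MiB")   # ~47.5 MiB
print(f"10-bit frame:  {tenbit_bytes / 2**20:.1f} MiB")  # ~23.7 MiB

def quantize_to_10bit(value: float) -> int:
    """Map a normalized [0.0, 1.0] float sample to a 10-bit code (0..1023)."""
    value = min(max(value, 0.0), 1.0)
    return round(value * 1023)

print(quantize_to_10bit(0.5))             # -> 512

So even if the display never sees a float, the decoder carries roughly twice the buffer size (and the matching memory bandwidth) until that final quantization step.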

Quote:
I believe it should be possible when someone makes a documentary with mixed framerate footage not to worry about making the overall framerate unique but to allow multiple segments. There is absolutely no technical reason why modern monitors cannot switch framerates on the fly (or get a change signal a few frames ahead).
Variable frame rate monitors for gaming have been around for a few years, and we're seeing that technology make it into some televisions. But it only works when every device in the chain supports it, and I don't know of any media playback solution that has tried to leverage variable frame rate at all. For pretty much all consumer delivery, a master has to be conformed to a single frame rate before it can be used broadly.
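As a concrete example of what "conformed to a single frame rate" ends up meaning, here's a minimal sketch. The conform() helper is hypothetical and the numbers are my own toy case (a 24 fps segment delivered on a 30000/1001 timeline); real conform workflows are more sophisticated, but the frame-repeat pattern is the core of it.

Code:
from fractions import Fraction

# Toy conform: for each output frame time, pick the source frame that
# would be on screen at that instant (frame repetition, no interpolation).

def conform(src_fps: Fraction, dst_fps: Fraction, num_dst_frames: int):
    """Return, for each output frame, the index of the source frame to show."""
    mapping = []
    for n in range(num_dst_frames):
        t = Fraction(n, 1) / dst_fps          # timestamp of output frame n
        mapping.append(int(t * src_fps))      # source frame active at time t
    return mapping

# A 24 fps documentary segment on a 30000/1001 ("29.97") delivery timeline:
print(conform(Fraction(24), Fraction(30000, 1001), 10))
# -> [0, 0, 1, 2, 3, 4, 4, 5, 6, 7]  (repeated frames = cadence judder)

A VFR-aware display chain would skip that step entirely and present each segment at its native rate, which is essentially what the gaming-oriented variable refresh tech already does.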

It's a fine idea, and I could see something like the PS5 or Xbox Series X being able to support it eventually. It might "just work" on a few specific high-end Windows gaming setups. But that'd be <0.1% of the installed base at best, today.

Sounds like a potential SMPTE spec.

Quote:
HDMI is an awful, protectionist, and very limited technology.
And startlingly undertested for performance. Most interoperability testing stops at "I saw and heard something," not "Did 4K HDR with Atmos work correctly without having to fiddle with menu settings?"
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book