24th June 2014, 21:20   #1018
benwaggoner
Moderator
 
Quote:
Originally Posted by foxyshadis
It doesn't matter what you put in or take out; the compatibility revolves entirely around the internal bit-depth. Whether the future brings wider 10-bit compatibility is entirely unknown; we just have to hope that, since it's included in the base spec, some hardware makers will take advantage of that. So far the major GPU makers (Intel, AMD, nVidia, PowerVR) are barely incorporating support for 8-bit HEVC.
We are seeing some TV players support internal 10-bit decode, like the latest Samsung UHD TVs: they can play back HEVC up to 2160p60 10-bit, but not H.264 High 10.
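As an aside, if you want to check whether a given file is actually HEVC Main 10 (rather than H.264 High 10) before trying it on one of those TVs, ffprobe will report the codec, profile, and pixel format. A quick sketch, assuming an ffmpeg/ffprobe install and a placeholder filename:

Code:
ffprobe -v error -select_streams v:0 \
    -show_entries stream=codec_name,profile,pix_fmt \
    -of default=noprint_wrappers=1 input.mp4

# A Main 10 HEVC stream reports something like:
#   codec_name=hevc
#   profile=Main 10
#   pix_fmt=yuv420p10le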

Quote:
Using 10-bit internal with 8-bit input doesn't seem to have the same advantage over plain 8-bit in x265 as with x264. (And even that is fairly small.) I'm not sure if that's the encoder, or the standard, but we'll have to see how it evolves. Maybe HEVC just doesn't cause as much banding as AVC in general at 8-bit?
It's by spec; HEVC handles 8-bit content better than H.264 did (less banding in particular), so there's no real reason to encode 8-bit sources in Main 10.
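If anyone wants to test that for themselves, here's roughly how you'd encode the same 8-bit source at 8-bit and at 10-bit internal depth through ffmpeg's libx265 wrapper. This is just a sketch: it assumes an ffmpeg build that includes the 10-bit x265 encoder, and the filenames and CRF value are placeholders.

Code:
# 8-bit internal (Main profile)
ffmpeg -i source.mp4 -c:v libx265 -pix_fmt yuv420p -preset medium -crf 22 main_8bit.mp4

# 10-bit internal from the same 8-bit source (Main 10 profile)
ffmpeg -i source.mp4 -c:v libx265 -pix_fmt yuv420p10le -preset medium -crf 22 main10_10bit.mp4

Comparing the two on gradient-heavy material at matched rates is the quickest way to see how much (or how little) the 10-bit internal path buys you in HEVC versus what it buys in x264.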
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book