24th June 2014, 02:04 | #1004
foxyshadis
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
Quote:
Originally Posted by upyzl
@benwaggoner & @foxyshadis

well, to be specific

1) At the same medium bitrate (medium meaning, e.g., x264 CRF 22), x264 10-bit does much better than x264 8-bit at preventing banding (especially in dark, flat areas, because of gamma compression) when the source has no banding -- the benefit of the higher internal bit depth, which also helps against other artifacts -- and the 10-bit path in x264 is now well enough optimized that, at the same encoding time and bitrate, it still delivers better quality than 8-bit.

2) The x265 8bpp build also uses an 8-bit internal depth, so I should use an x265 16bpp build for a higher internal bit depth -- x265 works like x264 in this regard.

3) So far, 10-bit H.264/AVC has poor compatibility. For example, there is no hardware-accelerated decoding for 10-bit video; playing it on hardware devices like mobiles or the PS3 (unlike a PC, which can fall back to generic x86 software decoding and mostly ignore decode performance and power consumption) is difficult and unfriendly; and video editing seems to be in the same situation (e.g. Adobe Premiere does not support 10-bit H.264). I'm very worried the HEVC/H.265 era will be the same...

4) And... with 8-bit input and a high-bit internal depth, if I use 8-bit output rather than 10-bit output, should the file be smaller at the same quality? (I'm not an expert on this.)

----
So I'm interested in 8-bit in/output with a high-bit internal depth, especially for encoding.
Chasing the lowest bitrate for the same high quality is the eternal topic in video compression, and I do, but I also care about some degree of compatibility (and encoding time)...
It doesn't matter what you put in or take out; compatibility revolves entirely around the internal bit depth. Whether the future brings wider 10-bit compatibility is entirely unknown; we just have to hope that, since it's included in the base spec, some hardware makers will take advantage of it. So far the major GPU makers (Intel, AMD, nVidia, PowerVR) are barely even incorporating support for 8-bit HEVC.

Using a 10-bit internal depth with 8-bit input doesn't seem to give x265 the same advantage over plain 8-bit that it gives x264. (And even there the gain is fairly small.) I'm not sure whether that's down to the encoder or the standard; we'll have to see how it evolves. Maybe HEVC just doesn't produce as much banding as AVC at 8-bit in general?
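
If you want to run that comparison yourself, here's a rough Python sketch of the four encodes involved. Assumptions on my part: "in.y4m" is a hypothetical 8-bit 4:2:0 source, and the x264/x265 binaries are 10-bit-capable builds (older x264 shipped 10-bit as a separately compiled binary; newer x264/x265 builds select the internal depth with --output-depth).

Code:
import subprocess

# Rough sketch of the comparison, not a tuning recommendation.
# "in.y4m" is a placeholder 8-bit 4:2:0 source.
SRC = "in.y4m"

runs = [
    ["x264", "--crf", "22", "-o", "x264_8bit.264", SRC],
    ["x264", "--crf", "22", "--output-depth", "10", "-o", "x264_10bit.264", SRC],
    ["x265", "--input", SRC, "--crf", "22", "--output", "x265_8bit.hevc"],
    ["x265", "--input", SRC, "--crf", "22", "--output-depth", "10", "--output", "x265_10bit.hevc"],
]

for cmd in runs:
    subprocess.run(cmd, check=True)

# Then compare the 8-bit vs 10-bit pairs for banding in dark flats at
# (roughly) matched bitrates.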

With respect to size, it doesn't matter what you output; it's still the same file (unless you're re-encoding). Internally, every calculation is done at the internal bit depth until the final output, which can then be left alone or downsampled. Even with 8-bit input, Main 10 with 16-bit output instead of dithered 8-bit might look better simply because it doesn't round as early. (No decoder currently produces float output, although they could if they wanted to.) I don't know if anyone's really tested that, you'd need a decent monitor to tell the difference, and right now I don't have one. It's an interesting area to investigate.
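
To make the "rounding early" point concrete, here's a little numpy toy (illustrative only, nothing to do with any particular decoder): a smooth, dark 10-bit ramp taken down to 8 bits by plain rounding versus with 1 LSB of dither added before the same rounding. The rounded version collapses onto a staircase of 8-bit codes; the dithered one, averaged over a patch, still tracks the original ramp.

Code:
import numpy as np

# Illustrative only: a nearly flat, dark 10-bit luma ramp taken to 8 bits two ways.
rng = np.random.default_rng(0)
ramp10 = np.tile(np.linspace(64.0, 96.0, 1920), (256, 1))

# (1) Round early: straight 10-bit -> 8-bit conversion.
rounded8 = np.round(ramp10 / 4.0)

# (2) Add 1 LSB of dither before the same rounding.
dithered8 = np.round(ramp10 / 4.0 + rng.uniform(-0.5, 0.5, ramp10.shape))

# Average each column (a crude stand-in for the eye blending a patch):
# the dithered version still follows the original ramp, while the rounded
# version is stuck on an 8-bit staircase.
target = ramp10[0] / 4.0
print("mean staircase error, rounded :", np.abs(rounded8.mean(axis=0) - target).mean())
print("mean staircase error, dithered:", np.abs(dithered8.mean(axis=0) - target).mean())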

It will definitely help if you're doing any shader processing on the output; MadVR will accept up to 16-bit input and won't ever drop precision until it outputs to the screen.
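
A quick way to see why, again just a numpy sketch with a made-up "shader" step (a 3x brightness lift, not anything MadVR actually does): feed it an already-8-bit dark ramp versus a full-precision one, and only drop to 8 bits at the very end.

Code:
import numpy as np

# Illustrative only: the same brightness lift applied after an early drop to
# 8 bits vs. at full precision with quantization saved for the last step.
ramp = np.linspace(16.0, 32.0, 1920) / 255.0   # dark, nearly flat gradient in [0, 1]

def lift(x):
    # Stand-in for a shader pass: crude 3x brightness boost.
    return np.clip(x * 3.0, 0.0, 1.0)

# (a) Quantize to 8 bits early, then process, then display at 8 bits.
early8 = np.round(ramp * 255.0) / 255.0
out_a = np.round(lift(early8) * 255.0)

# (b) Process at full precision, quantize once at the end.
out_b = np.round(lift(ramp) * 255.0)

# The early-8-bit path can only land on the handful of codes it had going in,
# so the stretched darks band; the full-precision path fills the output range.
print("distinct output codes, 8-bit early   :", np.unique(out_a).size)   # ~17
print("distinct output codes, full precision:", np.unique(out_b).size)   # ~49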

Last edited by foxyshadis; 24th June 2014 at 02:09.