19th July 2011, 23:14 | #214
Yellow_
Registered User | Joined: Sep 2009 | Posts: 378
I'm using Dither to go from 8-bit H.264/AVC to 16-bit OpenEXR image sequences, but those really should be linear rather than gamma-encoded; the compositing application I'm importing them into assumes linear data. Does the Dither_y_gamma_to_linear function serve this purpose?
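
To make clear what I mean by linearizing, here is a small numpy sketch of the operation I have in mind. It only illustrates the idea of undoing a BT.709-style transfer curve on a 16-bit luma plane; it is not how the Dither plugin actually implements it, and the sample values are made up:

Code:
import numpy as np

def bt709_to_linear(v):
    # Inverse BT.709 OETF; input and output are in the 0..1 range.
    v = np.asarray(v, dtype=np.float64)
    return np.where(v < 0.081, v / 4.5, ((v + 0.099) / 1.099) ** (1.0 / 0.45))

# Pretend this is a full-range luma plane already promoted to 16 bits.
y16 = np.array([0, 4096, 16384, 32768, 65535], dtype=np.uint16)
y_lin = bt709_to_linear(y16 / 65535.0)                  # linear light, 0..1
y16_lin = np.round(y_lin * 65535.0).astype(np.uint16)   # back to 16-bit codes
print(y16_lin)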

Is it only necessary to linearize the luma? Is it more accurate and technically correct to do it that way, before the conversion to RGB, rather than applying the usual 0.45 reverse gamma to all channels of the RGB data after conversion?
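
To show why I'm asking, here is a quick numerical comparison of the two orders of operation. Because the transfer curve is non-linear, linearizing Y' directly and linearizing the R'G'B' channels before forming the weighted sum give different results; the numbers are made up and this only demonstrates that the two differ, not which is correct:

Code:
import numpy as np

def bt709_to_linear(v):
    v = np.asarray(v, dtype=np.float64)
    return np.where(v < 0.081, v / 4.5, ((v + 0.099) / 1.099) ** (1.0 / 0.45))

# A gamma-encoded R'G'B' triple (0..1) and the BT.709 luma weights.
rgb_gamma = np.array([0.2, 0.5, 0.8])
weights   = np.array([0.2126, 0.7152, 0.0722])

y_gamma = weights @ rgb_gamma                             # Y' formed from R'G'B'
luma_then_linear = bt709_to_linear(y_gamma)               # linearize the luma
linear_then_luma = weights @ bt709_to_linear(rgb_gamma)   # linearize per channel first

print(luma_then_linear, linear_then_luma)                 # the two values differ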

Also, as the source was encoded with a BT.709 transfer curve rather than sRGB 2.2, to undo that, i.e. to linearize the YCbCr, should I be assuming something like the reciprocal of 2.35? If I understand correctly, this helps prevent the compression of shadow detail that can occur when applying 0.45 to a BT.709 source?
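
For reference, here is a quick comparison of the exact BT.709 inverse (which has a 4.5x linear segment near black) against straight power-law exponents, on a few dark code values. It's purely illustrative, but it shows how much darker the pure power curves push the shadows compared with the piecewise inverse:

Code:
import numpy as np

def bt709_to_linear(v):
    v = np.asarray(v, dtype=np.float64)
    return np.where(v < 0.081, v / 4.5, ((v + 0.099) / 1.099) ** (1.0 / 0.45))

codes = np.array([0.01, 0.02, 0.05, 0.081, 0.2])  # dark gamma-encoded values
exact    = bt709_to_linear(codes)                 # piecewise BT.709 inverse
pow_0_45 = codes ** (1.0 / 0.45)                  # straight 1/0.45 (~2.22) power
pow_2_35 = codes ** 2.35                          # the 2.35 exponent mentioned above

for c, e, p, q in zip(codes, exact, pow_0_45, pow_2_35):
    print(f"V'={c:.3f}  bt709={e:.5f}  pow2.22={p:.5f}  pow2.35={q:.5f}")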