Yellow_ - 24th August 2011, 22:54 | #266
cretindesalpes, excellent update. :-)

I have a bit of a query regarding spiked luma histograms when using the Dither_y_gamma_to_linear function after conversion to RGB. :-)

I have two test videos: one is H.264 AVC from a Canon HDSLR, the other from an HDV camera. Both sources are converted with Dither 1.10 and piped through avs2yuv to an ImageMagick Q16 HDRI build, writing 16-bit TIFFs and EXRs.
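
The pipe itself is nothing exotic; per frame it boils down to something like this (treat it as a sketch: the exact switches depend on your avs2yuv and ImageMagick builds, and the -size geometry has to match how the conveyed RGB48 data is packed):

Quote:
avs2yuv -raw script.avs - | convert -size 1920x1080 -depth 16 rgb:- out.tif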

Using the "709" curve with the H.264 AVC source (BT.601 colour matrix, full-range luma, BT.709 transfer curve) gives me a spiked luma histogram; using the "srgb" curve doesn't.

But with the HDV source (BT.709 colour matrix, full-range luma, BT.709 transfer curve), the "srgb" and "709" curves both give a smooth luma histogram and the definite variation in 'exposure' that I anticipated.
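
For anyone comparing the two, the standard inverse transfer functions (the textbook formulas from the specs, not a quote of Dither's source) differ mainly near black:

Quote:
sRGB:   L = V / 12.92                            for V <= 0.04045
        L = ((V + 0.055) / 1.055) ^ 2.4          otherwise

BT.709: L = V / 4.5                              for V < 0.081
        L = ((V + 0.099) / 1.099) ^ (1 / 0.45)   otherwise

So the same code values land on noticeably different linear values in the shadows, which is where I'd expect any histogram difference between the two curves to show up.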

The basis of both scripts is:

Quote:
# promote the 8-bit YV12 source to 16 bits (stacked lsb/msb)
Dither_convert_8_to_16 ()

# YUV -> RGB at 16 bits; output="rgb48y" delivers the R, G and B planes
# as three successive grey (Y) frames
Dither_convert_yuv_to_rgb (matrix="709", tv_range=false, cplace="MPEG2", chromak="bicubic", lsb_in=true, output="rgb48y")

# undo the transfer curve on each plane, keeping full range in and out
Dither_y_gamma_to_linear (tv_range_in=false, tv_range_out=false, curve="srgb")

# repack the three planes into a fake YV12 clip for avs2yuv
Dither_convey_rgb48_on_yv12 (SelectEvery (3, 0), SelectEvery (3, 1), SelectEvery (3, 2))
Obviously I'm changing matrix="" to suit the colour matrix of each source, and adjusting the pixel dimensions to avoid any scaling, i.e. 1920x1088 for the HDSLR clip and 1440x1080 for the HDV.
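
For the Canon clip that just means swapping the matrix in the conversion line, i.e.:

Quote:
Dither_convert_yuv_to_rgb (matrix="601", tv_range=false, cplace="MPEG2", chromak="bicubic", lsb_in=true, output="rgb48y")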

What could be causing the spiked luma with the H.264 source? I thought the problem lay with ImageMagick's colorspace handling, but that appears to be OK, considering the HDV source works fine.

Thanks again for a great update.
