14th August 2015, 17:04   #20
hello_hello
Quote:
Originally Posted by feisty2 View Post
always a pain in the ass when "color" stuff gets to be the problem.
converting the "matrix" alone is not really correct.
"matrix", "transfer" and "primaries" are a set of parameters that work together as one to represent a certain part of the colors of CIE 1931 (the very first color standard, covering every single color visible to human vision, and device independent of course; kind of the royalty of all color-related stuff).
primaries = exactly how red the red is (likewise green and blue); it fixes the coordinates on the CIE diagram of the 3 elementary colors (they cannot be represented as mixtures of other colors, they are the purest thing here, and they mix with each other to produce all the other colors).
transfer = a function that connects physical intensity to perceived intensity (aka gamma).
matrix = a matrix that separates luminance from chrominance.
so, see, HDTV got a different "matrix" not because "oops, the old bt.601 sucks, I don't like it, let's make some crazy new crap"; the "709" matrix is there because HDTV picks different primaries from SDTV, so a new matrix is needed to separate the luminance and chrominance of the image based on those new primaries.
it's pointless to apply the "709" matrix to clips with "601 NTSC/PAL" primaries, because you can't get correct luminance and chrominance with that matrix; the "luminance" is not the exact luminance, nor is the chrominance, so the "matrix" doesn't do what it's supposed to. the "601" matrix on "709" primaries, likewise.
if you are that desperate to change the color system, don't just convert the matrix; convert to CIE 1931 and apply the whole new color system from there.
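For reference, the two matrices differ only in their luma weights, which follow from the respective primaries. A minimal Python sketch (the Kr/Kb coefficients are the values from the BT.601 and BT.709 specs; the function is just the standard weighted sum, not anyone's actual implementation):

```python
# Luma coefficients from the BT.601 and BT.709 specifications.
# The green weight is implied: Kg = 1 - Kr - Kb.
KR_601, KB_601 = 0.299, 0.114
KR_709, KB_709 = 0.2126, 0.0722

def luma(r, g, b, kr, kb):
    """Y' = Kr*R' + Kg*G' + Kb*B' for gamma-corrected RGB in [0, 1]."""
    return kr * r + (1.0 - kr - kb) * g + kb * b

# The same pure-red pixel produces a different Y' under each matrix:
print(luma(1.0, 0.0, 0.0, KR_601, KB_601))  # 0.299
print(luma(1.0, 0.0, 0.0, KR_709, KB_709))  # 0.2126
```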
That all sounds nice in theory, but in practice, if I downscale HD video and convert it to bt.601, the SD version looks the same as the HD version to me, colour-wise. A question...
If I didn't convert the colours and simply flagged the video stream as bt.709, then on the occasions I'm using a media player that pays attention to the colorimetry info, in what way would the video display differently? The original required bt.709 when converting to RGB on playback, while the encode now requires bt.601 because I converted the colours.
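For what it's worth, the mismatch the flag is supposed to prevent is easy to sketch: encode a pixel with the 709 matrix, decode it with 601, and the colours shift. A hypothetical Python illustration using the textbook matrix equations (normalised full-range values, no quantisation or chroma subsampling):

```python
def rgb_to_ycbcr(r, g, b, kr, kb):
    # Y' is the weighted sum; Cb/Cr are scaled colour differences.
    y = kr * r + (1.0 - kr - kb) * g + kb * b
    cb = (b - y) / (2.0 * (1.0 - kb))
    cr = (r - y) / (2.0 * (1.0 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = (y - kr * r - kb * b) / (1.0 - kr - kb)
    return r, g, b

# Encode pure red with the BT.709 matrix, decode with BT.601:
y, cb, cr = rgb_to_ycbcr(1.0, 0.0, 0.0, kr=0.2126, kb=0.0722)
r, g, b = ycbcr_to_rgb(y, cb, cr, kr=0.299, kb=0.114)
print(round(r, 4), round(g, 4), round(b, 4))  # red is dimmed, green goes negative
```

So a player that honours the flag and picks the wrong matrix (or is told the wrong one) produces visibly shifted reds and greens, which is exactly why the flag should match the matrix actually used.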

Maybe I'm missing something but I thought bt.601 and bt.709 used exactly the same gamma, and even if the colorprim parameter wasn't generally ignored by players, why wouldn't it still be correct? If the source video displays as it should on a bt.709 calibrated display, shouldn't the downscaled version also display the same way?

The original video probably looks like this:

Color primaries : BT.709
Transfer characteristics : BT.709
Matrix coefficients : BT.709

After I convert the colours and downscale, I normally only tell x264 to write the matrix coefficients, but if I were more enthusiastic, or always knew the correct values, I'd get it to write all three, like this:

Color primaries : BT.709
Transfer characteristics : BT.709
Matrix coefficients : BT.601

Why wouldn't the encode display exactly like the source, given I've only changed the matrix coefficients to reflect the change in colorimetry?
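The reasoning above can be checked numerically: if the transfer function and primaries stay the same and the flags match the matrices actually used, both pipelines decode to identical RGB. A sketch under idealised assumptions (normalised values, exact arithmetic, no quantisation or chroma subsampling; the pixel value is arbitrary):

```python
def rgb_to_ycbcr(r, g, b, kr, kb):
    y = kr * r + (1.0 - kr - kb) * g + kb * b
    return y, (b - y) / (2.0 * (1.0 - kb)), (r - y) / (2.0 * (1.0 - kr))

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    return r, (y - kr * r - kb * b) / (1.0 - kr - kb), b

BT709 = (0.2126, 0.0722)  # (Kr, Kb)
BT601 = (0.299, 0.114)

pixel = (0.8, 0.4, 0.1)  # some gamma-corrected RGB pixel

# Source pipeline: encoded with 709, player decodes with 709 (flag honoured).
src = ycbcr_to_rgb(*rgb_to_ycbcr(*pixel, *BT709), *BT709)

# Encode pipeline: colours converted to 601, flagged 601, decoded with 601.
enc = ycbcr_to_rgb(*rgb_to_ycbcr(*pixel, *BT601), *BT601)

print(all(abs(a - b) < 1e-9 for a, b in zip(src, enc)))  # True
```

Both round trips invert exactly, so as long as the matrix flag matches the matrix used for the conversion, the displayed RGB is the same either way.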

Last edited by hello_hello; 14th August 2015 at 17:07.