Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 3rd May 2011, 12:59   #1  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Color conversion, HD to SD

Edit: For anyone new to this thread.... you won't miss much by skipping to post #10 and you'll probably avoid a lot of my confusion. I was trying to understand why some SD encodes taken from an HD source (1080p or 720p) needed color conversion and why some didn't seem to need it. By post #10 I think I'd worked out where I was going wrong, what the rules are for Windows when displaying video, why it displays some HD encodes using the wrong colors, and why SD encodes taken from them appeared not to need color converting when they probably do.

----------------------------------------------------------------------------------------------------------------------

This might end up being a lengthy post, so thank you to anyone who takes the time to read it and help me understand the colors used when displaying/encoding video.

I'm trying to understand when to use color conversion while encoding standard definition files from a HD source... or more importantly why it's not always needed... or at least why it appears to not always be needed as the video displays on my PC. I started trying to work this out while re-encoding some BluRay rips to SD XviD/AVIs for others in the house to watch (easier to do it that way than re-rip the disc). I'm using AutoGK for the conversion which decodes using ffdshow.

I was under the impression that when displaying video on a PC (using MPC-HC and the VMR9 renderer in my case) SD video displays using R.601 colors while HD displays using R.709. However, due to an inconsistent need to use color conversion in order for the SD encode to display using the same colors as the HD source, I assume this must not be a hard and fast rule. So then I did some experimenting using ffdshow to decode the video.....

SD video, either XviD or h264:
I open a SD video with MPC-HC and maximise the window. I then enable ffdshow's resizer and set it to resize the video up to a width of 1280 (HD). Enabling and disabling the resizer doesn't change the size at which the video is being displayed due to the window being maximised, but it always changes the colors, apparently assuming R.601 when there's no resizing up and R.709 when there is. Okay, at least that appears to be consistent.

HD video, XviD:
I only have a few HD XviD encodes but opening those seems to indicate the colors don't change when I resize down. Using the same method as above (maximised window) I can resize down to a width of 720 without it affecting the colors. From this I assume those HD videos are already displaying using R.601, which is why resizing down doesn't affect the color.

HD video, h264:
Once again I open a video, maximise the window and use ffdshow to resize down to SD. This affects the colors when playing some videos but not others (seems to be about 50/50). When it does affect the color, skin tones look redder (darker), which I assume means it's being displayed using R.601 after resizing when it should still be R.709?
I've added a basic R.709 to R.601 AVISynth script to ffdshow which fixes the color when resizing down, or without the script using the MPC R.601 to R.709 shader also fixes the color after resizing down.
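The "redder/darker" shift is exactly what you'd expect if Rec.709-encoded YCbCr gets converted to RGB with the Rec.601 matrix. A minimal sketch of the math (full-range values for simplicity; real video is limited-range, which changes the numbers but not the direction of the shift):

```python
# Demonstrate Rec.709-encoded YCbCr wrongly decoded with the Rec.601
# matrix (full-range, floating point for clarity).

def rgb_to_ycbcr(r, g, b, kr, kb):
    # Generic YCbCr forward transform derived from the luma coefficients.
    y = kr * r + (1 - kr - kb) * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

REC601 = (0.299, 0.114)    # kr, kb
REC709 = (0.2126, 0.0722)

# A saturated reddish tone, encoded with the Rec.709 matrix...
y, cb, cr = rgb_to_ycbcr(200, 50, 50, *REC709)
# ...but decoded with the Rec.601 matrix, as a renderer might after downscaling:
r, g, b = ycbcr_to_rgb(y, cb, cr, *REC601)
print(round(r), round(g), round(b))  # -> 187 34 51
```

The green channel is hit hardest (the two matrices weight green very differently), so the red drops slightly but green drops much more: the tone comes out darker and more saturated toward red, matching the skin-tone effect described above.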

So all of the above brings me back to wondering why I need to color convert some HD video when resizing down but not others. I described the above differences in detail because they exactly match my experiences when encoding to SD. If an HD video changes colors when I resize it while playing it in MPC-HC, then it'll need color converting when I convert it to SD with AutoGK. If it doesn't change colors when resized during playback in MPC-HC, then it won't need color converting when encoding. I thought at first it was ffdshow making the color decisions when resizing the video, but I don't use ffdshow to resize when encoding to SD with AutoGK, as AutoGK uses AVISynth to do the resizing. Either way though, the result is the same.

I assume there must be something in at least some of the HD video streams which tells the decoder or the player or the renderer which colors to use, or there's a rule which they follow I'm completely missing, but either way I'm hoping someone can explain to me how it all works. I've used different x264 encoder GUIs over the years so maybe that's part of the reason for things not being consistent?

Does the x264 encoder know which colors are used when encoding HD streams, does BluRay always use R.709, and will the encodes always display using the same colors as the video on the disc, or is it possible for an x264 encode of a BluRay to display using different colors than the original? So far I've only checked a few discs against the encodes; they all display the same on the PC, and they all happen to be encodes which needed color correcting when converting to SD.

One last question....
I've read that XviD "expects" R.601 but I'm not sure what that means exactly, aside from guessing any device decoding XviD video would assume R.601 regardless of the definition, and therefore all XviD encodes should be converted to R.601 if necessary.
What about x264? Does it "expect" R.709 or does it assume R.601 for SD etc? I can see how a playback device might "expect" certain colors under certain circumstances but I'm not sure why the encoder itself would need to "expect" anything.

Thanks for any input.....

Last edited by yetanotherid; 13th May 2011 at 09:44.
Old 3rd May 2011, 21:09   #2  |  Link
nurbs
Registered User
 
Join Date: Dec 2005
Posts: 1,460
First off the following is probably incomplete since I don't play around with filters a lot, but I hope it helps.

Some software uses 601 or 709 depending on the input resolution. I'm not sure at what resolution it changes and if the change is at different points for different software, but I believe MPC-HC for example uses 601 for DVD resolution and below and 709 for anything above that.
For H.264 you can signal the correct conversion in the bitstream; see the Blu-ray encoding sticky. Some players require this and will use 601 when nothing else is signaled, regardless of resolution.

The encoder doesn't care if it's 601 or 709. Since you normally don't convert the source YV12 to another colorspace before encoding, it isn't really used.
If you don't resize from HD to SD or the other way round, and you don't have a colorspace conversion during the processing, you are fine as long as you set the bitstream flags correctly. If you do any of these conversions, I'd also convert to the appropriate color coefficients. You can do that with ColorMatrix() in avisynth.
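What a 709-to-601 conversion like ColorMatrix does can be sketched as chaining two transforms: 709 YCbCr back to RGB, then RGB forward to 601 YCbCr; the two 3x3 matrices collapse into a single YUV-domain matrix. A rough full-range numpy illustration (the actual plugin works on limited-range video and differs in the details):

```python
import numpy as np

def ycbcr_matrix(kr, kb):
    # RGB -> YCbCr 3x3 matrix derived from the luma coefficients kr/kb.
    kg = 1 - kr - kb
    return np.array([
        [kr,                 kg,                 kb               ],
        [-kr / (2*(1-kb)),  -kg / (2*(1-kb)),    0.5              ],
        [0.5,               -kg / (2*(1-kr)),   -kb / (2*(1-kr))  ],
    ])

M601 = ycbcr_matrix(0.299, 0.114)
M709 = ycbcr_matrix(0.2126, 0.0722)

# 709 -> RGB -> 601 collapses into one YUV-domain matrix, which is
# essentially the operation a coefficient-conversion filter applies:
M_709_to_601 = M601 @ np.linalg.inv(M709)

ycc_709 = np.array([81.89, -17.19, 75.0])   # some pixel with 709 coefficients
ycc_601 = M_709_to_601 @ ycc_709            # same color, 601 coefficients
print(ycc_601)
```

Because the whole thing stays in the YUV domain, no RGB round-trip (and no extra quantisation) is needed, which is one reason this kind of filter is cheap.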
Old 4th May 2011, 02:05   #3  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Quote:
Originally Posted by nurbs View Post
Some software uses 601 or 709 depending on the input resolution. I'm not sure at what resolution it changes and if the change is at different points for different software, but I believe MPC-HC for example uses 601 for DVD resolution and below and 709 for anything above that.
That's what I assumed originally myself, but if I preview an HD to SD encode using AutoGK and MPC-HC, and it displays using the same colors as the original HD video does when playing in MPC-HC (and I've compared lots of them), then MPC-HC must be using the same colors to display both the HD and SD versions. Either that, or the colors being used are decided somewhere else in the chain, such as by the renderer (I haven't actually compared different renderers yet) or by the decoder. However it happens.... the end result is MPC-HC displays both the SD and HD versions of the video using the same colors.

As I said though, this doesn't happen all of the time. At least 50% of the time color conversion is required when going from HD to SD, so in that case the color does change according to the resolution. The mystery I'm trying to solve is why it happens when resizing some HD files and not others. And I guess even more importantly, if there's a way to know whether they're actually displaying using the correct colors. If they're not, I could be color converting or not color converting, inappropriately.

Quote:
Originally Posted by nurbs View Post
For H.264 you can signal the correct conversion in the bitstream, see the Blu Ray encoding sticky. Some players require this and will use 601 when nothing else is signaled regardless of resolution.
Are you referring to these parameters?
--colorprim
--transfer
--colormatrix
If so my understanding is they're optional and as a general rule ignored by almost all playback devices. Is that correct?
MPC-HC seems to display those parameters for the video's properties if they're present, but also seems to ignore them.
In my case, none of the videos I've been experimenting with have had those parameters included, so they're not what's affecting the need (or not) for color converting.

Quote:
Originally Posted by nurbs View Post
The encoder doesn't care if it's 601 or 709. Since you normally don't convert the source YV12 to another colorspace before encoding it isn't really used.
If you don't resize from HD to SD or the other way round and if you don't have a colorspace conversion during the processing you are fine as long as you set the bitstream flags correctly. If you do any of these conversions I'd also convert to the appropriate color coefficients. You can do that with colormatrix() in avisynth.
Bitstream flags?? I understand mpeg2, for example, can have the colors included in the bitstream (R.601 or R.709 etc) which allows colormatrix to do its thing, but does the same apply to BluRay video or x264 encodes? I assume the bitstream flags you refer to are something different to the x264 command line parameters I referred to earlier? I'm just trying to understand the distinction, if there is one.
Plus if I do set the bitstream flags, wouldn't I have to know which colors the original video actually uses?

At the moment I'm just working on what looks right when I encode. If the SD encode displays using the same colors as the original I don't color convert; if it doesn't, I do. Although I have found a couple of HD encodes which don't need color converting when resizing down but do appear to display using the wrong colors themselves.... like they should have been color converted when originally encoded as HD files from the disc. I haven't got around to comparing them to the original discs yet to see if that's the case.

So thanks for the reply but as you can probably tell I'm still as confused as ever, because the need for color converting doesn't seem to consistently apply according to the source's definition as the general consensus says it should. Some HD to SD encodes follow the rule, others don't, and from the few videos I've tested HD XviD encodes always display using the same colors even when resized to SD.

Maybe I should try putting some of what I want to know into questions....

BluRay video can be in different formats such as mpeg2 or h264 etc. Does it always use R.709 or can it use R.601 too, and if so, is there any way to determine which colors the original video uses?
When encoding BluRay video (I use MeGUI) does the x264 encoder have a way of determining the colors used and does it write that information to the video stream when encoding or are the colors used left up to the playback device?

There must be a reason why many HD encodes don't require color converting when re-encoding to SD while others do. Logic tells me that somewhere in the decoding process, for some HD video/encodes at least, there must be something telling the decoder which colors to use, regardless of the resolution. If there wasn't, the colors used would be constantly different according to the resolution, but they're not.

Mind you, for the purpose of my HD to SD encodes, which are just for others in the house to watch, the correct colors are far from critical. But now I've started thinking about the whole color issue, I'd also like to know whether all of my original 720p encodes are displaying correctly, or whether there are also instances when BluRay HD encodes should be color converted, as is apparently the case when encoding some DVDs (although I'm not sure I've seen a DVD which wasn't R.601).

Thanks.
Old 5th May 2011, 22:36   #4  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Here's a practical example:
I have a very old 720p MP4, encoded using Handbrake. I'm positive it's displaying using the wrong colors (I compared it to an original DVD to confirm, as I don't have access to the original HD version right at the moment). Reds are darker than they should be, therefore I assume the original video is R.709 but for some reason it's playing using R.601, the same as would happen if I converted an R.709 DVD to AVI without color correction.
Even if the original MP4 was correct however it doesn't change the results of my experimental encodes (I just used that particular MP4 because I'm sure it's displaying wrong and I wanted to see what difference re-encoding might make).

When I re-encode the 720p MP4 as a 720p MKV without color correction it displays the same as the original MP4 (wrong). Therefore I assume it must still be using R.601 too.
When I re-encode the 720p MP4 as a 720p MKV with color correction (Rec.709->Rec.601) it displays using the correct colors. Therefore I assume it's still R.601 also, but the color conversion on the way through fixed it.

The above seems to be confirmed by converting each 720p encode to SD AVI. In all cases, without color correction, the AVI encodes play using the same colors as the source. If the source is wrong, so is the AVI. If the source is correct, the AVI is correct.

I just can't see how the colors used for each AVI encode can be the same as those used for each source if the only rule regarding color is based on definition. If an XviD AVI always uses R.601, and if a 720p encode always uses R.709, then each of the AVI encodes would display using different colors to their source, but none of them do. In each case it appears all the 720p encodes must be displaying using R.601 and therefore no color correction is needed on the way through.

Anyone with any idea as to why this happens? According to all the rules I've read the 720p versions should be displaying using R.709 (even if it happens to be incorrect) and therefore each AVI encode should display using different colors to their source.

Surely someone knows what's going on?
Old 6th May 2011, 08:37   #5  |  Link
Ghitulescu
Registered User
 
Ghitulescu's Avatar
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
And you're still of the opinion that playing media on a PC is superior to the same thing on a standalone?
Well, coming back to the 709/601 issue: yes, most codecs and players assume 601 for SD resolutions and 709 for HD (though there are apparently some exceptions). Yet it doesn't always work out that way, as codecs that gained fame as SD encoders (like xvid) would use 601 even for HD resolutions. The same seems to be true for other SD codecs with "HD extensions", like DVCPro HD.
__________________
Born in the USB (not USA)
Old 6th May 2011, 21:42   #6  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Well, I'm of the opinion a PC is a far more convenient player than a standalone, and the rules for encoding and color converting (via the colormatrix plugin) are well established for encoding AVIs from DVDs. It's pretty hard to go wrong. Once I work out the rules for HD conversions (because the accepted ones don't seem to apply), using a PC shouldn't be a problem. The main issue is I didn't expect to have to check HD encodes for correct colors, so I haven't been doing so. Now I've realised it's not as simple as "everything uses R.709" (or at least it appears not to be), I know to check the output before encoding and color correct if necessary.

I still don't understand the idea that "XviD always uses R.601" etc., because I still don't fully understand how the encoder knows what colors it's encoding. Wouldn't it be the case that it's the decoder which decides which colors to use when decoding, and not the encoder? Even the "XviD always uses R.601 regardless of definition" theory doesn't hold true. If I resize an SD XviD AVI to HD dimensions with ffdshow, the colors change every time. Likewise when resizing up an x264 SD encode. Resizing down from HD to SD, though, isn't giving predictable results. An HD XviD encode doesn't change colors when resizing down (or at least I haven't found one which does), and some x264 encodes change colors when resizing down while others don't.

Here's an example I found today. Maybe someone can explain to me what's going on, because I'm obviously missing something. I've resized the screenshots to make them all the same size for posting.

This is a screenshot taken by MPC-HC playing a BluRay disc. I'm pretty sure the colors are correct here:



Next is the same screenshot taken from an XviD encode of the DVD disc. No color correction was used. It displays using the same colors as the BluRay disc (and the picture aspect ratio is the same.... lucky I never use ITU resizing).



This is a 720p encode of the BluRay disc. Encoded using MeGUI. Note the color of the ocean has changed. The encode is displaying using the wrong colors, or at least displaying using different colors than the disc.



Here is an XviD encode of the 720p encode. No color correction was used. Now according to the rules as I understand them, it should display using different colors than the 720p encode (which would in fact make it two color conversions away from displaying the same as the original BluRay disc). However it doesn't. It's displaying using the same colors as the (incorrect) 720p source.



This is a screenshot of the same 720p encode as above, but I used ffdshow to resize it up to 1080p. Guess what, it now displays using the correct colors???? That's one fun discovery I made tonight. HD isn't just HD as far as colors go.



So from all the above I know for sure my PC displays, for reasons I don't yet understand, a 1080p encode of the above BluRay disc using the correct colors, and a 720p encode using the wrong colors. The question which I've yet to answer is would a standalone device play the two encodes using different colors as well or is my PC getting it wrong? I'll have to do some experimenting there at a later date. Help!!

PS I just checked, and VLC plays the 720p encode using the wrong colors too. As do both the ffdshow h264 decoders. Decoding using CoreAVC also has MPC-HC still displaying the wrong colors.

Last edited by yetanotherid; 6th May 2011 at 22:20.
Old 6th May 2011, 22:30   #7  |  Link
Wilbert
Moderator
 
Join Date: Nov 2001
Location: Netherlands
Posts: 6,364
Quote:
I still don't understand the idea "XviD always uses R.601" etc. because I still don't understand fully how the encoder knows what colors it's encoding.
He means upon feeding RGB to the XviD encoder. Unless you can specify it in an encoder, the encoder doesn't know what color coefficients to use and will assume something. The XviD encoder always assumes Rec.601.

Quote:
Wouldn't it be the case that it's the decoder which decides which colors to use when decoding, and not the encoder?
That depends on whether the decoder converts it to RGB or that it lets the renderer do that.

The following link might be interesting for you:

http://avisynth.org/mediawiki/Colorimetry
Old 7th May 2011, 07:29   #8  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Thanks for the info.
Okay so far at least, for the purposes of my SD encoding, I'm not feeding the encoder RGB so the "XviD encoder assuming Rec.601" stuff isn't relevant.

Seems if the video is mpeg2 then the colorimetry information can be obtained from the video header if it's present, but what about AVC? Can it contain the same colorimetry information and is there a way to retrieve it?

From the page you linked to:
- Windowed/renderless VMR7 and VMR9 use BT.601 for video < 720p (720 vertical lines)
- Windowed/renderless VMR7 and VMR9 use BT.709 for video >= 720p (720 vertical lines)
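The quoted rule boils down to nothing more than a threshold on the decoded height. A trivial sketch, with the cutoff left as a parameter since the behaviour observed in this thread (~674 lines) doesn't quite match the documented 720:

```python
def vmr_guess_matrix(height, cutoff=720):
    """Guess which coefficients a windowed/renderless VMR7/VMR9
    renderer will pick, per the avisynth wiki rule quoted above.
    The cutoff is a parameter because the observed switch point
    (~674 lines in this thread) differs from the documented 720."""
    return "BT.709" if height >= cutoff else "BT.601"

print(vmr_guess_matrix(544))             # 1280x544 crop -> BT.601 (often wrong for HD)
print(vmr_guess_matrix(688, cutoff=674)) # with the observed cutoff -> BT.709
```

Note the guess is made purely from the frame dimensions: nothing about the source's actual colorimetry enters into it, which is why cropped 720p encodes fall on the wrong side of the line.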


I'm using the VMR9, so I guess now that I rethink it (it didn't really sink in the previous 50 times I read it), my example above does follow those rules. If I encode to 720p with the black bars cropped, there's actually only 544 vertical lines, hence the 720p encode of the original disc displaying using different (wrong) colors than the 1080p encode, which has over 800 vertical lines.
So I guess in theory, every 720p encode which has black bars cropped must therefore display using the wrong colors if the source is Rec.709. If it's R.601 then it'll display correctly when encoded as 720p and incorrectly when encoded as 1080p. Wonderful!! I'll check some more encodes against the original BR discs to see if the rule always holds.
What about a standalone player? Does anyone know if it'd play my 720p encode using the wrong colors as per a PC or would it assume 544 vertical lines is HD and use R.709?

I've still got one remaining thing to understand.
I take a 720p encode (black bars cropped) with 544 vertical lines and convert it to SD AVI. If the above rule holds true then the 720p source must play using R.601 (whether it's correct or not). My AVI should also display using R.601 so no color correction should be necessary to keep it displaying using the same colors as the 720p source. As I keep saying though, this is only true some of the time. Some of the time I do need to color convert so the 720p source must be displaying using R.709 despite only having 544 vertical lines. WHY???

When it comes to stuff that's actually got 720 vertical lines the rule does seem to be consistent. I always have to convert the colors when resizing to SD, although I'll check some more encodes to be sure.

As a side note, I'm fairly certain the cut-off point between R.601 and R.709 isn't 720 vertical lines. I did a bit of experimenting with resizing one 720p video and the cut-off point, where the colors change while resizing down, was actually 674 vertical lines.
Old 12th May 2011, 16:42   #9  |  Link
SeeMoreDigital
Life's clearer in 4K UHD
 
SeeMoreDigital's Avatar
 
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,227
Hmmm...

As mentioned before on the forum, nowadays quite a few high-def Blu-ray disc releases also come packaged with a std-def DVD version of the same movie, which makes colour comparisons quite simple...

Given this to be the case, I'm able to play the DVD disc in one of my stand-alone players and the Blu-ray disc in another one of my stand-alone players and view both images at the same time via split screen on the same display.... They look the same colour to me!
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
Old 13th May 2011, 08:42   #10  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Okay, so I've done some more experimenting and realised where I was going wrong, some of it being my mistakes.

Firstly though, I'm 100% sure the accepted cut-off point regarding the colors used by Windows when displaying video is incorrect. I'm 100% sure it's not 720p or 720 vertical lines. It's actually around 674 vertical lines, so a video with dimensions of 1280x688, for example, will display using R.709 (correctly), not R.601. As this wasn't what I was expecting, it led to some of my confusion. When posting earlier, rather than confuse the issue with multiple dimensions, I think I always referred to HD video under 720 vertical lines as 1280x544 and stated it didn't display consistently. I guess I thought whether it was 1280x688 or 1280x676 or 1280x544 it should all display the same way on a PC (partly because of the accepted 720 vertical lines rule), so I lumped all those dimensions together in my head.... turns out that's not the case.

When it comes to HD video of around 674 vertical lines or less, and my theory of it not following a consistent rule, I'm fairly sure now I was wrong. I re-checked quite a few HD to SD encodes and it turns out the majority of the HD sources which required color correcting when converting to SD had over 674 vertical lines. For the ones which didn't (1280x544 etc), color correction hadn't in fact been needed for the SD encodes taken from them to display using the same colors (I guess when I checked those ones I must have got it wrong). A 1280x544 source and an SD encode taken from it without color correction should always display the same.... probably both with the wrong colors.

So after all that at least I now know how the video is being displayed on my PC and even if it's not perfect, at least it's predictable.

--Any video with more than 674 vertical lines will display using R.709, which will most likely be correct.
--Any video with 674 or less vertical lines will display using R.601. If it's a HD encode then this will most likely be incorrect. So for 1280x544 encodes etc, it's necessary to enable the MPC-HC "BT.601 to BT.709" shader to display the video correctly. No big deal, it's just a pity it needs to be done manually. Or maybe it is better to encode 720p video without cropping the black bars to ensure 720 vertical lines and the correct colors on playback?
--Any SD encodes will of course be R.601 and should therefore display correctly. If they come from a HD source then color conversion should always be used. At least now I know why that color conversion causes a SD encode to display differently to a 1280x544 encode taken from the same 1080p source.

I guess it's a pity Windows can't be a little smarter about color choice. ffdshow's RGB color conversion, for example, is based on video width as well as height (anything with a width over 1024 or a height over 600 uses R.709, which means it'd be far more likely to make the correct color choice).
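ffdshow's rule, as described above, keys on both dimensions, which a quick sketch makes concrete (treating the thresholds as the post states them; they're worth verifying against the ffdshow source):

```python
def ffdshow_guess_matrix(width, height):
    # Rec.709 if width > 1024 or height > 600, else Rec.601,
    # per the heuristic described in the post above.
    return "BT.709" if (width > 1024 or height > 600) else "BT.601"

# A 1280x544 crop: a pure height rule with a high cutoff picks 601,
# but ffdshow's width test catches it and picks 709:
print(ffdshow_guess_matrix(1280, 544))  # BT.709
print(ffdshow_guess_matrix(720, 576))   # BT.601 (PAL DVD)
```

Testing width as well as height is what saves cropped-to-scope HD encodes here: their height can fall well below any sane HD cutoff while the width still gives them away.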

Quote:
Originally Posted by Ghitulescu View Post
Well, coming back to the 709/601 issue, yes, most codecs and players assume 601 for SD resolutions and 709 for HD.
Which leaves me with the same question for standalone devices as I had when using a PC. Now I know a PC's definition of SD and HD isn't the same as mine, but at least I know there's consistency: when the PC will be wrong, and how to easily fix it. But what does a standalone device use as its definition of SD and HD (in reference to MKV-capable BluRay players)?
I'd assume an anamorphic DVD encode would be R.601, and any HD encode without cropping (720p or 1080p) would be R.709, but what about an encode such as 1280x544 which has dimensions that sit somewhere in between? Which colors would a standalone device use when playing those sorts of encodes? If it's always R.601 (as per a PC), maybe I should be color converting many of my 720p encodes?

Last edited by yetanotherid; 13th May 2011 at 09:58.
Old 13th May 2011, 08:55   #11  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
One other thing I'm still vague on, is whether colour information is, or can be, stored in a HD video stream the same as it is for mpeg2 video and whether standalone devices pay any attention to it (the PC doesn't seem to).

So far I've only been able to find HD color information in the form of "Color Primaries", "Transfer Characteristics" and "Matrix Coefficients", which I understood as being different to the way color information is stored for mpeg video, and being more a "suggestion" which most standalone devices ignore. Is this the only way color info is stored for HD video?

What I'm trying to understand is, if I write the same info to my x264 MKV encodes, would it ensure a standalone device uses those colors? For instance, if I encoded a DVD and added R.709 color properties to the video stream, would a standalone device then use R.709 or would it happily use its own color rules just like a PC?
I'm primarily interested in what an MKV capable BluRay player would do, as I'm about as likely to author a BluRay compliant disc as I am a standard DVD video disc instead of using AVI.
Old 18th May 2011, 06:12   #12  |  Link
pwnsweet
Registered User
 
Join Date: Nov 2008
Posts: 101
Hi yetanotherid,

I'm still a noob at video editing, encoding, color conversion etc, but I'm learning, and I'm learning fast, so please excuse some of the lol-moments in this post. This post will be long, but I'm striving to make it as easy to read as possible, so you shouldn't have to spend your time 'thinking' about or deciphering what I've written. I'll start by saying that I'm so glad you performed this investigation into the oddities of color conversion. I was having the same problem a few days ago, when I first encountered this issue with changing colors in my own encodes and had no idea why it was happening. After reading this thread through a few times I decided to do some testing of my own.

My encode goal is to encode Blu-ray to ~720p using MeGUI and then store and watch the resultant mp4 on my Playstation 3. When I say ~720p, I mean roughly 921,600 pixels after border cropping, which means my resulting encode is not always 720 vertical lines. In fact, I'm typically encoding at resolutions of 1504x640, 1536x640, 1280x720 or 1328x720 depending on the source's aspect ratio after cropping. I started noticing color issues when I was previewing my avs script in MeGUI. Greens and reds were different in the preview window to the source, and I just couldn't get the same colors as the source regardless of whether the color correction option in MeGUI was enabled or not.

What made it even more strange was that when I turned color correction off and encoded my movie, the resultant encoded movie was the same color as the source, despite the fact that the avs script preview window was not. Thoroughly confused, I decided it was time to start doing some serious research into the issue, which led me to discovering this thread. Subsequently, it was also at this point that I learnt, thanks to your testing, that the way Windows decides which color 'palette' to use when displaying a video clip is based on vertical resolution. After I picked myself up from falling off my chair after learning this (the fact Windows does this continues to amaze me), I decided to try to corroborate your results.

I grabbed my source .m2ts and wrote a simple AviSynth script (which I didn't know how to do 3 days ago) that simply resized the source to two different resolutions: 1120x480 and 1504x640. The script used the default Windows DirectShow framework to open the video, cropped the borders and then resized using spline36 (I'd never heard of that before 3 days ago either). That's it. I opened my source in one Daum PotPlayer window and my two scripts in two separate Daum PotPlayer windows and compared all three side-by-side. Here's where it gets interesting.

If I understand your results correctly, you claim that anything below 676 vertical lines is displayed in Windows in Rec.601. If this was correct, then my source should've exhibited Rec.709 colors and my two resized scripts should've exhibited Rec.601 colors... but they didn't. Only the 1120x480 had Rec.601; the other two had Rec.709.

Next I kept decreasing the vertical resolution of my 1504x640 script until it too exhibited Rec.601 colors, and found that to occur when the vertical resolution was 577. At 577 vertical lines the colors switched from Rec.709 to Rec.601 and looked the same as the 1120x480 resize. Just for fun, I then increased the horizontal resolution to 1920 (so 1920x577) and opened that up in Daum PotPlayer. Besides the fact that the aspect ratio was completely out (which I corrected within PotPlayer), the colors were still Rec.601. So clearly horizontal resolution makes no difference to Windows' choice.

So this got me thinking: if I need to encode to a vertical res of 577 or less, would color correction fix the colors? I tested that next. Sure enough, after color correction had been performed with ColorMatrix the colors matched the source. Then I remembered my end game - the Playstation 3. I wasn't going to be watching this stuff on my PC after all, so I wondered how the Playstation 3 decides which colors to use. I believe this might also partially answer your question about how standalone players decide which colors are used. So I encoded a short clip at 1120x480 and at 1504x640 (both without color correction) and transferred them, along with the source, over to my PS3. I suspected they would all exhibit the same colors, and they did. Why this is, I have no idea. Based on preliminary research (and this is by no means meant to be taken as fact) it appears the PS3 XMB (the main menu from which mp4 files can be opened) is in RGB color and anything launched from that menu is converted to RGB. Whether this is true, and whether it explains why they all had the same color, is not something I can speak on with any authority. I have absolutely no knowledge of YV12->RGB color conversion and how Rec.601/709 factors into it. What I do know with 100% certainty, however, is that they all had the same color. This was confirmed by two other people in the room at the time.

And so my epic post comes to a close. It was nice not writing in such a formal tone for once. Everything just came out easier.

Last edited by pwnsweet; 18th May 2011 at 06:28.
pwnsweet is offline   Reply With Quote
Old 18th May 2011, 08:43   #13  |  Link
GodofaGap
Registered User
 
Join Date: Feb 2006
Posts: 823
Quote:
Originally Posted by pwnsweet View Post
I have absolutely no knowledge in regards to YV12->RGB color conversion and how Rec.601/709 factors into that conversion.
YUV<->RGB conversion is the whole crux of the 601/709 issue. It's those coefficients that determine how the conversion is done. If we never converted between YUV and RGB we wouldn't need the 601/709 recommendations.
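To make that concrete, here are the two luma equations side by side (coefficients taken straight from the two recommendations; full-range RGB in 0-1 used for simplicity):

```python
def luma_601(r: float, g: float, b: float) -> float:
    # Rec.601 luma weights
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r: float, g: float, b: float) -> float:
    # Rec.709 luma weights
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# The same pure-green pixel gets a noticeably different luma under
# each recommendation, which is why decoding with the wrong matrix
# visibly shifts greens and reds:
print(luma_601(0, 1, 0))  # 0.587
print(luma_709(0, 1, 0))  # 0.7152
```

The Cb/Cr equations differ in the same way, so a decoder that picks the wrong set reconstructs different RGB from identical YCbCr data.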

Quote:
So far I've only been able to find HD color information in the form of "Color Primaries", "Transfer Characteristics" and "Matrix Coefficients", which I understood as being different to the way color information is stored for mpeg video, and being more a "suggestion" which most standalone devices ignore. Is this the only way color info is stored for HD video?
Yes, these are just flags, but that's no different for MPEG-2. We can only hope that players respect things like frame rate and PAR flags, but no player is really obliged to. It's no different for color flags, no matter what format you have.
GodofaGap is offline   Reply With Quote
Old 18th May 2011, 08:56   #14  |  Link
TheSkiller
Registered User
 
Join Date: Dec 2007
Location: Germany
Posts: 632
Quote:
Originally Posted by yetanotherid View Post
For instance if I encoded a DVD and added R.709 color properties to the video stream, would a standalone device then use R.709, or would it happily use its own color rules just like a PC?
DVDs are always Rec.601, it does not matter what colorimetry you set. Usually no flag is set or the Rec.601 one is set.
TheSkiller is offline   Reply With Quote
Old 18th May 2011, 08:59   #15  |  Link
pwnsweet
Registered User
 
Join Date: Nov 2008
Posts: 101
Quote:
Originally Posted by GodofaGap View Post
YUV<->RGB conversion is the whole crux of the 601/709 issue. It's those co-efficients that determine how the conversion is done. If we never converted between YUV and RGB we wouldn't need 601/709 recommendations.
I see...So let me get this straight - The color information is stored as YUV in the video file and then converted to RGB when displayed on a PC monitor or HDTV and the resultant colors displayed on the screen depend on whether that conversion utilized Rec.601 or Rec.709. Is that right?
pwnsweet is offline   Reply With Quote
Old 18th May 2011, 09:25   #16  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Quote:
Originally Posted by pwnsweet View Post
I see...So let me get this straight - The color information is stored as YUV in the video file and then converted to RGB when displayed on a PC monitor or HDTV and the resultant colors displayed on the screen depend on whether that conversion utilized Rec.601 or Rec.709. Is that right?
Yes.
So far the general consensus is that every SD DVD player uses Rec.601 regardless of any color info in the mpeg2 stream to the contrary. Then again though, the general consensus seems to be Windows switches colors at 720 vertical lines, and we're both in agreement that's incorrect (even if we don't yet agree exactly where it does switch).
I'd guess.... and hope.... that any HD capable device would switch to Rec.709 for anything higher than DVD resolution (and ignores the color info in the video stream), as at least then it'd be predictable even if Windows doesn't do the same thing (and it appears your PSP works that way).
It might even be safe to assume a standalone device uses Rec.709 for HD regardless of any color information, but I'm not aware of anyone here, aside from yourself, having tested their non-PC playback device for the colors it uses according to definition or whether it pays any attention to the color information in the video stream. So far it's all "in theory" and "fingers crossed" but nobody has yet posted to say they've tested their standalone device and "this is what it actually does". At least not that I'm aware of.

Last edited by yetanotherid; 18th May 2011 at 09:27.
yetanotherid is offline   Reply With Quote
Old 18th May 2011, 09:36   #17  |  Link
Ghitulescu
Registered User
 
Ghitulescu's Avatar
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Quote:
Originally Posted by TheSkiller View Post
DVDs are always Rec.601, it does not matter what colorimetry you set. Usually no flag is set or the Rec.601 one is set.
Not always; there are some digitized analog tapes that use another colorimetry. I fail to remember its name right now, but it was in use before .601 came into force.
I'm not sure, however, whether the DVD player simply uses .601 for all video material (irrespective of its colorimetry) or really observes the colorimetry and outputs the colours as intended.
__________________
Born in the USB (not USA)
Ghitulescu is offline   Reply With Quote
Old 18th May 2011, 10:31   #18  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Quote:
Originally Posted by Ghitulescu View Post
Not always, there are some digitized analog tapes that use another colorimetry, I fail to remember its name right now, but it was used before the .601 came into force.
Are these digitised analogue tapes playable in a DVD player or are you just confusing the issue with irrelevancies?
yetanotherid is offline   Reply With Quote
Old 18th May 2011, 12:05   #19  |  Link
pwnsweet
Registered User
 
Join Date: Nov 2008
Posts: 101
Quote:
Originally Posted by Ghitulescu View Post
there are some digitized analog tapes that use another colorimetry, I fail to remember its name right now, but it was used before the .601 came into force.
No offense or anything, and I say this in the nicest possible way, but I fail to see how the colorimetry of analog tapes is relevant to the discussion or to the colorimetry of DVDs and HD material.
pwnsweet is offline   Reply With Quote
Old 20th May 2011, 01:31   #20  |  Link
mpucoder
Moderator
 
Join Date: Oct 2001
Posts: 3,530
To get technical, there are more than two colorimetry standards that may be used. MPEG-2 supports SMPTE-170M (aka 601), SMPTE-240M, ITU-709, and ITU-470 (M, B, and G). DVD is limited to the subset of ITU-470 and SMPTE-170M - no use of 709 for DVDs. DVD's colorimetry and geometry are based on analog-to-digital conversions. HD departs from those old standards, using pixels with a 1:1 aspect ratio and ITU-709 colorimetry. Since HD begins at 720p it is fair to assume that video of 720p or higher will use 709. Lower resolutions could use any of the previously mentioned colorimetries. Many programs, even mine (so far), make this assumption, but all video streams used by DVD, HD DVD, Blu-ray, and ATSC carry colorimetry information which should not be ignored unless you know the encoder was set incorrectly. That should not happen unless the colorimetry information gets lost or ignored prior to encoding.
Colorimetry consists of three factors:
1) color primaries (what exactly is red, green, and blue)
2) transfer characteristics (aka gamma, although only ITU-470 uses simple gamma. Computers use a transfer characteristic called sRGB which is not used by motion video encoders afaik - see http://mpucoder.com/Carbon/Composito...ns/index.shtml for a comparison of the curves)
3) matrix coefficients - how to convert YUV (YCbCr) from/to RGB
As you can see, even RGB video has two of these factors (and there are even more colorimetries available, e.g. Cineon).
SMPTE-170M and ITU-709 use the same transfer characteristics, but differ slightly in the color primaries and greatly in the matrix coefficients.
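That last point can be demonstrated numerically. A minimal sketch (full-range values, no chroma subsampling, and only the matrix-coefficient factor of colorimetry, ignoring primaries and transfer): build both RGB->YCbCr matrices from their luma coefficients, then decode a 709-encoded pixel with the 601 matrix, as a player guessing wrong would:

```python
import numpy as np

def rgb_to_ycbcr_matrix(kr: float, kb: float) -> np.ndarray:
    """RGB -> YCbCr matrix (full range) built from the two luma
    coefficients; kg follows because the three weights must sum to 1."""
    kg = 1.0 - kr - kb
    return np.array([
        [kr, kg, kb],                                        # Y
        [-kr / (2 * (1 - kb)), -kg / (2 * (1 - kb)), 0.5],   # Cb
        [0.5, -kg / (2 * (1 - kr)), -kb / (2 * (1 - kr))],   # Cr
    ])

M170 = rgb_to_ycbcr_matrix(0.299, 0.114)    # SMPTE-170M / Rec.601
M709 = rgb_to_ycbcr_matrix(0.2126, 0.0722)  # ITU-709

# Encode a saturated green with 709, then decode it with 601 --
# the mistake a player makes when it assumes the wrong matrix:
rgb_in = np.array([0.0, 1.0, 0.0])
rgb_out = np.linalg.inv(M170) @ (M709 @ rgb_in)
print(rgb_out)  # green overshoots 1.0 and red/blue leak in

# Decoding with the matching matrix round-trips cleanly:
assert np.allclose(np.linalg.inv(M709) @ (M709 @ rgb_in), rgb_in)
```

Tools like ColorMatrix in effect apply this kind of 3x3 product (one matrix composed with the inverse of the other) directly in YCbCr space, so that a decode with the "wrong" matrix still lands on the intended colours.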
mpucoder is offline   Reply With Quote