BT.601 and BT.709 compatibility benchmark



Without getting into much detail, two main color spaces exist nowadays for video: BT.601 and BT.709. If a video is encoded using one of them and then decoded with the other, a slight color shift will occur (see the image in this post, which is split into 3 rows). Most people may not even notice it, but areas with blue and green colors are affected the most, and if you want your video to be reproduced the way you mastered it, you have to take some things into consideration.
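To see why the shift happens, here is a rough Python sketch (my own illustration, not part of the benchmark): it encodes an RGB color to Y'CbCr with the BT.709 luma coefficients and decodes it with the BT.601 ones. A pure green comes back visibly wrong.

    # Standard luma coefficients: BT.601 Kr=0.299, Kb=0.114;
    # BT.709 Kr=0.2126, Kb=0.0722.
    def rgb_to_ycbcr(r, g, b, kr, kb):
        y = kr * r + (1 - kr - kb) * g + kb * b
        cb = (b - y) / (2 * (1 - kb))
        cr = (r - y) / (2 * (1 - kr))
        return y, cb, cr

    def ycbcr_to_rgb(y, cb, cr, kr, kb):
        r = y + 2 * (1 - kr) * cr
        b = y + 2 * (1 - kb) * cb
        g = (y - kr * r - kb * b) / (1 - kr - kb)
        return r, g, b

    BT601 = (0.299, 0.114)
    BT709 = (0.2126, 0.0722)

    # Pure green, encoded as BT.709 but decoded as BT.601:
    y, cb, cr = rgb_to_ycbcr(0.0, 1.0, 0.0, *BT709)
    print(ycbcr_to_rgb(y, cb, cr, *BT601))  # ~ (0.08, 1.17, 0.03) instead of (0, 1, 0)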

That's why I made this benchmark, testing video editing applications like Adobe Suite and Sony Vegas, video players and decoders such as Windows Media Player, QuickTime on a Mac, Adobe Flash Player, VLC and XBMC, and browsers with MP4 playback capabilities such as Chrome and Safari. If you are curious about their behavior, read below!

Possible behaviors


A video using the H.264 codec can carry flags that let the player know the color space that was used when it was encoded, so that the same color matrix is used in decoding. Unfortunately, not all test candidates paid attention to those flags; most of them ignored them. Another way to decide is to use BT.601 for SD video (resolutions up to 576 lines vertically) and BT.709 for HD video. And yet another way is to stick with just one color matrix (the worst way). The first two strategies combined are sketched below.
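As a sketch, the "good" strategy boils down to a few lines of Python (the function and field names here are illustrative, not a real player API):

    # Decide which matrix to decode with: trust the flag if present,
    # otherwise guess from the vertical resolution.
    def pick_matrix(tagged_matrix, height):
        if tagged_matrix is not None:
            return tagged_matrix            # flag present: use it
        return "BT.709" if height > 576 else "BT.601"

    print(pick_matrix(None, 480))       # BT.601 (untagged SD)
    print(pick_matrix(None, 1080))      # BT.709 (untagged HD)
    print(pick_matrix("BT.601", 1080))  # BT.601 (flag wins over size)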

Test creation

In order to test the above, I created a "ground truth" sRGB video and then encoded it to H.264 using all the possible combinations: SD vs HD, BT.601 vs BT.709, tagged vs untagged. "Tagged" are the clips where I stored the flags describing the color space used during encoding and "untagged" are the videos without these flags. The flags I set are: "Color primaries", "Transfer characteristics" and "Matrix coefficients".

Encoding was performed using MeGUI, x264 and Avisynth. Using Avisynth I converted the colors from RGB to the YV12 color space using BT.601 or BT.709 (e.g. ConvertToYV12(matrix="rec709")). Using x264 (inside MeGUI), I encoded the video stream and either "tagged" it with the color matrix I used earlier or left it untagged (see the sketch below). I also used Sony Vegas and Adobe Media Encoder to test their behavior in encoding.
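For reference, here is roughly what one tagged encode looks like, driven from Python. This is a sketch under some assumptions: x264 is on the PATH and built with Avisynth input support, truth.avi stands for the ground-truth clip, and the raw .264 stream is muxed into MP4 separately (MeGUI handles that).

    import subprocess

    # Avisynth script: convert the RGB ground truth to YV12 with BT.709.
    with open("encode.avs", "w") as f:
        f.write('AviSource("truth.avi").ConvertToYV12(matrix="rec709")')

    # Encode and write the three VUI tags into the stream. Dropping the
    # three --color* options produces the "untagged" variant.
    subprocess.run([
        "x264", "encode.avs", "-o", "tagged_709.264",
        "--colormatrix", "bt709",   # Matrix coefficients
        "--colorprim",   "bt709",   # Color primaries
        "--transfer",    "bt709",   # Transfer characteristics
    ], check=True)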

The comparison was done by taking screenshots on Mac and using FRAPS on Windows. On both systems, an sRGB profile was used for the monitor.

You can download the video benchmark files here. You can use MediaInfo to see the details of the files (flags, resolution etc.).

Results

Windows Media Player

If the default decoder is used for H.264, then tags are read. If the video is untagged, BT.601 is used for SD video and BT.709 is used for HD.

If the decoder is ffdshow and it outputs YV12, tags are ignored: BT.601 is used for SD video and BT.709 for HD. If RGB32 is set as the output in ffdshow, the YV12→RGB conversion is done in software and the tags are read. If the video is untagged, BT.601 is used for SD video and BT.709 for HD.

VLC in Windows

Tags are ignored. BT.601 is used for SD video and BT.709 is used for HD.

XBMC in Windows

If DXVA2 is enabled then tags are read. If the video is untagged, BT.601 is used for SD video and BT.709 is used for HD.

If DXVA2 is not enabled, tags are ignored and BT.601 is used for SD video and BT.709 is used for HD.


QuickTime Player on a Mac

QuickTime Player on a Mac reads the tags of the video and uses the appropriate color matrix. If the video is untagged, it always uses BT.601.

A strange thing I noticed is that when the video is tagged BT.709, the image decoded by QuickTime is a little different from what all the other software produces. That means that either the BT.709 color primaries that QuickTime uses are right and the other software is wrong, or the opposite. When "Color primaries" is tagged as SMPTE 240M (almost the same as BT.709), the image decoded by QuickTime looks the same as the original.

Video players on a Mac

I tested VLC and XBMC. They ignore tags and always use BT.601.


Flash Player 11

Flash videos are everywhere. I did this experiment on YouTube (1, 2, 3) and Vimeo, and tested it on a Mac and on a PC because Flash Player behaved differently on each! Flags could not be tested, because YouTube removes them when it re-encodes the video (but it keeps the initial color space).

Flash Player on PC: always uses BT.601. When the video is in full screen and accelerated video rendering is used, BT.709 is always used instead. This means that your video will look different in full screen! Vimeo remained BT.601 in full screen.

Flash Player on Mac: plays SD videos using BT.709 and HD videos using BT.601. This is the reverse of what it should be! It means that a color shift will occur when changing YouTube's player quality from 720p (HD) to 480p or lower.

I also noticed a strange behavior in Snow Leopard, where I did the tests: if I have a browser with a YouTube video open (with Flash Player) and then open a second or even third browser, the latter will use BT.709 for everything.

Chrome playing .mp4 (Mac/Windows)

I dragged an MP4 video into the browser (Flash was not used). Tags are ignored. It always uses BT.601.

Safari playing .mp4 (Mac)

I dragged an MP4 video into the browser (Flash was not used). It has exactly the same behavior as QuickTime on a Mac (obviously QuickTime is used by Safari).

iPhone

Whether you're playing a video from the YouTube app, from Safari (YouTube mobile) or from the Dropbox app, QuickTime is used. So the behavior is the same as QuickTime Player on a Mac.

Adobe Suite CS5.5

Adobe Media Encoder behaves the same as After Effects and Premiere when importing and exporting video. It ignores the flags and uses BT.601 for SD video and BT.709 for HD. This means that if you import an HD video, downscale it to SD and export it, it will do the correct 709→601 transformation. The only problem is that if you have an HD video using BT.601 or an SD video using BT.709 (very rare), it will be imported wrong. One thing to note here is that Adobe Suite CS5 behaved differently: it used BT.601 for everything.

Sony Vegas

Behaves the same as Adobe Suite: it ignores flags and decides by resolution. One thing to note in Vegas is that it uses full range (0-255) when importing and exporting MP4/MPG. This means that if you want your video to be limited range (16-235), you have to do it yourself. Just use "Sony Color Corrector" on your main video bus and set it to "Computer RGB to Studio RGB" for a proper export. The scaling this preset performs is sketched below.
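As a sketch in Python, assuming the standard 16-235 studio range, "Computer RGB to Studio RGB" does the following per channel:

    # Compress full-range 0-255 into the 16-235 studio range.
    def computer_to_studio(v):
        return round(16 + v * 219 / 255)

    print(computer_to_studio(0))    # 16
    print(computer_to_studio(255))  # 235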

Conclusion

As you might have guessed, each piece of software behaves differently. The best behavior is to use the video's flags and, if the video is untagged, to use BT.601 for SD and BT.709 for HD. This is what Windows Media Player and XBMC on Windows do. QuickTime on Mac behaved almost perfectly. Flash was a disaster: there is no way to guarantee what the end user will see. I hope Adobe fixes this in future versions.

Regarding the flags, "Transfer characteristics" did not have an effect on any player. "Matrix coefficients" is the one that matters for proper BT.601/BT.709 reproduction, and it was read by QuickTime (Mac), Windows Media Player and XBMC on Windows, as I have already stated. "Color primaries" had an effect only on QuickTime, but it needed to be flagged as "SMPTE 240M" (when "Matrix coefficients" was flagged as BT.709) to give the same image as other players. Setting "Color primaries" to BT.709 produced more contrast in the output! Very strange...

My original post on my blog:
http://www.wiggler.gr/2012/02/27/bt-...compatibility/
Useful links:
http://www.curtpalme.com/forum/viewtopic.php?t=21870
http://www.theasc.com/magazine/april...um2/page6.html
http://forum.doom9.org/showthread.php?t=133982
http://avisynth.org/mediawiki/Color_conversions
http://mewiki.project357.com/wiki/X2...gs#colormatrix