Old 15th November 2017, 11:58   #47177  |  Link
chros
Registered User
 
Join Date: Mar 2002
Posts: 2,323
How does HDR10 work on UHD BD?

I started to fool around with UHD BD encodes (x265 10-bit HDR10) on my 3DLUT-calibrated SDR setup (see my signature below) and I realized that I don't understand a lot of things about how it all works. (Please bear with me, I don't have the necessary hardware to play with HDR displays, and I didn't find any useful article about this.)

Quote:
Originally Posted by Manni View Post
The HDR10 mandatory layer plays just as well, and if you don't have a 12bits display or content handling ICtCp (which UHD Bluray doesn't) the differences are minimal.
...
Given the excellent quality of MadVR's dithering, I suspect the PQ difference from the higher bit depth (especially from a 1080p layer) would be close to nil. And dynamic metadata makes a bigger difference on low-end displays (those with a limited native contrast).
1. Data
Picture data is stored as bt.2020 (???) 10-bit 4:2:0, or rather DCI-P3 inside a bt.2020 container (???), resulting in a washed-out picture.
- Wait, what? A washed-out picture?
- So how does a true red become some greyish color when it's stored?
- I thought we got a truly increased color space compared to bt.709. Is this a similar kind of cheating as with those old anamorphic DVDs? (I tried to work it out for myself in the sketch below.)
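
To check whether I understand the container idea at all, here's a quick numpy sketch (rounded standard primaries for P3-D65 and bt.2020, so only approximate) of where a fully saturated DCI-P3 red ends up once it's re-expressed in bt.2020. If a display then treats those numbers as its own native red, the color sits well inside the gamut, which I guess is where the "washed out" look comes from:

Code:
# Minimal sketch (numpy only): where does a pure DCI-P3 red end up
# when it is expressed in the bt.2020 container?
import numpy as np

# Standard RGB -> XYZ matrices (D65 white point), values rounded
P3_TO_XYZ = np.array([[0.48657, 0.26567, 0.19822],
                      [0.22898, 0.69174, 0.07929],
                      [0.00000, 0.04511, 1.04394]])
BT2020_TO_XYZ = np.array([[0.63696, 0.14462, 0.16888],
                          [0.26270, 0.67800, 0.05930],
                          [0.00000, 0.02807, 1.06098]])

p3_red = np.array([1.0, 0.0, 0.0])               # fully saturated P3 red (linear light)
xyz = P3_TO_XYZ @ p3_red                         # the absolute color of that red
bt2020_rgb = np.linalg.inv(BT2020_TO_XYZ) @ xyz  # the same color expressed in bt.2020

print(bt2020_rgb)   # ~[0.754, 0.046, -0.001]: no longer (1, 0, 0)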
What does the static HDR10 metadata look like?
- is it stored at the beginning of the HEVC video data?
- is it just a couple of properties (a few bytes) about the mastering attributes and picture levels, and "that's it"? (My guess is right below.)
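
My current (possibly wrong) picture of the static metadata, pieced together from the x265 docs and the HEVC spec: it really is just two tiny SEI messages carried inside the video stream, repeated so they survive seeking. The values below are for a hypothetical P3/D65 master graded to 1000 nits, not taken from any real disc:

Code:
# My (possibly wrong) understanding of static HDR10 metadata: two small SEI
# messages in the HEVC stream. Example values for a hypothetical P3/D65
# master graded to 1000 nits.

mastering_display_colour_volume = {        # SMPTE ST 2086 SEI
    "display_primaries_GBR_x_y": [(13250, 34500), (7500, 3000), (34000, 16000)],
    "white_point_x_y": (15635, 16450),     # chromaticities in 0.00002 increments
    "max_display_mastering_luminance": 10000000,  # 0.0001 cd/m2 units -> 1000 nits
    "min_display_mastering_luminance": 1,         # -> 0.0001 nits
}

content_light_level_info = {               # CTA-861.3 content light level SEI
    "max_content_light_level": 1000,       # MaxCLL, cd/m2
    "max_pic_average_light_level": 400,    # MaxFALL, cd/m2
}

# The same information as x265 takes it on the command line:
# --master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)"
# --max-cll "1000,400"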

2. Processing
The splitter and decoder pass the data to the renderer (MadVR) for further processing:
- that probably still happens in the "limited-range, washed-out picture" world
- then either passthrough happens, or HDR processing (tone mapping) for SDR displays
-- what does bt.2020 -> DCI-P3 mean in the OSD? (My guess follows below.)
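
My guess (please correct me): that OSD line just means MadVR converts the pixels, in linear light, from the bt.2020 container primaries back to the DCI-P3 primaries the content was actually mastered in (and that my 3DLUT expects). Roughly like this, reusing the rounded matrices from the sketch above:

Code:
# Rough sketch of a bt.2020 -> DCI-P3 gamut conversion in linear light.
import numpy as np

BT2020_TO_XYZ = np.array([[0.63696, 0.14462, 0.16888],
                          [0.26270, 0.67800, 0.05930],
                          [0.00000, 0.02807, 1.06098]])
P3_TO_XYZ = np.array([[0.48657, 0.26567, 0.19822],
                      [0.22898, 0.69174, 0.07929],
                      [0.00000, 0.04511, 1.04394]])

# One combined 3x3 matrix: bt.2020 RGB -> XYZ -> DCI-P3 RGB
BT2020_TO_P3 = np.linalg.inv(P3_TO_XYZ) @ BT2020_TO_XYZ

pixel_2020 = np.array([0.754, 0.046, 0.0])   # the "stored" red from the sketch above
print(BT2020_TO_P3 @ pixel_2020)             # ~(1.0, 0.0, 0.0): full P3 red again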

3. Displaying via passthrough
Let's suppose that I understand how to calibrate an HDR display (which I don't). Compared to SDR displays, how does the device behave with HDR10? (TVs, projectors)
- does it automatically increase/decrease "brightness" during playback? If so, can the device do it for certain regions only, or just for the whole display? (especially if my assumptions about HDR10 in point 1 are true)
-- e.g. my SDR display is set to ~120 nits, but can an HDR display really operate at 400 nits for long stretches? If so (again), how can you calibrate it at all? (My stab at the PQ math is below.)
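
This is what confuses me the most. As far as I understand, the PQ curve (SMPTE ST 2084) encodes absolute luminance, so a given code value always means the same number of nits, no matter which display shows it. A minimal sketch with the constants from the standard:

Code:
# Minimal sketch of the PQ (SMPTE ST 2084) EOTF: normalized code value -> nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """e = non-linear PQ signal in 0..1 (10-bit code / 1023 for full range)."""
    p = e ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y   # PQ is defined up to 10,000 cd/m2

print(pq_to_nits(0.50))   # ~92 nits
print(pq_to_nits(0.75))   # ~983 nits: 3/4 of the signal range is already ~1000 nits
print(pq_to_nits(1.00))   # 10000 nits

If that's right, then calibrating an HDR display isn't about picking one brightness level at all, but about tracking this curve up to whatever the panel can actually do, and that's exactly the part I don't get yet.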

Quote:
Originally Posted by Manni View Post
The optional Dolby Vision layer is 1080p only, and the only thing it adds is 12bits over 10bits as well as dynamic metadata.
Really? Only 1080p of 12-bit data? (I assume 4:2:0, again.) That definitely sounds like cheating!

I think this is enough questions for starters. Thanks!
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config

Last edited by chros; 15th November 2017 at 12:00.