14th November 2016, 16:37   #40092
Klownicle
Registered User

Join Date: Jan 2004
Posts: 16
Can someone help me better understand the current state of HDR playback on PCs, and specifically how it works with madVR?

When I play an HDR movie, I configure madVR to send the HDR signal directly to the TV, which is what I'd expect to do since my whole chain is HDR capable. Following suggestions I found online, I tried setting the calibration in madVR to BT.709 and to BT.2020, but that made no visible difference. I've tried configuring my desktop to output 4:2:2 @ 24 Hz at 10-bit (and 12-bit). I can't seem to get 4:4:4 at 10-bit on the Windows desktop, but I believe I read that isn't possible with consumer graphics cards. For the same reason I've set madVR to use D3D11 rendering, since consumer cards apparently can't output 4:4:4 at 10-bit on the desktop but can in fullscreen exclusive mode. The TV input for the PC is set to UHD Color. I've also tried the Standard and Movie picture modes to see if they changed how the TV interprets the signal, and I've set madVR's display mode switcher to include 2160p24 and 2160p25.
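
For what it's worth, here's a quick back-of-envelope bandwidth check I did (a sketch based on the CTA-861 timings as I understand them, so don't take the numbers as gospel):

[CODE]
# HDMI 2.0 tops out at a 600 MHz TMDS character rate (18 Gbps).
# Deep color scales the TMDS clock by bits_per_component / 8 for RGB/4:4:4.
HDMI20_MAX_TMDS_MHZ = 600.0

# CTA-861 total timings (h_total, v_total, refresh) for 3840x2160:
MODES = {"2160p24": (5500, 2250, 24), "2160p60": (4400, 2250, 60)}

def tmds_clock_mhz(h_total, v_total, hz, bpc):
    pixel_clock_mhz = h_total * v_total * hz / 1e6
    return pixel_clock_mhz * bpc / 8  # deep color scaling

for name, (ht, vt, hz) in MODES.items():
    for bpc in (8, 10, 12):
        clk = tmds_clock_mhz(ht, vt, hz, bpc)
        verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
        print(f"{name} 4:4:4 {bpc}-bit: {clk:.2f} MHz -> {verdict}")
[/CODE]

By that maths, 2160p24 at 4:4:4 10-bit (371.25 MHz) fits comfortably within HDMI 2.0, so the fact that I can't select it on the desktop looks like a driver restriction rather than a bandwidth one.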

When I play the same MKV files directly on the Samsung K8500 player, the TV detects them as HDR video and they look fine.

The only way I can get HDR videos to look right on the PC is to use madVR's HDR-to-SDR conversion. As far as I can tell, the result is then reasonably close to what the K8500 produces.
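
My rough mental model of what that HDR-to-SDR option has to do is something like the sketch below. To be clear, this is NOT madVR's actual algorithm, just the general idea: decode the PQ (SMPTE ST 2084) curve back to absolute luminance, roll off the highlights, and re-encode for an SDR display. The tone-mapping operator here is an extended Reinhard curve I picked purely for illustration:

[CODE]
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a PQ-encoded signal (0..1) to absolute luminance in cd/m2."""
    p = np.power(np.clip(e, 0.0, 1.0), 1 / M2)
    return 10000 * np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)

def tone_map_to_sdr(e, hdr_peak=1000.0, sdr_white=100.0):
    """Compress HDR luminance into an SDR (gamma 2.2) signal."""
    x = pq_to_nits(e) / sdr_white           # luminance relative to SDR white
    xmax = hdr_peak / sdr_white
    y = x * (1 + x / xmax ** 2) / (1 + x)   # extended Reinhard roll-off
    return np.power(np.clip(y, 0.0, 1.0), 1 / 2.2)
[/CODE]

(Applied per channel here for simplicity; a real renderer would work on luminance and also handle the BT.2020-to-BT.709 gamut conversion.)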

I have a Samsung KS8000, which supports HDR; when I play an HDR movie from my Samsung K8500 UHD Blu-ray player, the TV reports "An HDR Video is Playing." I never see that message when the source is the PC. The Blu-ray player's HDMI input also has UHD Color enabled, which is needed for the TV to detect the HDR video. In between sits a Marantz SR5010 receiver that supports all the latest gizmos (HDMI 2.0a, HDCP 2.2, 4K), and the chain is topped off with a GTX 970 (I've also tried a GTX 1070).

I tried the GTX 1070 because I read that the GTX 970 doesn't support HDMI 2.0a, which is the revision that introduced support for HDR metadata. However, with the GTX 1070 I noticed no difference apart from hardware HEVC decoding (as expected).
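
As I understand it, what 2.0a actually adds is the "Dynamic Range and Mastering" InfoFrame from CTA-861.3, which carries static metadata roughly like this (field names are mine, not from any particular API):

[CODE]
from dataclasses import dataclass

@dataclass
class HdrStaticMetadata:
    eotf: int                      # 2 = SMPTE ST 2084 (PQ)
    # SMPTE ST 2086 mastering display color volume:
    primaries: tuple               # ((Rx,Ry), (Gx,Gy), (Bx,By)) chromaticities
    white_point: tuple             # (Wx, Wy)
    max_display_luminance: float   # cd/m2, e.g. 1000
    min_display_luminance: float   # cd/m2, e.g. 0.05
    # Content light levels (CTA-861.3):
    max_cll: int                   # brightest pixel in the stream, cd/m2
    max_fall: int                  # highest frame-average light level, cd/m2
[/CODE]

If that InfoFrame never reaches the TV, the set has no way of knowing the incoming signal is PQ-encoded and treats it as ordinary gamma, which would explain both the missing HDR notice and the flat picture on the PC input.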

I've also tried hooking the PC directly up to the TV, bypassing the receiver, and the results are the same.

Is the answer as simple as madVR not being able to send the HDR metadata yet? I do see an option for that in madVR, but it says I need to turn something on manually, and I'm not sure what it's referring to. The TV also has an HDR+ mode, but I haven't fiddled with that; I believe HDR+ is for simulating an HDR effect with standard content. As it stands, when I send HDR to the TV from the PC, the colors are off (drab and washed out, and very grainy) unless I use madVR's HDR-to-SDR conversion.
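
One thing that might help with diagnosing this: the TV advertises its HDR support in the EDID via an HDR Static Metadata Data Block (CTA-861.3, extended tag 0x06), and enabling UHD Color on an input is supposedly what exposes it. Here's a rough sketch of the check (actually reading the raw EDID bytes out of Windows is left out, and I haven't battle-tested this):

[CODE]
def sink_supports_st2084(edid: bytes) -> bool:
    """Scan the EDID's CTA-861 extension blocks for PQ (ST 2084) support."""
    if len(edid) < 128:
        return False
    for b in range(1, edid[126] + 1):          # byte 126 = extension count
        ext = edid[128 * b: 128 * (b + 1)]
        if len(ext) < 4 or ext[0] != 0x02:     # 0x02 = CTA-861 extension tag
            continue
        i, end = 4, ext[2]                     # data blocks run up to the DTDs
        while i < end:
            tag, length = ext[i] >> 5, ext[i] & 0x1F
            # Extended-tag block (7) with extended tag 0x06 = HDR static metadata
            if tag == 7 and length >= 2 and ext[i + 1] == 0x06:
                return bool(ext[i + 2] & 0x04)  # bit 2 = SMPTE ST 2084 EOTF
            i += 1 + length
    return False
[/CODE]

If the EDID the graphics card sees (through the receiver or directly) doesn't contain that block, I'd guess no amount of fiddling on the PC side will make the HDR notice appear.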

Thanks for any insight!

Last edited by Klownicle; 14th November 2016 at 16:41.