3rd April 2018, 01:27   #50021
Calolo
Registered User
Join Date: Mar 2018
Posts: 1
Hello,

I am using MPC-HC, madVR and LAV filters for my HTPC (iGPU HD 630).
I have disabled all image correction settings in Intel drivers.

Setting decoding to D3D11 native results in an average rendering time of 25ms, mostly stable even with light scaling settings. With D3D9, the average rendering time was much better (5ms), but the maximum rendering time exceeded 25ms.
Shouldn't rendering be faster with D3D11?

I have not been able to watch the "The World in HDR" 4K 60fps 10-bit demo:
http://4kmedia.org/the-world-in-hdr-uhd-4k-demo/
Even with bilinear scaling on my 1080p TV, the rendering is very slow. I tried EVR: the video is smooth, but HDR is not converted to SDR. Maybe the madVR HDR>SDR conversion is too heavy for the iGPU?
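
If I understand correctly, the HDR>SDR pass has to decode the PQ curve, tone map and re-encode every single pixel, which at 4K 60fps is roughly 500 million pixels per second. Here is a rough Python sketch of one pixel's luminance path, just to illustrate the math involved (the tone-mapping curve is my own assumption, not madVR's actual algorithm, which also does gamut mapping and more):

[CODE]
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Decode a normalized PQ code value (0..1) to absolute luminance in cd/m2."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def tone_map(nits: float, peak_nits: float = 1000.0, sdr_white: float = 100.0) -> float:
    """Extended Reinhard curve (an assumption, not madVR's method):
    compresses 0..peak_nits into 0..1 relative to SDR white."""
    l = nits / sdr_white                  # luminance relative to SDR white
    l_max = peak_nits / sdr_white         # brightest HDR level in the same units
    return min(l * (1.0 + l / (l_max * l_max)) / (1.0 + l), 1.0)

def encode_gamma(linear: float, gamma: float = 2.2) -> float:
    """Re-encode linear SDR light to a display code value with a plain power gamma."""
    return linear ** (1.0 / gamma)

# One pixel's luminance path: PQ code -> nits -> tone-mapped -> SDR code value.
# A 4K 60fps stream needs this (plus gamut mapping etc.) ~500 million times per second.
for code in (0.25, 0.50, 0.75):
    nits = pq_to_nits(code)
    print(f"PQ {code:.2f} -> {nits:8.2f} nits -> SDR code {encode_gamma(tone_map(nits)):.3f}")
[/CODE]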

Using the HDMI port on the Intel iGPU does not allow 10-bit output, because the drivers always dither to 8-bit (I checked this with my AVR, which reports a 4:4:4 8-bit signal).
https://www.intel.com/content/dam/ww...hics-paper.pdf
Could we have an option in madVR to force 12-bit output?
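
For reference, this is what I mean by dithering down to 8-bit, as a rough Python sketch (I don't know the exact dither the Intel driver uses, this is just an illustration):

[CODE]
import random

def dither_10_to_8(sample_10bit: int, rng: random.Random) -> int:
    """Reduce a 10-bit sample (0..1023) to 8 bits (0..255) with random dithering.

    Instead of simply truncating the two lowest bits, noise of the same
    magnitude is added first, so the quantization error is spread across
    pixels/frames rather than turning into visible banding.
    """
    noise = rng.randint(0, 3)            # same magnitude as the two dropped bits
    return min((sample_10bit + noise) >> 2, 255)

rng = random.Random(0)
# A flat 10-bit level of 513 sits between 8-bit codes 128 and 129; dithering
# hits both so that the average level (513/4 = 128.25) is preserved.
samples = [dither_10_to_8(513, rng) for _ in range(10000)]
print(sum(samples) / len(samples))       # ~128.25 instead of a hard 128
[/CODE]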

Thank you very much.