Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 13th September 2018, 02:20   #52441  |  Link
Clammerz
Registered User
 
Join Date: Aug 2005
Posts: 54
Quote:
Originally Posted by YGPMOLE View Post
To keep the video signal untouched and to avoid unwanted colorspace conversion, I set the LAV Video decoder to output RGB24
You are actually doing the opposite of what you want to achieve: you are introducing an unwanted colorspace conversion. You are forcing LAV to upscale the chroma data to match the luma resolution, bypassing madVR, so your chroma settings in madVR have no effect at all.

Leave LAV colorspace options as default.
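As a rough sketch of what that forced conversion does (hypothetical NumPy code; a decoder forced to output RGB must first upsample 4:2:0 chroma itself, and real decoders use better interpolation than the nearest-neighbour replication shown here):

```python
import numpy as np

def upsample_chroma_nearest(y, u, v):
    """Upscale 4:2:0 chroma planes to luma resolution (4:4:4) by
    nearest-neighbour replication -- a crude stand-in for the chroma
    upsampling that happens before the renderer when the decoder is
    forced to output RGB."""
    u_full = np.repeat(np.repeat(u, 2, axis=0), 2, axis=1)
    v_full = np.repeat(np.repeat(v, 2, axis=0), 2, axis=1)
    return y, u_full, v_full

# a hypothetical 4x4 luma plane with 2x2 subsampled chroma (4:2:0)
y = np.arange(16, dtype=np.uint8).reshape(4, 4)
u = np.array([[100, 110], [120, 130]], dtype=np.uint8)
v = np.array([[90, 95], [105, 115]], dtype=np.uint8)

y2, u2, v2 = upsample_chroma_nearest(y, u, v)
print(u2.shape)  # (4, 4) -- chroma now matches luma resolution
```

Once this step has happened upstream, madVR's own (much higher quality) chroma upscaling has nothing left to operate on.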
Clammerz is offline   Reply With Quote
Old 13th September 2018, 02:21   #52442  |  Link
Megalith
Registered User
 
Join Date: Mar 2011
Posts: 96
I can't figure out why the 3DLUTs I get from DisplayCAL all result in a very dark picture. Source profile BT.709, destination profile default 2.2 Gamma, so it should be pretty straightforward (no HDR). The picture is easier to see on the profile without VCGT applied, though I heard that option should be checked.

Last edited by Megalith; 13th September 2018 at 02:27.
Megalith is offline   Reply With Quote
Old 13th September 2018, 03:40   #52443  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,278
In an effort to stay on topic, could you post this in the Display Calibration forum?
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 14th September 2018, 02:37   #52444  |  Link
YGPMOLE
Registered User
 
Join Date: Nov 2012
Posts: 25
@ Asmodian & Clammerz

Thank you for the explanation! I was totally wrong in thinking that the first colorspace conversion was done by the O.S. when MPC-BE plays the original YCbCr videos, because with LAV in default mode I got YV12 8-bit output: instead, YV12 is the pixel format used to store the YCbCr information. My bad (and my ignorance).

Asmodian, AMD drivers say RGB 4:4:4 Full RGB pixel format.
__________________
Best Regards! Leo!

Last edited by YGPMOLE; 14th September 2018 at 05:36.
YGPMOLE is offline   Reply With Quote
Old 14th September 2018, 17:10   #52445  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,278
Quote:
Originally Posted by YGPMOLE View Post
Asmodian, AMD drivers say RGB 4:4:4 Full RGB pixel format.
Of course they do.

What are you doing AMD?
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 14th September 2018, 17:35   #52446  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,085
What am I missing here?
What is AMD doing wrong?
huhn is offline   Reply With Quote
Old 14th September 2018, 17:45   #52447  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 423
Quote:
Originally Posted by Asmodian View Post
Of course they do.

What are you doing AMD?
I agree with huhn: what is wrong?

The driver displays RGB when you force LAV to output RGB.
So what's the matter?
Klaus1189 is offline   Reply With Quote
Old 14th September 2018, 18:07   #52448  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,278
I am simply mildly annoyed that they call RGB "RGB 4:4:4", as if there could be something like "RGB 4:2:2". Describing the chroma sampling pattern for RGB formats makes no sense.

But nothing is wrong with the image, and it does have full resolution "chroma".
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 14th September 2018, 18:10   #52449  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,085
AMD has always done this.

And yes, you can subsample RGB; it simply doesn't make any sense.
huhn is offline   Reply With Quote
Old 14th September 2018, 18:49   #52450  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,450
Quote:
Originally Posted by huhn View Post
and yes you can subsample RGB it simply doesn't make any sense.
Not with HDMI you can't.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 14th September 2018, 19:15   #52451  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
In relation to the new Nvidia RTX graphics cards and the video decoding they support, I read the following, for those interested (from Guru3D)...

The video processor also has had a bit of an update and offers an improved video and hardware en/decoder. HEVC 8K30 HDR real-time sees 25% bitrate savings. H.264 up to 15% bitrate savings. Turing GPUs can drive two 8K displays at 60 Hz with one cable for each display. Turing’s new display engine supports HDR processing natively in the display pipeline. Tone mapping has also been added to the HDR pipeline. Tone mapping is a technique used to approximate the look of high dynamic range images on standard dynamic range displays. Turing supports the tone mapping formula defined by the ITU-R Recommendation BT.2100 standard to avoid color shift on different HDR displays. Turing GPUs also ship with an enhanced NVENC encoder unit that adds support for H.265 (HEVC) 8K encode at 30 fps. The new NVENC encoder provides up to 25% bitrate savings for HEVC and up to 15% bitrate savings for H.264. Turing’s new NVDEC decoder has also been updated to support decoding of HEVC YUV444 10/12b HDR at 30 fps, H.264 8K, and VP9 10/12b HDR.
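The tone-mapping formula mentioned above starts from the PQ transfer function defined in BT.2100 (ST 2084). A small illustrative sketch of the PQ EOTF, which maps the encoded signal to absolute luminance (the constants are the published ST 2084 values; this is not NVIDIA's actual pipeline code):

```python
def pq_eotf(e):
    """ST 2084 / BT.2100 PQ EOTF: map a normalized signal value in
    [0, 1] to absolute luminance in cd/m^2 (nits)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = e ** (1 / m2)
    # invert the PQ OETF: peak of the curve is 10000 nits
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))            # 10000.0 nits (PQ peak)
print(round(pq_eotf(0.5), 1))  # ~92.2 nits -- the curve is very steep at the top
```

A tone mapper then compresses luminance values above the display's actual peak down into its range, which is why a standard-defined curve helps avoid color/brightness shifts between different HDR displays.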
Razoola is offline   Reply With Quote
Old 14th September 2018, 20:15   #52452  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,085
Quote:
Originally Posted by nevcairiel View Post
Not with HDMI you can't.
why would anyone sane add this?
huhn is offline   Reply With Quote
Old 15th September 2018, 01:52   #52453  |  Link
YGPMOLE
Registered User
 
Join Date: Nov 2012
Posts: 25
Hi guys!

I didn't want to start a flame war, just to understand and learn a little more from those who really know more than I do!

That's my situation now: after checking every option in LAV, the output is NV12 for 8-bit SD and HD (MPEG-2 DVDs and H.264 .mkv) and P010 for 10-bit HEVC UHD; with the same settings as before, madVR produces dropped frames. I suppose this is because the colorspace conversion and chroma upsampling are now on its shoulders, and I have to re-optimize.

If I simply enable SVP to do frame rate conversion for SD and HD, LAV automatically changes its output to YV12: I suppose to "talk" with ffdshow's input, am I right?

In this situation, ffdshow outputs NV12: wouldn't it be better to force its output to YV12 (to avoid the YV12 -> NV12 conversion), and let madVR do the YV12 -> RGB conversion directly?
__________________
Best Regards! Leo!
YGPMOLE is offline   Reply With Quote
Old 15th September 2018, 02:18   #52454  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,085
Quote:
That's my situation now: after checking every option in LAV, the output is NV12 for 8-bit SD and HD (MPEG-2 DVDs and H.264 .mkv) and P010 for 10-bit HEVC UHD; with the same settings as before, madVR produces dropped frames. I suppose this is because the colorspace conversion and chroma upsampling are now on its shoulders, and I have to re-optimize.
Yes, chroma scaling costs a little bit.

Quote:
If I simply enable SVP to do frame rate conversion for SD and HD, LAV automatically changes its output to YV12: I suppose to "talk" with ffdshow's input, am I right?
NV12 and YV12 are pretty much the same thing; a conversion between them is very fast and lossless.
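The two formats carry identical samples and differ only in byte layout: NV12 stores one luma plane plus a single interleaved UV plane, while YV12 stores three separate planes with V before U. A minimal NumPy sketch of the conversion (hypothetical toy frame; no resampling happens, which is why it is lossless):

```python
import numpy as np

def nv12_to_yv12(y, uv):
    """Split NV12's interleaved UV plane into YV12's separate planes.
    Pure reshuffling of bytes -- no resampling, so it is lossless."""
    u = uv[:, 0::2]  # even columns hold U
    v = uv[:, 1::2]  # odd columns hold V
    return y, v, u   # YV12 stores the V plane before the U plane

# hypothetical frame: 4x4 luma, 2x4 interleaved 4:2:0 chroma
y = np.zeros((4, 4), dtype=np.uint8)
uv = np.array([[1, 2, 3, 4], [5, 6, 7, 8]], dtype=np.uint8)
_, v, u = nv12_to_yv12(y, uv)
print(u.tolist())  # [[1, 3], [5, 7]]
print(v.tolist())  # [[2, 4], [6, 8]]
```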

Quote:
In this situation, ffdshow outputs NV12: wouldn't it be better to force its output to YV12 (to avoid the YV12 -> NV12 conversion), and let madVR do the YV12 -> RGB conversion directly?
madVR can convert pretty much everything that LAV Filters outputs directly to RGB.
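For reference, the YCbCr-to-RGB step itself is just a fixed matrix. A sketch of the 8-bit limited-range BT.709 conversion for a single sample (illustrative only; madVR performs this in high precision on the GPU, and the rounded coefficients below come from the standard matrix):

```python
def ycbcr709_to_rgb(y, cb, cr):
    """Convert one 8-bit limited-range BT.709 YCbCr sample to 8-bit RGB.
    16..235 luma maps to 0..255; chroma is centered on 128."""
    yn, cbn, crn = y - 16, cb - 128, cr - 128
    r = 1.164 * yn + 1.793 * crn
    g = 1.164 * yn - 0.213 * cbn - 0.533 * crn
    b = 1.164 * yn + 2.112 * cbn
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)

# reference white (Y=235, Cb=Cr=128) maps to (255, 255, 255)
print(ycbcr709_to_rgb(235, 128, 128))
# reference black (Y=16, Cb=Cr=128) maps to (0, 0, 0)
print(ycbcr709_to_rgb(16, 128, 128))
```

Which is why it makes no difference to madVR whether it receives YV12, NV12, or P010: it converts any of them to RGB in one step.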
huhn is offline   Reply With Quote
Old 15th September 2018, 19:22   #52455  |  Link
kostik
Registered User
 
Join Date: Jul 2007
Posts: 118
Quote:
Originally Posted by Razoola View Post
In relation to the new Nvidia RTX graphic cards and the video decoding they support I read the following for those intrsted (from guru3d)...

The video processor also has had a bit of an update and offers an improved video and hardware en/decoder. HEVC 8K30 HDR real-time sees 25% bitrate savings. H.264 up to 15% bitrate savings. Turing GPUs can drive two 8K displays at 60 Hz with one cable for each display. Turing’s new display engine supports HDR processing natively in the display pipeline. Tone mapping has also been added to the HDR pipeline. Tone mapping is a technique used to approximate the look of high dynamic range images on standard dynamic range displays. Turing supports the tone mapping formula defined by the ITU-R Recommendation BT.2100 standard to avoid color shift on different HDR displays. Turing GPUs also ship with an enhanced NVENC encoder unit that adds support for H.265 (HEVC) 8K encode at 30 fps. The new NVENC encoder provides up to 25% bitrate savings for HEVC and up to 15% bitrate savings for H.264. Turing’s new NVDEC decoder has also been updated to support decoding of HEVC YUV444 10/12b HDR at 30 fps, H.264 8K, and VP9 10/12b HDR.
Will the new GPUs output RGB 10/12-bit@60Hz via HDMI? Or is it still limited to RGB 8-bit@60Hz / 12-bit@23-30Hz, since the HDMI connector is the same 'old' HDMI 2.0b?
kostik is offline   Reply With Quote
Old 15th September 2018, 19:47   #52456  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,085
On paper they have HDMI 2.0, so no.
But the fact that they can send 8K 60Hz means they may be able to patch it in.
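The limit is simple bandwidth arithmetic. A back-of-the-envelope check (a sketch: 18 Gbit/s is the HDMI 2.0 TMDS rate, and 8b/10b coding leaves about 14.4 Gbit/s for pixel data; 4400x2250 is the common CEA-861 total timing for 4K60 including blanking):

```python
def fits_hdmi20(h_total, v_total, refresh_hz, bits_per_component):
    """Check whether an RGB signal fits in HDMI 2.0's ~14.4 Gbit/s
    of usable pixel bandwidth. Timing totals include blanking."""
    pixel_clock = h_total * v_total * refresh_hz      # pixels/s
    data_rate = pixel_clock * 3 * bits_per_component  # bit/s, 3 components
    return data_rate <= 14.4e9, data_rate / 1e9

# 4K60 RGB at different bit depths
for depth in (8, 10, 12):
    ok, gbps = fits_hdmi20(4400, 2250, 60, depth)
    print(f"{depth}-bit: {gbps:.2f} Gbit/s -> {'fits' if ok else 'too much'}")
```

8-bit 4K60 RGB comes out at about 14.26 Gbit/s, just under the limit; 10-bit and 12-bit exceed it, which matches the 8-bit@60Hz / 12-bit@30Hz restriction discussed above.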
huhn is offline   Reply With Quote
Old 15th September 2018, 20:10   #52457  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 322
They can't claim they're HDMI 2.1 yet, even if the hardware has been designed to be, because they have to wait for the official certification test spec: https://www.hdmi.org/manufacturer/hdmi_2_1/ (section 'Testing and Certification FAQs')
Samsung just announced an 8K TV, but when asked if it had HDMI 2.1 inputs, they said they had to wait for the certification to say anything:
https://www.cnet.com/reviews/samsung-85q900fn-preview/ (search page for HDMI 2.1)

So if you absolutely want 4K 60 Hz 10/12-bit RGB, the best option is to specifically ask NVIDIA whether the cards support that, not whether they will be HDMI 2.1.
__________________
HTPC: W10 1803, E7400, NVIDIA 1050 Ti, DVB-C, Panasonic GT60 | Desktop: W10 1803, 4690K, AMD 7870, Dell U2713HM | Laptop: Insider Slow, i5-2520M | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR
el Filou is offline   Reply With Quote
Old 15th September 2018, 20:16   #52458  |  Link
kostik
Registered User
 
Join Date: Jul 2007
Posts: 118
Quote:
Originally Posted by el Filou View Post
They can't claim they're HDMI 2.1 yet even if the hardware has been designed to be, because they have to wait for the official certification test spec: https://www.hdmi.org/manufacturer/hdmi_2_1/ (section 'Testing and Certification FAQs')
Samsung just announced a 8K TV but when asked if it had HDMI 2.1 inputs they said they had to wait for the certification to say anything:
https://www.cnet.com/reviews/samsung-85q900fn-preview/ (search page for HDMI 2.1)

So if you absolutely want 4K 60 Hz 10/12-bit RGB, the best is to specifically ask NVIDIA if the cards support that, but not ask them if the cards will be HDMI 2.1.
The new GPUs have HDMI 2.0b ports, so they don't claim to have HDMI 2.1. I guess it is possible to get 4K 10/12-bit@60Hz using DisplayPort. Or do you mean that the new GPUs actually have HDMI 2.1 ports, but HDMI 2.0b is mentioned in the specs because they are waiting for official certification?

Last edited by kostik; 15th September 2018 at 20:18.
kostik is offline   Reply With Quote
Old 15th September 2018, 20:30   #52459  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,085
GPUs just have HDMI ports, end of story. What type of signal they can send depends on their display engine. If the display engine can produce a signal that is HDMI 2.1 compliant, they can update the driver to do that.

That's why all Kepler cards can do HDMI 2.0 at HDMI 1.4 bandwidth. The a/b/c suffix was always meaningless for GPUs; the display engine matters, and HDCP, which can't be updated.
huhn is offline   Reply With Quote
Old 15th September 2018, 20:34   #52460  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,450
Quote:
Originally Posted by huhn View Post
That's why all Kepler cards can do HDMI 2.0 at HDMI 1.4 bandwidth
Adding features like that is just software, though; what we're really after with HDMI 2.1 is the increased bandwidth. Now, they may have planned ahead for that, but it's a different situation entirely, since it needs stronger hardware to handle the increased bandwidth.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote