Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
24th January 2018, 19:33 | #48622 | Link |
Registered User
Join Date: Dec 2016
Posts: 212
|
Madshi, as per the discussion above, just out of curiosity: is there a signal path with madVR that would allow avoiding the upconversion of 4:2:0 from the file to RGB buffers for processing, i.e. keeping everything in component colour space all the way to the display?
|
24th January 2018, 19:38 | #48623 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Not on a PC; it's internally always RGB.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
25th January 2018, 00:13 | #48628 | Link |
Registered User
Join Date: Aug 2016
Posts: 1,348
|
Another issue that's been discussed is 4:4:4 full RGB 8-bit vs 4:2:0 10-bit. Running my desktop in 4:4:4 full RGB I lose true 10-bit. Most, if not all, of the HDR movies available now are 10-bit, and my signal path is fully 10-bit compatible, so it seems silly not to use it.
The problems arise with the chroma subsampling conversions when playing movies on your PC, and this is where I start to get lost down the rabbit hole. If the content is already 4:2:0 it gets converted to 4:4:4 before being output to your TV; I get that. If you then output the 4:4:4 image to your TV, no further conversion is required, but if your desktop is set to 4:2:0 it has to be converted back to that output format before feeding your monitor or TV; that also makes sense. So there is an extra chroma conversion, and this is where the loss of quality happens, correct? A couple of questions then:
1. To what extent does this second conversion actually make a difference (I can't see any) if the image started off as 4:2:0, as all Blu-rays do? Are we assuming it makes a difference, or does anyone have visual proof? I would certainly agree that if you fed the GPU an image in 4:4:4 colour space which was then converted to 4:2:0 before being output to your TV there would be quality loss, that's obvious, but if the original movie was 4:2:0 to start with, is anything really changing in that second conversion?
2. Playing a 10-bit HDR movie in 8-bit 4:4:4 is not going to look as good as the same movie played in 10-bit 4:2:0; 8-bit dithered gradients can't look as good as full 10-bit gradients, though they will be close. However, if we want to play movies in 10-bit we can't use 4:4:4 full RGB at the moment, as HDMI 2.0 bandwidth doesn't support it. So the question would be: which is better, losing full 10-bit, or accepting an extra chroma conversion to preserve 10-bit colour? This issue has only arisen recently, now that we're starting to see 4K HDR content to play; before that there were no bandwidth restrictions, we could set our PCs to 4:4:4 10-bit 1080p and there was no need for a conversion back to 4:2:0. I'd be interested to know if there are any objective answers to this. |
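The 8-bit-dithered vs true 10-bit trade-off can be sketched numerically. This is only a toy model (a 1-D ramp plus simple TPDF dither, not madVR's actual error-diffusion options): it just illustrates that dithering trades visible banding for a noise floor that true 10-bit doesn't have.

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth luminance ramp, represented as real values in [0, 1].
ramp = np.linspace(0.0, 1.0, 3840)

def quantize(x, bits, dither=False):
    levels = (1 << bits) - 1
    y = x * levels
    if dither:
        # TPDF dither: add triangular noise before rounding so the
        # quantization error decorrelates from the signal (no banding).
        y = y + rng.uniform(-0.5, 0.5, y.shape) + rng.uniform(-0.5, 0.5, y.shape)
    return np.round(y) / levels

err_10bit  = np.abs(quantize(ramp, 10) - ramp).mean()
err_8bit   = np.abs(quantize(ramp, 8) - ramp).mean()
err_8bit_d = np.abs(quantize(ramp, 8, dither=True) - ramp).mean()

print(f"mean abs error, 10-bit:         {err_10bit:.2e}")
print(f"mean abs error, 8-bit:          {err_8bit:.2e}")
print(f"mean abs error, 8-bit dithered: {err_8bit_d:.2e}")
```

The dithered 8-bit output has the largest average error of the three, but the error is noise-like rather than banded, which is why it can look close to 10-bit in practice.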
25th January 2018, 00:49 | #48630 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
I have an objective answer. You misunderstand one important criterion: HDMI 2.0 doesn't support RGB full 4:4:4 at high bit depth above 30Hz, but it does support it at or below 30Hz. Titles are 23.976Hz, aka 23/24Hz. Therefore it is supported, and that's how many of us use it (as intended). This means we have no judder, and we are applying 10-bit in video mode (actually 12-bit dithered) from an HDMI 2.0 port. We also don't want to run the Windows desktop at 30Hz or we get unusable lag, so we force 12-bit for video and the default 8-bit for the desktop (automatically). FWIW, 8-bit can look and perform better than 10-bit at times. There are those who feel that mismatching the frame rate of the original source, bumping it to 60Hz, and then using an algorithm to smooth out the mismatch as much as possible is a better idea. I really don't understand what the higher refresh rate offers them for what they give up, but that's another discussion.
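The bandwidth arithmetic behind the 30Hz cutoff can be sketched roughly like this. The 18 Gbps limit and the 8b/10b TMDS overhead are per the HDMI 2.0 spec; the flat 10% blanking factor is a crude assumption standing in for the real CTA-861 timings, so the exact numbers are only indicative.

```python
# Rough HDMI 2.0 link-rate check (a sketch; real blanking intervals
# depend on the exact CTA-861 timing and differ from the flat 10% used here).
HDMI20_MAX_GBPS = 18.0       # total TMDS bandwidth over three lanes
TMDS_OVERHEAD = 10 / 8       # 8b/10b encoding

def link_gbps(h, v, hz, bits_per_channel, channels=3, blanking=1.10):
    pixels_per_sec = h * v * hz * blanking
    return pixels_per_sec * channels * bits_per_channel * TMDS_OVERHEAD / 1e9

uhd_24_12bit = link_gbps(3840, 2160, 23.976, 12)  # RGB 4:4:4, 12-bit, 23Hz
uhd_60_12bit = link_gbps(3840, 2160, 60.0, 12)    # RGB 4:4:4, 12-bit, 60Hz
uhd_60_8bit  = link_gbps(3840, 2160, 60.0, 8)     # RGB 4:4:4, 8-bit, 60Hz

for name, gbps in [("2160p23 RGB 12-bit", uhd_24_12bit),
                   ("2160p60 RGB 12-bit", uhd_60_12bit),
                   ("2160p60 RGB  8-bit", uhd_60_8bit)]:
    print(f"{name}: {gbps:5.1f} Gbps -> fits: {gbps <= HDMI20_MAX_GBPS}")
```

Even with the crude blanking estimate, 2160p23 RGB 12-bit fits comfortably inside 18 Gbps while 2160p60 RGB 12-bit does not, which matches the "supported at or below 30Hz" point above.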
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W |
25th January 2018, 01:00 | #48632 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
Agree. Some Samsung TVs, and most, if not all, other TV manufacturers as well. As you taught me in the past, any 60Hz TV is screwed, and by all rights should be. Those that are truly 120Hz are very, very workable with no problems, such as mine (Samsung). I should not have written "for some unknown reason"; to be honest, I simply forgot that it's a problem for others, which I take for granted. The more I think about it, 60Hz plus a smooth-motion algorithm is really a good option. Now I understand why madshi put it in madVR; I'm sure 60Hz display users really appreciate it too. Using the telecine option in nVidia would be a good idea in this case too, I suppose.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W Last edited by brazen1; 25th January 2018 at 01:28. |
25th January 2018, 02:19 | #48633 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,922
|
If you add that I'm not aware of a single Samsung TV that can do 4:4:4 at 23p whatsoever, there are even more reasons, if less important ones.
25th January 2018, 02:57 | #48634 | Link | |
Registered User
Join Date: Jul 2010
Posts: 2
|
Nvidia GTX 970. It is random: plays well for 30/40/45 minutes, then the PC restarts. I also tried resetting all madVR settings to defaults; it still happens. Scary. For now I've gone back to EVR (custom presenter) and no restarts are happening; I'll try again with the next madVR version. [] Will try again, and look at the Windows logs. [] Update: NO more reboots of Windows 10 with MPC-BE, LAV 0.70.2.88 and madVR 0.92.10. Tested with hours of video. Nice! Last edited by qazokm123; 27th January 2018 at 07:31.
|
25th January 2018, 05:44 | #48636 | Link | |
Registered User
Join Date: Aug 2005
Posts: 54
|
You keep asking about these kinds of nuances over and over, and keep stating you don't see any difference. I don't see why you are hung up on it; if you don't see any difference, be happy. If you want to understand why it makes a difference: you are upscaling chroma to full resolution (most likely) using madVR's high-quality scalers (to high-bit-depth RGB [not just 4:4:4, there's a difference]). Downscaling that again to 4:2:0 not only undoes the high-quality chroma upscaling (and possibly any refinements you have selected) but can also damage the chroma image further, due to the potentially poor or substandard chroma downscalers used by the GPU/drivers. There's a debug/test image on the first page of this thread if you want to check what is actually ending up on your screen. I guess the only "somewhat new" twist on the scenario is the addition of 10-bit. You make the incorrect assertion that "playing a 10 bit HDR movie in 8 bit 444 is not going to look as good as a movie played in 10 bit 420". It all depends on every piece in the pipeline, how they interact, and the perception of the user: some users are more distracted by noise (8-bit from 10-bit), some are more distracted by blurry chroma (4:2:0). There's also no point sending 10-bit if something in your chain is downconverting it to 8-bit.
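The up-then-down round trip being lossy can be shown with a toy 1-D model. The two filters below are stand-ins, not madVR's or any driver's actual scalers; the point is only that when the two legs use different (and possibly crude) filters, 4:2:0 -> 4:4:4 -> 4:2:0 does not return the original chroma samples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical half-resolution chroma row (the 4:2:0 samples), values in [0, 1].
chroma_half = rng.uniform(0.0, 1.0, 960)

def upsample_linear(x):
    # Stand-in for a renderer's chroma upscaler: linear interpolation to 2x.
    pos = np.arange(2 * len(x)) / 2.0
    return np.interp(pos, np.arange(len(x)), x)

def downsample_box(x):
    # Stand-in for a crude driver-side downscaler: average adjacent pairs.
    return x.reshape(-1, 2).mean(axis=1)

# 4:2:0 -> 4:4:4 -> 4:2:0, with mismatched filters on each leg.
roundtrip = downsample_box(upsample_linear(chroma_half))
err = np.abs(roundtrip - chroma_half).max()
print(f"max round-trip chroma error: {err:.3f}")  # nonzero: the extra conversion is lossy
```

With a sharper upscaler and an even cruder downscaler (as the GPU/driver path may use), the divergence would only grow, which is the quality loss being discussed.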
|
25th January 2018, 08:41 | #48637 | Link | |
Registered User
Join Date: Jun 2017
Posts: 155
|
I guess some TVs don't have real 10-bit panels even if stated in the specs. So my suggestion would be to just test it and decide for yourself. |
|
25th January 2018, 11:07 | #48638 | Link | |
Registered User
Join Date: Jan 2016
Posts: 5
|
I'm not sure everything is fully worked out at the video driver + OS level either. For example, I see pronounced banding/macroblocking (LG E6 HDR tone-map-induced clipping?) with nVidia video settings at RGB 12-bit Full when playing HDR video fullscreen windowed in certain scenes; the easiest repro for me is Blade Runner 2049 at ~36:28, in the white background left of Gaff's head. But I played around, and what I found is (all 23Hz 2160p; NV HDR = where madVR auto-switches; OS HDR = I hit the toggle switch in Windows myself before starting the video): 1. Fullscreen windowed, NV HDR, RGB 12-bit Full = issue. 2. Fullscreen exclusive, NV HDR, RGB 12-bit Full = no issue. 3. Fullscreen windowed, OS HDR, RGB 12-bit Full = no issue. 4. Fullscreen windowed, NV HDR, YCbCr 12-bit 4:4:4 (Limited) = no issue. A while back (maybe almost a year ago) I found that nVidia YCbCr 4:4:4 and 4:2:2 had extreme colour shift in the secondaries with my HTPC setup: the RGB primaries were within 0.5 dE, but cyan/magenta/yellow were way off. In RGB mode the colours were near perfect when calibrated. I haven't checked lately, since there have been plenty of driver, OS, firmware, etc. updates, but there are just so many variables I didn't want to spend the time; hence I run the HTPC in RGB.
|
25th January 2018, 11:14 | #48639 | Link |
Registered User
Join Date: Aug 2016
Posts: 1,348
|
Well, this is the problem, isn't it: everyone has different setups and pipelines, so it's very difficult to know which setup is best, and yet people here seem very sure my 10-bit setup will be worse than if I switch to 8-bit. I think it's clear that nobody knows this for sure.
I moved back to 4:4:4 last night; I'm not seeing any dropouts or black screens yet, so I'll see how it goes.