Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 18th March 2019, 14:07   #55401  |  Link
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 546
Quote:
Originally Posted by tp4tissue View Post
So with PC mode, it's actually doing the raw input without internal deband, and thereby exhibiting the natural output ?
But you should not see banding with madVR if dithering is configured correctly, so "raw input" on the TV should not show it either. IMHO it's caused by multiple low-quality conversions inside the TV, because some of its image-processing pipeline cannot handle RGB. The fact that PC mode adds other issues (3:2 judder on 24p) is weird too.
If you want to check whether the LG is running a debanding algorithm on its inputs when not in PC mode, you can disable dithering in madVR and then compare PC and standard mode again.
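For anyone who wants to see why correct dithering hides quantization banding, here is a minimal Python sketch (illustrative only — madVR actually uses ordered dithering or error diffusion, not the plain random dither used here):

```python
import random

def requantize(v, out_levels=256, dither=False):
    """Map a normalized value v in [0, 1] to out_levels codes and back.

    Without dither, values falling between two codes snap to one of
    them, producing visible steps (banding). With random dither the
    error is spread into noise and the *average* level is preserved.
    """
    x = v * (out_levels - 1)
    if dither:
        x += random.uniform(-0.5, 0.5)  # simple RPDF dither, enough for a demo
    code = min(max(round(x), 0), out_levels - 1)
    return code / (out_levels - 1)

random.seed(0)
v = 100.5 / 255                        # a value exactly between two 8-bit codes
plain = requantize(v)                  # always snaps to the same wrong code
avg = sum(requantize(v, dither=True) for _ in range(10000)) / 10000
# plain is off by half a code; avg is very close to the true value
```

Without dither, every pixel of a flat area snaps to the same wrong code, so the error shows up as a visible band; with dither the same error becomes fine noise that averages out.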
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
Old 18th March 2019, 14:41   #55402  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 145
Quote:
Originally Posted by el Filou View Post
The fact that PC mode adds other issues (3:2 judder on 24p) is weird too.
Not really weird. Even in standard HDMI mode you need to enable "Real Cinema" to get proper 24p cadence, but that also adds input lag, and in PC mode this option is greyed out (permanently disabled).

Though I've successfully created a 48 Hz custom refresh rate (using 50 Hz as a base) as a workaround, and that way you can get proper cadence for movie content even in PC mode.
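For reference, the arithmetic behind that workaround is simple: a mode's refresh rate is the pixel clock divided by the total pixels per frame, so a 48 Hz mode can reuse the 50 Hz blanking totals with a lower clock. A sketch (the CEA-861 1080p50 timings below are the standard ones; your CRU values may differ):

```python
def refresh_hz(pixel_clock_hz, h_total, v_total):
    """Vertical refresh rate implied by a display mode's timings."""
    return pixel_clock_hz / (h_total * v_total)

def clock_for_refresh(target_hz, h_total, v_total):
    """Pixel clock needed to hit target_hz with unchanged totals."""
    return target_hz * h_total * v_total

# Standard CEA-861 1080p50 timings: 2640 x 1125 total pixels, 148.5 MHz clock
H_TOTAL, V_TOTAL = 2640, 1125
base = refresh_hz(148_500_000, H_TOTAL, V_TOTAL)       # 50.0 Hz
# Keep the 50 Hz totals and lower only the clock to get 48 Hz:
clock_48 = clock_for_refresh(48, H_TOTAL, V_TOTAL)     # 142,560,000 Hz
```

Entering the lowered pixel clock into CRU with the same totals gives a mode the TV should accept like the 50 Hz one — at least, that matches what's described above.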

Last edited by j82k; 18th March 2019 at 14:44.
Old 18th March 2019, 15:55   #55403  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,126
Quote:
Originally Posted by madjock View Post
Having a play with one of the latest test builds of HDR->SDR now that I have a graphics card that can handle it. I think it looks really nice, but I'm unsure if this is because my SDR is calibrated more for blacks and whites (using test patterns and nothing fancy). An HDR vs. HDR->SDR comparison is also hard to judge because I cannot really calibrate the HDR side, so the SDR looks brighter and punchier on an initial comparison.

Has any expert or more experienced person done a more in-depth analysis with an LED TV, and do you have any thoughts?
I have set-up an LED TV like this:

2.40 (madVR) -> 2.20 (display)

Because the image is actually getting brighter at the display, I increased the real display peak nits / lower limit in madVR to 225 nits to prevent the image from appearing washed out.
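The brightening is just the gamma round trip: content encoded for a 2.40 display but decoded at 2.20 comes out as v^(2.2/2.4). A quick illustration with pure power-law gamma (ignoring madVR's actual BT.1886 handling):

```python
def displayed(v, encode_gamma=2.4, display_gamma=2.2):
    """Relative luminance after encoding for one gamma and decoding with another."""
    return (v ** (1.0 / encode_gamma)) ** display_gamma   # = v ** (2.2 / 2.4)

# Midtones are lifted: 50% input luminance comes out around 53%,
# and the relative lift is even larger in the shadows.
mid = displayed(0.5)
shadow_lift = displayed(0.1) / 0.1
```

The relative lift grows toward the shadows, which is why the image reads as brighter/washed out without compensating via the peak-nits setting.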

Set like above, the difference in brightness between HDR and SDR is actually reasonably close when the dynamic target nits in the test builds is making the decision on the end display brightness as the movie plays. UHD Blu-ray actually looks better than SDR Blu-ray in most cases, with some exceptions where SDR Blu-ray is noticeably brighter.

For a fair comparison, use full-bitrate UHD rips and not any of the UHD torrents out there that suffer from visual loss of quality.
Old 18th March 2019, 17:19   #55404  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by j82k View Post
So I just measured a 50 point saturation sweep using 20% pattern stimulus on my LG C8 in PC HDMI input mode (4:4:4 chroma) vs standard HDMI input mode.
All picture enhancement settings were disabled. GPU was set to RGB Full 8-bit.



That pretty much confirms the banding I'm seeing when using pc-mode.
I didn't even bother to check other colors.

So they make the panel; they probably have to do a sweep/pre-calibration of some sort.

My point is that maybe that table gets disabled in PC mode as a trade-off for response time or something.

So you're getting a more raw version of what the panel should look like.

This may actually be a better mode to calibrate under, because then correction doesn't happen twice: once by the TV's chip, once by the GPU.

How do calibrated LUTs look in this mode? Still banding?
__________________
Ghetto | 2500k 5Ghz
Old 18th March 2019, 17:43   #55405  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 145
What I assume could be happening is that the TV's internal LUT works at lower precision in PC mode. If it is input-lag related, then the game preset, which has the same input lag as PC mode, should also suffer from increased banding. Gonna check that out later.

I did create a 3DLUT calibration in PC mode once, but it didn't turn out well, which isn't surprising. Correcting all of these banding errors (they are everywhere in the low-end range) would require an insane amount of corrections.
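That theory is easy to model: if the internal LUT stores its entries at lower precision in PC mode, many distinct input codes collapse onto the same output level, which reads as banding. A toy Python model (a hypothetical identity LUT, not LG's actual pipeline):

```python
def through_lut(code10, lut_bits):
    """Pass a 10-bit code through an identity LUT stored at lut_bits precision."""
    levels = (1 << lut_bits) - 1
    entry = round(code10 / 1023 * levels)      # LUT entry, quantized
    return round(entry / levels * 1023)        # back into the 10-bit pipeline

# A 12-bit LUT keeps all 1024 input codes distinct;
# an 8-bit LUT collapses them onto only 256 output levels -> banding.
out8  = {through_lut(c, 8)  for c in range(1024)}
out12 = {through_lut(c, 12) for c in range(1024)}
```

With the 8-bit table, runs of four adjacent input codes land on the same output level, which is exactly the staircase a gradient pattern makes visible.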
Old 18th March 2019, 20:53   #55406  |  Link
XMonarchY
Registered User
 
Join Date: Jan 2014
Posts: 489
Quote:
Originally Posted by jkauff View Post
Many of us have got around that problem by using the CRU utility to create custom resolutions, then using madVR to optimize them. CRU apparently adds the custom settings at the OS level rather than the GPU level, which protects them from driver-level changes.
Could you help me out with that? I could never figure out how to add, via CRU, the exact same custom resolution that madVR adds...
__________________
8700K @ 5Ghz | ASUS Z370 Hero X | Corsair 16GB @ 3200Mhz | RTX 2080 Ti @ 2100Mhz | Samsung 970 NVMe 250GB | WD Black 2TB | Corsair AX 850W | LG 32GK850G-B @ 165Hz | Xonar DGX | Windows 10 LTSC 1809
Old 19th March 2019, 00:21   #55407  |  Link
jkauff
Registered User
 
Join Date: Oct 2012
Location: Akron, OH
Posts: 434
Although you can enter any values you want into CRU, it's generally better to create a custom resolution with the default EDID settings in CRU, then use madVR to create the optimizations.
Old 19th March 2019, 10:31   #55408  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 1,503
Quote:
Originally Posted by XMonarchY View Post
Could you help me out with that? I could never figure out on how to add the same exact custom resolution that madVR adds via CRU...
https://forum.doom9.org/showthread.p...98#post1868998
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)
Old 19th March 2019, 10:53   #55409  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 50
Quote:
Originally Posted by j82k View Post
What I assume could be happening is that the TV's internal LUT works at lower precision in PC mode. If it is input-lag related, then the game preset, which has the same input lag as PC mode, should also suffer from increased banding. Gonna check that out later.

I did create a 3DLUT calibration in PC mode once, but it didn't turn out well, which isn't surprising. Correcting all of these banding errors (they are everywhere in the low-end range) would require an insane amount of corrections.
Game mode is specifically designed to reduce input lag for games. If you change the input label to "PC" without turning on game mode, you get a PC/desktop mode with improvements for text readability. I don't have a C8, but I have two LG TVs and I don't think LG has changed this logic.
If you're looking for an accurate picture mode for madVR and movies, try the ISF Expert mode first; that's what it's made for. Game mode is an alternative for people who want 4:4:4 chroma.

edit: Your input should always be labeled PC, otherwise you degrade Windows and application quality. In this mode the TV internally does fps-based source detection with different picture settings, so there's no reason to switch inputs unless you watch a lot of high-frame-rate movies.

Last edited by thighhighs; 19th March 2019 at 12:02.
Old 19th March 2019, 12:01   #55410  |  Link
TheProfosist
Registered User
 
 
Join Date: Aug 2009
Posts: 136
Anyone have a screenshot of what the madVR OSD shows when madVR is in fullscreen exclusive mode?

I ask because mine just says "fullscreen windowed". Also, there isn't the usual delay when going fullscreen, and manually triggering it via the keyboard shortcut doesn't seem to do anything either.
Old 19th March 2019, 12:25   #55411  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 145
@thighhighs
Wow, there is so much wrong with what you just said that I'm not even going to bother correcting your statements. It seems like you didn't even read (or understand) my previous posts...

edit: Also, the issues discussed were specific to watching movies via madVR on LG OLEDs, not about Windows usage.

Last edited by j82k; 19th March 2019 at 12:40.
Old 19th March 2019, 12:57   #55412  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 50
Quote:
Originally Posted by TheProfosist View Post
Anyone have a screenshot of what the madVR OSD shows when madVR is in fullscreen exclusive mode?

I ask because mine just says "fullscreen windowed". Also, there isn't the usual delay when going fullscreen, and manually triggering it via the keyboard shortcut doesn't seem to do anything either.
https://miku.hatsune.my/potplayer-wi...usive-mode.jpg Screenshot from Google. Try ticking/unticking some of the checkboxes in the general settings section. Or, if you're on Win10, just use windowed mode and don't worry about it.

Quote:
Originally Posted by j82k View Post
@thighhighs
Wow, there is so much wrong with what you just said and I'm not even gonna bother correcting your statements. It seems like you didn't even read (or understand) my previous posts....
When you watch movies you use game or desktop modes because of 4:4:4 chroma, and you reported a bad picture (banding). Am I missing something?

Quote:
also the issues discussed were specific to watching movies via madVR on LG Oleds and not about windows usage
Who cares about a proper connection...
Old 19th March 2019, 13:29   #55413  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 38
I would say that everything j82k has noted about the LGs is correct, as I've measured/noticed most of these issues myself. Unfortunately there is no catch-all solution to these problems. PC mode is required for any kind of SDR gaming and for general Windows usage, and since graphics cards (and thus madVR) operate internally in RGB 4:4:4, there is reason to use RGB 4:4:4; movies are not exempt here either if you use an HTPC. Game mode in SDR is not worth testing, as it provides no calibration options for white point or color, is forced into a wide color space, and does not support 4:4:4. While I can always see the slight banding in SDR on test patterns, I rarely if ever see it in games or movies unless it's in the source. HDMI mode smooths over the gradients, but doesn't always get rid of the banding either, and then you also get losses from 4:4:4 to 4:2:2. Every HDMI mode aside from Game also has very bad input lag, which can cause audio delays.

The one thing I haven't tested is 12-bit vs 8-bit in HDMI mode for HDR, as I'm hesitant to dither the 10-bit source, but I can't say I have noticed any banding in 12-bit while watching HDR movies. HDR Game Mode is required for HDR games and is very close to the Cinema/Technicolor tone mapping, without the input lag.
Old 19th March 2019, 13:50   #55414  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 145
Quote:
Originally Posted by thighhighs View Post


When watch movies you use game or desktop modes with 4:4:4 chroma, because 4:4:4 chroma. And reported about bad picture (banding). I miss something?
Yes what you missed is the whole point....
I don't use 4:4:4 modes because of the poor gradation/quantization in these modes.
I was just trying to make others aware of these issues.


Also I've never said that I use game mode. The only time I mentioned it was in relation to figuring out if these issues could be related to modes that offer the lowest possible input lag.

And then you jump in, making some false statements, even telling me which modes/picture presets I should use...

I don't even use my TV for windows tasks, this is just about video content played via madVR.
Old 19th March 2019, 14:37   #55415  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 145
Quote:
Originally Posted by blu3wh0 View Post
The one thing I haven't tested is 12-bit vs 8-bit in HDMI mode for HDR, as I'm hesitant to dither the 10-bit source, but I can't say I have noticed any banding in 12-bit while watching HDR movies. HDR Game Mode is required for HDR games and is very close to the Cinema/Technicolor tone mapping, without the input lag.
I haven't really watched HDR movies with 12-bit output, so I'm not sure how visible this is in content, but when checking the 25% saturation HDR color ramps from this pattern package, they look worse with NVIDIA 12-bit / madVR 10-bit compared to 8-bit. That is in standard HDMI mode.

https://www.avsforum.com/forum/139-d...terns-set.html
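For context on why bit depth matters so much here: with the SMPTE ST 2084 (PQ) EOTF, adjacent 8-bit codes around mid-signal are several nits apart, while adjacent 10-bit codes are under a nit apart, so an undithered (or badly dithered) 8-bit HDR ramp will band. A sketch of the standard EOTF:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """Convert a normalized PQ code value (0..1) to luminance in cd/m^2."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

# Luminance gap between adjacent codes around mid-signal (~92 nits):
step8  = pq_to_nits(128 / 255)  - pq_to_nits(127 / 255)   # several nits
step10 = pq_to_nits(512 / 1023) - pq_to_nits(511 / 1023)  # under a nit
```

Because PQ packs so much luminance range into so few codes, 8-bit HDR relies entirely on good dithering — which is also why a TV requantizing internally can undo madVR's careful dither.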
Old 19th March 2019, 15:17   #55416  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 50
Quote:
Originally Posted by j82k View Post
Yes what you missed is the whole point....
I don't use 4:4:4 modes because of the poor gradation/quantization in these modes.
I was just trying to make others aware of these issues.
Yes, I understand you care about the setup for yourself. And yes, here I'm wrong. I was somewhat confused, because you were talking about bad settings and issues, running tests and comparing different modes in custom scenarios, while ignoring some of the starting points from the user manual.
Hope there are no hard feelings.

Last edited by thighhighs; 19th March 2019 at 17:22.
Old 19th March 2019, 16:46   #55417  |  Link
svengun
Registered User
 
Join Date: Jan 2018
Location: Barcelona
Posts: 49
Maybe it is different for me as my LG OLED is from 2016, but I use Home Theater mode for Windows & the UHD player. If I put it in PC mode, everything looks overly white.

I have no problems with HT mode, HDR, madVR; everything works fine and looks great.

Of course, Game Mode for the Xbox / PS4...

Quote:
Originally Posted by blu3wh0 View Post
I would say that everything j82k has noted about the LGs is correct, as I've measured/noticed most of these issues myself. [...]
__________________
Livingroom: Ryzen 7 1700@3.9ghz - Win Insiders Fast Ring - MSI RTX 2700 Gaming - Philips 65OLED803 | Bedroom Ryzen 3 1200 - Win 8.1 - GTX1060 - LG OLED EG920V 55" > All with MadVR latest test build
Old 19th March 2019, 18:03   #55418  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 38
While I don't want to get much further into this, my statements were premised on whether the user cares about accuracy in white point, color, etc. relative to the source, and on which negatives of each option you can accept. This also assumes you are only using ISF (Cinema/Technicolor for HDR, Game Mode HDR for games) with the expected configuration/calibration. I can't speak to what looks good to someone else, as I don't know their settings, and sometimes these things might not matter much depending on content. However, Game Mode for SDR is absolutely garbage for the reasons listed, though it does have the low input lag needed. I would suggest you try getting PC mode configured properly, at least for games (setting your game system/PC to RGB 4:4:4 Full and matching the TV settings would get you started).

PC Mode ISF in SDR should look almost exactly the same as HT in ISF using the exact same settings (not counting 4:4:4 vs 4:2:2, banding, input lag, and 23 Hz judder). This means white point, brightness, and contrast should not be any different, so I would make sure Full/Limited is set correctly relative to the GPU settings.
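For anyone checking that last point, the levels mismatch is simple arithmetic: limited-range video puts black at code 16 and white at code 235, so a wrong setting either double-expands (crushed shadows, clipped highlights) or double-compresses (grey blacks, dim whites). A sketch of the 8-bit mappings:

```python
def limited_to_full(code):
    """Expand 8-bit limited-range (16-235) video levels to full range (0-255)."""
    return min(max(round((code - 16) * 255 / 219), 0), 255)

def full_to_limited(code):
    """Compress full-range levels into the 16-235 video range."""
    return round(code * 219 / 255) + 16

# If the TV expands a signal that is already full range, everything at or
# below code 16 is crushed to black and everything above 235 clips to white.
```

So a quick sanity check is a near-black pattern: if codes just above video black all render as the same black, one side of the chain is expanding a signal the other side never compressed.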

Edit: Sorry, I wrongly assumed you had a 2016 LG B6, I did not even notice the actual TV you have. Anything I said might not actually apply to you at all as I don't know anything about that range of models.

Last edited by blu3wh0; 19th March 2019 at 19:36.
Old 19th March 2019, 22:49   #55419  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 419
For all Kodi DSPlayer users: I'm trying to understand whether we could at least salvage madVR for continued use with updated Kodi builds (DSPlayer being stuck on 17.6 for the foreseeable future).

madshi had something to say about this three years ago, and I am trying to reopen the discussion: https://forum.kodi.tv/showthread.php...975#pid2835975
Old 20th March 2019, 00:19   #55420  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 145
I know I previously said all configurations would lead to increased color banding in PC mode.

But YCbCr 4:4:4 might actually be OK to use in PC mode on the LG C8.

At least it seems that way from a few comparison measurements I did, and also from looking at the 16-bit gradient pattern. I've only tested SDR 8-bit so far.
I know full-range RGB should theoretically provide the best results, but what can you do when the TV messes it up...

Measurements were done with the ISF Dark preset, all picture enhancements disabled, OLED light at 25 (100 nits), and 20% pattern intensity, as the errors are more visible in the lower luminance range.
The magenta sweep looks very bad in any configuration, but with full RGB in PC mode it's atrocious.
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
