Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 23rd November 2019, 22:19   #57861  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,764
Quote:
Originally Posted by tyguy View Post
I actually don’t have my Apple set in default mode. It’s hooked into my receiver which I have labeled “home theatre” in the input section.
Yes, Home Theatre is the same as the default, it subsamples the chroma.

Quote:
Originally Posted by tyguy View Post
If Windows is converting ycbcr to rgb and madvr only outputs rgb....Why not just use rgb 8 bit limited instead of ycbcr 422?
Windows is not converting YCbCr to RGB.

The GPU driver is converting the RGB from both Windows and madVR to YCbCr. Do NOT use YCbCr 422! That would be subsampling the chroma in the GPU driver instead of the TV.

If you are asking why the TV uses YCbCr 422 internally, it is because TV manufacturers are too cheap to use decent video processing chips. YCbCr 422 only takes 2/3 of the bandwidth of YCbCr 444, so they do not need hardware that is as capable.
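The 2/3 figure follows directly from counting samples; a quick sketch in Python, just to illustrate the arithmetic:

```python
# Samples carried per 2x2 block of pixels for each chroma format.
# 4:4:4 keeps a full Cb/Cr pair for every pixel; 4:2:2 halves the chroma
# horizontally; 4:2:0 halves it in both directions. Luma (Y) is never
# subsampled.
def samples_per_2x2_block(fmt):
    luma = 4
    chroma_pairs = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}[fmt]
    return luma + 2 * chroma_pairs  # one Cb and one Cr per pair

ratio_422 = samples_per_2x2_block("4:2:2") / samples_per_2x2_block("4:4:4")
ratio_420 = samples_per_2x2_block("4:2:0") / samples_per_2x2_block("4:4:4")
print(ratio_422)  # 0.666... -> the 2/3 bandwidth figure
print(ratio_420)  # 0.5
```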

In the Nvidia drivers I cannot set limited range RGB anymore, so I did not test it (limited range RGB isn't really a standard either). It is better to treat YCbCr as always limited range and RGB as always full range. The TV expects limited range YCbCr input because that is what an Apple TV, Blu-ray players, etc. will send it. LG got that working reasonably well, but they seem not to care about the quality of full range RGB input.
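For reference, the limited/full range distinction is just a scale and offset on the 8-bit codes; a minimal sketch (function names are mine for illustration, not any driver API):

```python
# Full range RGB uses codes 0-255; limited ("video") range maps black to
# 16 and white to 235. The conversion is a scale and offset; the round()
# is where quantization error (and banding, if undithered) creeps in.
def full_to_limited(code):
    return round(16 + code * (235 - 16) / 255)

def limited_to_full(code):
    return round((code - 16) * 255 / (235 - 16))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
```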

Quote:
Originally Posted by DMU View Post
@Asmodian
And the output at 60Hz@RGB@PCMode does not solve the issue's on LG OLED?
The refresh rate does not change the banding with RGB input in PC/Game mode in any way, but it does solve the judder with any non-60 Hz input while in PC mode. I always use 60 Hz with smooth motion now. Once in PC mode (so we get full resolution color) these TVs are pretty finicky, and they only seem to handle 8 bit YCbCr 444 limited range 60 Hz input well.
__________________
madVR options explained
Old 23rd November 2019, 22:31   #57862  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,764
Quote:
Originally Posted by huhn View Post
the problem is someone has to write the code for this, and how do you even tell the GPU to present this frame 0.5 ms later?
Yes, of course it is work for madshi that does not improve quality in any ideal scenario; that is the best reason for it not to exist.

But it isn't that hard to decide what to do (coding it is another issue): simply wait 0.5 ms and then present the frame. I probably could not tell if a frame is 0.5 ms early or late anyway, and the even smaller inaccuracies due to Windows scheduling matter even less. As long as the player presents frames close to when it should, audio sync is a non-issue. Whether this would be better than smooth motion at 60 Hz I don't know; I am pretty happy with smooth motion already, so I am not sure it is worth the effort. However, as it is now I need to manually turn off VRR when I switch from gaming to watching video.
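The "wait and then present" idea can be sketched in a few lines. This is a toy illustration, with a hypothetical `present()` callback standing in for a real swap-chain present call, not anything madVR actually does:

```python
import time

def present_at(target_time, present):
    # Coarse sleep to within ~5 ms of the deadline (tolerant of OS
    # scheduler slop), then spin briefly for sub-millisecond accuracy.
    remaining = target_time - time.perf_counter()
    if remaining > 0.005:
        time.sleep(remaining - 0.005)
    while time.perf_counter() < target_time:
        pass
    present()

# Pace three "frames" 1/24 s apart and measure how far off each lands.
stamps = []
t0 = time.perf_counter()
for i in range(3):
    present_at(t0 + i / 24, lambda: stamps.append(time.perf_counter()))
errors_ms = [abs((stamps[i] - t0) - i / 24) * 1000 for i in range(3)]
print(errors_ms)  # typically a small fraction of a millisecond when idle
```

The spin-wait guarantees the frame is never presented early; lateness is bounded by how far the coarse sleep can overshoot.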

Quote:
Originally Posted by mclingo View Post
HDMI 1 selected, left it named HDMI 1
Nothing I said applies if you let the TV subsample chroma (e.g. do not use the PC HDMI setting).

Also, use the banding test patterns to judge banding; I do not see obvious banding with normal content except very rarely. I also don't know about models that old, as I have only tested the C7 and C9 myself.

HDR tone mapping is obviously worse quality on my display when in PC/Game mode; I really wish LG did not subsample chroma in their video processor. According to my calibration software the gamut is much smaller, which results in less saturated, washed out video, but the gamma is still reasonable. Not as good, but reasonable. I do switch to Home Theater when watching HDR (rare for me).
__________________
madVR options explained

Last edited by Asmodian; 23rd November 2019 at 22:52.
Old 23rd November 2019, 23:11   #57863  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 112
Quote:
Originally Posted by Asmodian View Post
The refresh rate does not change the banding with RGB input in PC/Game mode in anyway
On my Samsung I notice only 1 issue in PC mode: if the frequency of the HDMI input is not 29/30/59/60Hz, then the TV switches to 422 mode.
__________________
R3 2200G / Vega8 / Samsung UE40NU7100
Win10Pro 1909 / 4K RGB 60Hz / AMD19.12.2
MPC-HC 1.8.8 / madVR 0.92.17 / FSW / 10bit@60Hz
Old 23rd November 2019, 23:16   #57864  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
Quote:
Originally Posted by Asmodian View Post
Yes, of course it is work for madshi that does not improve the quality in any ideal scenarios. This is the best reason for it not to exist.
the point is: is it even possible?
if you want to do that using chrono, you have to write a new rendering path around it...
Quote:
However, as it is now I need to manually turn off VRR when I switch from gaming to watching video.
that's odd, this happens fully automatically on my system with both AMD and nvidia.

alternatively, use Manage 3D Settings to automate it.
Old 23rd November 2019, 23:44   #57865  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,764
Quote:
Originally Posted by huhn View Post
the point is: is it even possible?
Of course it is possible; all you need to do is not present the frame until you want it displayed, like any game with a frame rate limit does. It is true the timing wouldn't be that accurate, but we don't know that it would be bad enough to be annoying. A lot of people seem OK with frames being displayed +/- 8 ms off (3:2 judder), and it would likely be possible to get much better than that.
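The +/- 8 ms figure is easy to verify: with 3:2 pulldown, 24 fps frames are held for alternately three and two 60 Hz refreshes instead of the ideal 41.7 ms:

```python
refresh = 1000 / 60        # 16.67 ms per 60 Hz refresh
ideal = 1000 / 24          # 41.67 ms per 24 fps frame
long_frame = 3 * refresh   # 50.0 ms (frame held for 3 refreshes)
short_frame = 2 * refresh  # 33.3 ms (frame held for 2 refreshes)
print(long_frame - ideal)    # ~8.3 ms too long
print(ideal - short_frame)   # ~8.3 ms too short
```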

Quote:
Originally Posted by huhn View Post
that's odd this is happening fully automatic on my system with AMD and nvidia.
Fully automatic disable? How does it know to turn off?

Quote:
Originally Posted by huhn View Post
alternatively use the manage 3D settings to automate it.
I tested this a bunch in the past without success, but it does work now!

At least with 441.20 setting Zoom Player to a fixed refresh rate works perfectly, my player stays at a solid 60 Hz instead of drifting about. Thanks!

Quote:
Originally Posted by DMU View Post
On my Samsung I notice only 1 issue in PC mode: if the frequency of the HDMI input is not 29/30/59/60Hz, then the TV switches to 422 mode.
What?! Why?!? The way TVs handle various inputs is totally bizarre. I wonder how the engineers make decisions.
__________________
madVR options explained

Last edited by Asmodian; 24th November 2019 at 02:35.
Old 24th November 2019, 02:35   #57866  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
Quote:
Originally Posted by Asmodian View Post
Of course it is possible, all you need to do is not present it until you want it displayed, like any game with a frame rate limit would. This is why your timing wouldn't be that accurate but we don't know that it would be bad enough to be annoying. A lot of people seem OK with frames being displayed for +/- 16 ms (3:2 judder) and it would likely be possible to get it much better than that.
you say that like it's easy. games are designed differently, with bad design choices like physics being coupled to the frame rate,
hence the need for frame rate cappers, which in turn kind of do the same thing.

Quote:
Fully automatic disable? How does it know to turn off?
it's supposed to detect video playback and block VRR there; it looks like your player didn't make it onto that list.
i never even got FreeSync to trigger with MPC-HC.
disabling FreeSync for fixed frame rate games is planned, or even already part of the nvidia driver.
Quote:
What?! Why?!? The way TVs handle various inputs is totally bizarre. I wonder how the engineers make decisions.
that's an upgrade: in the past it was 60 Hz only, followed by the addition of 30 Hz. that was well known for samsung TVs.
only sony and philips are well known to support PC mode at all refresh rates without dumb things like forcing 3:2 judder, which is a panasonic classic.
Old 24th November 2019, 02:57   #57867  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,764
Quote:
Originally Posted by huhn View Post
you say that like it's easy. games are designed differently, with bad design choices like physics being coupled to the frame rate.
I only said that I think it would still be useful/interesting, not that it is necessarily worth it for madshi to implement. He has definitely said he is not interested in trying to do the presentation timing in madVR, but he might use a D3D API for it if one becomes available.

Quote:
Originally Posted by huhn View Post
disabling FreeSync for fixed frame rate games is planned, or even already part of the nvidia driver.
That makes sense; Zoom Player probably has a pretty small user base today. People don't like paying for a player when the open source alternatives are so good. I am just happy that assigning it manually works. Is that because of a newer driver, or software VRR instead of the G-Sync module? I didn't test it with the TV before, only on monitors with a module.

Quote:
Originally Posted by huhn View Post
only sony and philips are well known to support PC mode at all refresh rates without dumb things like forcing 3:2 judder, which is a panasonic classic.
It is interesting that Sony seems to generally do a good job at video processing, historically as well. I believe their engineers have a different viewpoint or something.
__________________
madVR options explained
Old 24th November 2019, 04:02   #57868  |  Link
tyguy
Registered User
 
 
Join Date: Oct 2019
Posts: 32
Quote:
Originally Posted by Asmodian View Post
Yes, Home Theatre is the same as the default, it subsamples the chroma.

Windows is not converting YCbCr to RGB.

The GPU driver is converting the RGB from both Windows and madVR to YCbCr. Do NOT use YCbCr 422! That would be subsampling the chroma in the GPU driver instead of the TV.
So if PC mode is full chroma 4:4:4, and everything but Game mode and PC mode is 4:2:2, then what pixel format is Game mode?

I just see tons of conflicting information. Like this here:

“Only use RGB 8-bit for everything on a PC, including HDR games and movies, even when connected to a HDR TV over HDMI. The GPU does dithering for 10-bit content and there will be no banding.

Almost everything you read on this topic is misinformation.”


https://www.google.com/amp/s/amp.red...12_bpclimited/

“A 10-bit signal to the display is only required when the source doesn't perform dithering (PS4 , Blu-ray player, etc.). If the PS4 did dithering, it could run RGB 8-bit 60 Hz instead of subsampling at YCbCr420 10-bit 60 Hz because there isn't enough bandwidth for RGB 10-bit 60 Hz over HDMI 2.0.

If the display is 8-bit + FRC, the 10-bit signal is dithered internally by the display anyway. A true 10-bit panel is pointless since the quantization noise on 8-bit + dithering is invisible.

On Windows, HDR apps render to a 10-bit surface and the GPU does dithering automatically if the signal is 8-bit. So you should just use 8-bit RGB for maximum quality.”
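The dithering claim in that last quoted paragraph can be illustrated numerically. This is a toy sketch using plain random noise, not the GPU driver's actual algorithm (real drivers typically use more sophisticated spatial/temporal dithering):

```python
import random

# Truncating a 10-bit code to 8 bits drops the low 2 bits, so values
# between 8-bit steps always land on the same step (banding). Adding
# 0..3 of noise before truncating makes the *average* output match the
# 10-bit level, trading banding for invisible noise.
def dither_10_to_8(code10, rng):
    return (code10 + rng.randrange(4)) >> 2

rng = random.Random(0)
code10 = 513  # a 10-bit level one quarter-step above 8-bit code 128
samples = [dither_10_to_8(code10, rng) for _ in range(100_000)]
avg = sum(samples) / len(samples)
print(513 >> 2)   # plain truncation: always 128
print(avg * 4)    # dithered average: very close to 513
```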
Old 24th November 2019, 05:19   #57869  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
Quote:
Originally Posted by tyguy View Post
So if PC mode is full chroma 4:4:4, and everything but Game mode and PC mode is 4:2:2, then what pixel format is Game mode?
a game mode should be 4:4:4 if you ask me, but that doesn't mean it is; it may just be low latency. sony doesn't even have a PC mode, they just have game mode or graphics mode; both are 4:4:4 with all the processing you want to ruin the image.

Quote:
I just see tons of conflicting information. Like this here:
Quote:
Limited. All movies and TV shows are transmitted/streamed in limited color range (including Blu-ray’s)
even this obvious flaw didn't get corrected. on reddit there are simply not enough people who understand this topic to run after those that blindly spread misinformation which is only wrong in this context.

sending limited or full range RGB should only depend on whether the TV accepts full range or limited range. if the end device can be set up to accept full range error free, full range is better, because that is what the windows desktop runs at and that's the output of the video renderer; otherwise limited range RGB in the GPU output would never be correct.

rendering 10 bit on a WFS surface with an nvidia GPU outputting 8 bit has a history of terrible banding. this was such a bad setup that i didn't recheck whether the issue is still present. but i guess this could affect other software than madVR.
edit: nvidia 441.08, win 1809: issue still present. will test with 1909 tomorrow. edit2: same issue.

Last edited by huhn; 24th November 2019 at 17:56.
Old 24th November 2019, 08:15   #57870  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,764
Quote:
Originally Posted by tyguy View Post
“Only use RGB 8-bit for everything on a PC, including HDR games and movies, even when connected to a HDR TV over HDMI. The GPU does dithering for 10-bit content and there will be no banding.
Is this on a recent LG OLED? I would agree with that advice unless using a display that has banding when sent full range RGB but doesn't when sent limited range YCbCr.

I would read that as saying 10 bit is unimportant, arguing against sending YCbCr 422 10 bit instead of 444 or RGB 8 bit, which some think sounds good because they know what 10 bit means but do not know what YCbCr 422 means.

The topic is super complicated once you bring the failings of individual displays into it. It is important to differentiate advice about a particular display from general advice.
__________________
madVR options explained
Old 24th November 2019, 17:12   #57871  |  Link
Stef2
Registered User
 
Join Date: Jan 2018
Posts: 2
Hi. Could someone with an RTX 2080 card (no matter the version) and a projector (no matter which one) do this quick test for me: compare the rendering time of your usual, everyday settings for 4K UHD video with the anamorphic stretch disabled vs. enabled (no need for an anamorphic lens in place). I would like to know how much of a jump the anamorphic stretch causes in the rendering time, everything else untouched.

Thank you!
Stef
Old 24th November 2019, 17:21   #57872  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
this depends on your up/downscale settings, so it ranges from close to nothing to impossible even for a 2080 Ti.
Old 24th November 2019, 23:05   #57873  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 422
Quote:
Originally Posted by GTPVHD View Post
I'd wait and see if anyone tests the GTX 1650 Super launching today; at US$159 it's cheaper than the GTX 1060's original US$249 MSRP, and Turing is certainly better than Pascal at compute workloads.
The complete silence from Nvidia on HDMI 2.1 is quite unnerving.
Old 25th November 2019, 00:18   #57874  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 414
Quote:
Originally Posted by huhn View Post
only sony and philips are well known to support PC mode at all refresh rates without dumb things like forcing 3:2 judder, which is a panasonic classic.

Cough.. also all Roku TVs support 23/24/30/60 w/ 4:4:4, RGB FULL.
__________________
Ghetto | 2500k 5Ghz
Old 25th November 2019, 00:20   #57875  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 414
Quote:
Originally Posted by ashlar42 View Post
The complete silence from Nvidia on HDMI 2.1 is quite unnerving.
For movies, it's not very useful.

The only uptick will be 4K120 and 8K60 for games. But that's really far away GPU-wise, so not having 2.1 even in the next generation isn't going to be critical.
__________________
Ghetto | 2500k 5Ghz
Old 25th November 2019, 02:11   #57876  |  Link
jkauff
Registered User
 
Join Date: Oct 2012
Location: Akron, OH
Posts: 436
Quote:
Originally Posted by tp4tissue View Post
Cough.. also all Roku TVs support 23/24/30/60 w/ 4:4:4, RGB FULL.
I have a TCL Series 6 Roku TV. It's an excellent display for madVR custom resolutions.
Old 25th November 2019, 09:37   #57877  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,860
Quote:
Originally Posted by ashlar42 View Post
The complete silence from Nvidia on HDMI 2.1 is quite unnerving.
Such changes don't typically happen on minor card releases; those are just minor variants of boards they made years ago.

If anything you'll need to wait for the 3000 series for that.

Quote:
Originally Posted by tp4tissue View Post

The only uptick will be 4K120 and 8K60 for games. Buhh that's really far away GPU wise, so having 2.1 even in next generation isn't going to be critical.
We already have cases today where games need to use 4:2:0 chroma or similar nonsense to bring their content to a screen, since not even 4K@60 10-bit HDR is possible right now. And higher refresh rates at 4K would finally give serious gamers an argument to consider it.

So the next generation either needs the new DisplayPort, or HDMI 2.1, or ideally both.
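The 4K@60 10-bit limit checks out with back-of-envelope TMDS math, assuming the standard CTA-861 4K60 timing (4400x2250 total pixels including blanking) and HDMI 2.0's 600 MHz TMDS clock ceiling:

```python
# HDMI 2.0 tops out at a 600 MHz TMDS clock. Standard 4K60 timing is
# 4400 x 2250 total pixels including blanking, i.e. a 594 MHz pixel
# clock at 8 bpc; deep color scales the TMDS clock proportionally.
MAX_TMDS_MHZ = 600
pixel_clock_mhz = 4400 * 2250 * 60 / 1e6   # 594.0
tmds_8bit = pixel_clock_mhz * 8 / 8        # 594.0 MHz -> just fits
tmds_10bit = pixel_clock_mhz * 10 / 8      # 742.5 MHz -> does not fit
print(tmds_8bit <= MAX_TMDS_MHZ)   # True: 4K60 RGB 8-bit works
print(tmds_10bit <= MAX_TMDS_MHZ)  # False: 4K60 RGB 10-bit does not
```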
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 25th November 2019 at 09:39.
Old 25th November 2019, 13:11   #57878  |  Link
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 573
What I'm afraid of is that, seeing how some display manufacturers right now don't even support the same processing quality when you go above 4:2:2, this is not going to get better soon when they have even more bandwidth to handle.
I'd certainly be happy if at some point we can just set RGB 10-bit for every refresh rate, but graphics card support is only half of the problem.
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
Old 25th November 2019, 13:48   #57879  |  Link
Grimsdyke
Registered User
 
Join Date: Nov 2013
Location: Hannover, Germany
Posts: 147
Just a quick question... I have only just noticed that madVR, although the player was closed, is still loaded in RAM. Is this the way it is designed to run? Is there a switch in the UI to change that? Thx.
Old 25th November 2019, 14:29   #57880  |  Link
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 573
Do you mean madHcCtrl.exe?
You can avoid that by clicking the tray icon => Edit Tray Icon Settings => "show tray icon when madVR is running" or "don't show tray icon" instead of "always show tray icon".
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti