Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 22nd November 2019, 17:02   #57841  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,977
madVR doesn't support it (it's complicated for video).
the VRR range is another problem. at a resolution with 120 Hz support it should be fine, but at UHD 60 it is utterly useless because it can't do LFC, making it very hard to get a video into the VRR range.
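To put numbers on "getting a video into the VRR range": LFC is just frame-repetition arithmetic, finding a repeat count that lands the source rate inside the display's VRR window. A rough sketch (hypothetical helper, nothing to do with madVR's or the drivers' actual code):

```python
# Rough sketch of Low Framerate Compensation (LFC): repeat each frame
# n times so the effective refresh rate lands inside the VRR window.

def lfc_multiplier(fps, vrr_min, vrr_max):
    """Smallest repeat count n with vrr_min <= fps * n <= vrr_max,
    or None if no multiple fits the window."""
    n = 1
    while fps * n <= vrr_max:
        if fps * n >= vrr_min:
            return n
        n += 1
    return None

lfc_multiplier(23.976, 40, 120)  # -> 2 (each frame shown twice, 47.952 Hz)
lfc_multiplier(23.976, 48, 60)   # -> None: no multiple fits a narrow window
```

With a 40-120 Hz window, 23.976 fps fits after doubling; with a narrow window (48-60 Hz is assumed here purely for illustration), no integer multiple fits, which is the UHD 60 problem described above.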
Old 22nd November 2019, 17:44   #57842  |  Link
tyguy
Registered User
 
tyguy's Avatar
 
Join Date: Oct 2019
Posts: 24
Quote:
Originally Posted by huhn View Post
madVR doesn't support it (it's complicated for video).
the VRR range is another problem. at a resolution with 120 Hz support it should be fine, but at UHD 60 it is utterly useless because it can't do LFC, making it very hard to get a video into the VRR range.
Yeah, the range on my TV is 40-120 Hz. The Xbox will do LFC at 20-30 fps because that doubles into 40-60 Hz.

I don't think Nvidia will do that, though, so I would have to drop to 1440p if I wanted the full range (at least until HDMI 2.1 graphics cards come out).

You would think using DX11 native would just engage variable refresh rate.

This seems like the ultimate solution to the repeated-frames problem. A guide I read said your card can only output 8 bit in madVR if you use custom modes... which seems far from optimal.
Old 22nd November 2019, 18:08   #57843  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,977
Quote:
Yeah, the range on my TV is 40-120 Hz. The Xbox will do LFC at 20-30 fps because that doubles into 40-60 Hz.
the Xbox can't do 120 Hz at UHD; it does it at 1440p or 1080p, and then this feature will work.

Quote:
You would think using DX11 native would just engage variable refresh rate.
it's not that easy. in my experience it is blocked, and it's planned to stay blocked for fixed-FPS games too; a video falls into that category as well. this doesn't make sense to me: 144 Hz is very common, and 60 fps at 144 Hz is just bad, but it is totally fine with G-Sync.

and G-Sync for video is very complicated; it's not a game, where A/V sync barely exists. audio that "needs" sync in games is usually only a couple of seconds long, so it doesn't matter, and the sync of background music is irrelevant. but for movie playback you have to sync the video to an audio stream an hour or longer; G-Sync is not built for that.

to properly use it, madVR would need to measure the difference between the audio and video clocks and fix it using the system clock. yeah, lots of fun.
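A back-of-the-envelope version of that clock correction (hypothetical names and limits, not madVR code): measure how far the audio clock has drifted from the video clock, then work it off a fraction of a millisecond per frame so no single frame visibly judders.

```python
# Toy sketch: keep video presentation locked to the audio clock by
# nudging each frame's duration slightly (hypothetical, not madVR code).

def next_frame_duration(nominal_s, av_drift_s, max_nudge_s=0.0005):
    """Shorten or lengthen the next frame to work off the measured
    audio-minus-video drift, at most 0.5 ms per frame."""
    nudge = max(-max_nudge_s, min(max_nudge_s, av_drift_s))
    return nominal_s - nudge

# Audio is 2 ms ahead: shave 0.5 ms off each ~41.7 ms 24p frame
# until the drift is gone.
next_frame_duration(1 / 24, 0.002)
```

With VRR the display would simply follow these slightly uneven frame times, which is the "minor judder from imperfect presentation timing" trade-off discussed later in the thread.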
Quote:
This seems like the ultimate solution to the repeated-frames problem. A guide I read said your card can only output 8 bit in madVR if you use custom modes... which seems far from optimal.
you mean better for LG OLEDs.
Old 22nd November 2019, 18:16   #57844  |  Link
tyguy
Registered User
 
tyguy's Avatar
 
Join Date: Oct 2019
Posts: 24
Quote:
Originally Posted by huhn View Post
the Xbox can't do 120 Hz at UHD; it does it at 1440p or 1080p, and then this feature will work.

you mean better for LG OLEDs.
I read somewhere that if you set your Xbox to 4K 60 it will still do LFC on 30 fps titles that dip below 30 fps, since 25-30 doubles into Samsung's VRR range. Well, I'm assuming the 2020-and-later lineups of TVs will start to feature HDMI 2.1 from every manufacturer, so it won't just benefit LG OLEDs.

Last edited by tyguy; 22nd November 2019 at 20:34.
Old 22nd November 2019, 20:03   #57845  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,229
Quote:
Originally Posted by tyguy View Post
So it won't just benefit LG OLEDs.
You can edit down quotes so they don't take up more forum space than your reply. Full quotes are not required most times. Cheers.
Old 22nd November 2019, 20:34   #57846  |  Link
tyguy
Registered User
 
tyguy's Avatar
 
Join Date: Oct 2019
Posts: 24
Quote:
Originally Posted by ryrynz View Post
You can edit down quotes so they don't take up more forum space than your reply. Full quotes are not required most times. Cheers.
I posted it from my phone, which makes that more difficult.
Old 22nd November 2019, 20:42   #57847  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,229
I feel that. Missing the [ is a common occurrence for me on the first reply. The bigger the quote, the more it matters, though, imo.
Old 22nd November 2019, 20:50   #57848  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,709
Quote:
Originally Posted by tyguy View Post
This seems like the ultimate solution to the repeated frames problem. This guide I read said your card can only output 8 bit in madvr if you use custom modes... Which seems far from optimal.
Why? On LG's OLEDs, 8 bit has less banding than 10 bit in PC or Game mode; 8 bit is simply the better choice.

I still think VRR support in madVR would be useful. Playback wouldn't be as good as a tuned fixed refresh rate but we wouldn't need to worry about refresh rates at all. madshi is less interested simply because it is going to be worse. I think the minor judder from imperfect presentation timing would be fine, way better than 23fps at 60 Hz, and audio sync is pretty easy if having all frames presented for the exact same amount of time is not considered too important. Simply adjust the frame times a tiny bit as you go to maintain sync.

madVR is all about quality so spending a bunch of work to get VRR support for gaming monitors to support a slightly worse playback mode was not reasonable. However, now that TVs are getting VRR too it might be more reasonable to look into it? Probably not.
__________________
madVR options explained
Old 22nd November 2019, 20:55   #57849  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,977
the problem is that someone has to write the code for this, and how do you even tell the GPU "present this frame 0.5 ms later, please"?

do you want to handle everything with std::chrono?
Old 22nd November 2019, 21:14   #57850  |  Link
tyguy
Registered User
 
tyguy's Avatar
 
Join Date: Oct 2019
Posts: 24
Quote:
Originally Posted by Asmodian View Post
Why? On LG's OLEDs 8 bit has less banding than 10 bit when in PC or Game mode. 8 bit is more optimal than 10 bit.
I'm only using 8 bit color depth in madVR because I'm using RGB full 8 bit in my Nvidia drivers, and Nvidia won't switch to 10/12 bit when it drops the refresh rate.

I don't label my HDMI input as anything; I just keep it at the default HDMI 1. With 8 bit I haven't noticed any banding. I will be getting a new card when HDMI 2.1 cards come out, though, so I'll be running my desktop at 4K 120 Hz RGB full 10/12 bit.

Would there be any issues if I set madVR to 8 bit but I'm using 10 or 12 bit in my Nvidia drivers?
Old 23rd November 2019, 04:17   #57851  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,709
Quote:
Originally Posted by tyguy View Post
I don't label my HDMI input as anything; I just keep it at the default HDMI 1. With 8 bit I haven't noticed any banding.
Using the default HDMI input, the TV will convert to YCbCr 4:2:2 internally. This blurs the color information horizontally: both color planes are resampled to 1920x2160 by the TV. This does eliminate banding with 8 or 10 bit RGB input, but the image is blurred. I hope you are using very low-power chroma scaling options, because anything better is pointless.
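The 1920x2160 figure follows directly from the subsampling geometry; a quick way to check the arithmetic (plain math, nothing TV-specific):

```python
# Chroma plane resolution per subsampling mode at a given luma resolution.

def chroma_resolution(width, height, mode):
    h_div, v_div = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}[mode]
    return width // h_div, height // v_div

chroma_resolution(3840, 2160, "4:2:2")  # -> (1920, 2160): halved horizontally
chroma_resolution(3840, 2160, "4:2:0")  # -> (1920, 1080): halved both ways
```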

Quote:
Originally Posted by tyguy View Post
Would there be any issues if I set madVR to 8 bit but I'm using 10 or 12 bit in my Nvidia drivers?
No issues, other than it being totally pointless; why set the Nvidia drivers to anything but 8 bit? There is absolutely zero improvement in quality on an LG 7/8/9 series when sending it 10 bit instead of 8 bit with any HDMI mode, and 10 bit is worse quality when using PC or Game mode.

Ideally you want to send an LG C9 8 bit YCbCr limited range using the PC or Game HDMI mode. Anything else either has banding or blurs the chroma. If you set the drivers to YCbCr limited, set madVR to full range output.
gradient-perceptual-v2.1 24fps.mkv
gradient-perceptual-colored-v2.1 24fps.mkv

I know 10 bit sounds cool but it really is pointless at best on an LG OLED. I will be using 8 bit YCbCr limited range 3840x2160 @ 120 Hz on my LG C9 when I get an HDMI 2.1 GPU unless I get a new TV that is not so bad with RGB full input.
__________________
madVR options explained

Last edited by Asmodian; 23rd November 2019 at 11:21.
Old 23rd November 2019, 07:59   #57852  |  Link
YukonTrooper
Registered User
 
Join Date: Oct 2008
Posts: 13
My aspect ratio for HDR files is broken when using madVR in any player. The film area is off-center and stretched. Playback without madVR is OK. Any ideas?
Old 23rd November 2019, 08:53   #57853  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,977
you could start with a screenshot of the madVR OSD (Ctrl+J).
Old 23rd November 2019, 09:02   #57854  |  Link
tyguy
Registered User
 
tyguy's Avatar
 
Join Date: Oct 2019
Posts: 24
Quote:
Originally Posted by Asmodian View Post
Using the default HDMI input the TV will convert to YCbCr 4:2:2 internally. [...] Ideally you want to send an LG C9 8 bit YCbCr limited range using the PC or game HDMI mode. Anything else either has banding or blurs the chroma.

Where did you get the information about the TV internally doing YCbCr 4:2:2? I thought in Windows you always want to use RGB, because Windows will just convert YCbCr to RGB?

Also, why is my Apple TV able to do 10 bit without banding, but my Windows PC can only send it 8 bit dithered or else I get banding?

Finally, why would I set madVR to full if I'm using limited and have my TV set to low HDMI black level?

I've seen a lot of conflicting information out there. It's hard to know what to believe.
Old 23rd November 2019, 11:09   #57855  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,709
Quote:
Originally Posted by tyguy View Post
Where did you get the information about the TV internally doing YCbCr 4:2:2? I thought in Windows you always want to use RGB, because Windows will just convert YCbCr to RGB?
It is easy to test for yourself. With ChromaRes.png you can test which modes use 4:2:2 or 4:4:4. View the image at 100% scaling, i.e. in a window on a 4K screen, and switch between HDMI and PC modes. With the gradient test patterns I linked above you can verify all of what I have said.

Windows will always render in 8 bit RGB, but if you set the GPU to YCbCr it converts everything to YCbCr. madVR always outputs RGB too.
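For reference, the conversion step the driver performs here is standard BT.709 levels math. A simplified per-pixel sketch (real drivers work in higher precision on the GPU and may dither the result):

```python
# Full-range RGB (0.0-1.0) -> 8-bit limited-range BT.709 YCbCr.

def rgb_to_ycbcr709(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luma (BT.709 weights)
    cb = (b - y) / 1.8556                       # blue-difference chroma
    cr = (r - y) / 1.5748                       # red-difference chroma
    return (round(16 + 219 * y),                # Y:  16-235
            round(128 + 224 * cb),              # Cb: 16-240
            round(128 + 224 * cr))              # Cr: 16-240

rgb_to_ycbcr709(1.0, 1.0, 1.0)  # -> (235, 128, 128), reference white
rgb_to_ycbcr709(0.0, 0.0, 0.0)  # -> (16, 128, 128), reference black
```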

Quote:
Originally Posted by tyguy View Post
Also, why is my Apple TV able to do 10 bit without banding, but my Windows PC can only send it 8 bit dithered or else I get banding?
Because you have your HDMI port in the default mode; also, the Apple TV is sending subsampled limited-range YCbCr, so this isn't really a problem. However, madVR with 8 bit output and the GPU converting to 8 bit YCbCr 4:4:4 is higher quality.
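Why dithered 8 bit avoids banding can be shown in miniature. This toy uses plain random dither for simplicity; madVR's actual dithering algorithms are far more sophisticated:

```python
import random

def quantize8(value, dither, rng):
    """Quantize a 0.0-1.0 value to an 8-bit code, optionally dithered."""
    v = value * 255
    if dither:
        v += rng.random() - 0.5   # spread the rounding error as noise
    return max(0, min(255, round(v)))

rng = random.Random(0)
target = 0.502  # sits between two 8-bit codes (~128.01)

# Undithered: every pixel snaps to the same code -> a visible band.
plain = {quantize8(target, False, rng) for _ in range(1000)}
# Dithered: codes alternate, and their *average* tracks the true value.
mix = [quantize8(target, True, rng) for _ in range(10000)]
avg = sum(mix) / len(mix)
```

A large flat area of `target` therefore averages out to the right brightness under dither, instead of collapsing to the nearest code.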

Quote:
Originally Posted by tyguy View Post
Finally, why would I set madVR to full if I'm using limited and have my TV set to low HDMI black level?
Because limited in the GPU drivers means "convert the full range RGB into limited YCbCr." If you set madVR to limited range too, you get double limited, because the GPU always converts; it has no way to know madVR is sending limited range.

The reason madVR has a setting for limited range is that, if you only care about madVR and don't mind everything else in Windows being wrong, you can set madVR to limited and the GPU to full range. This results in the same image but without the GPU converting ranges. I care about everything else too, and the drivers still need to convert the RGB to YCbCr anyway, so I use limited range in the GPU and full in madVR.
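The "double limited" failure is simple levels arithmetic: the full-to-limited squeeze applied twice raises blacks and dims whites.

```python
# 8-bit full range (0-255) squeezed into limited range (16-235).

def full_to_limited(v):
    return round(16 + v * 219 / 255)

full_to_limited(0)                      # -> 16: correct video black
full_to_limited(full_to_limited(0))     # -> 30: raised, washed-out black
full_to_limited(full_to_limited(255))   # -> 218: dimmed white
```

A signal squeezed twice never reaches video black (16) or video white (235), which shows up on screen as a washed-out, low-contrast image.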

Also, what is ideal in principle and what is ideal in real life with an LG 2019 OLED is not the same thing.
__________________
madVR options explained
Old 23rd November 2019, 11:43   #57856  |  Link
tyguy
Registered User
 
tyguy's Avatar
 
Join Date: Oct 2019
Posts: 24
Quote:
Originally Posted by Asmodian View Post
It is easy to test for yourself. [...] The reason madVR has a setting for limited range is because if you only care about madVR and don't mind if everything else in Windows is wrong you can set madVR to limited and the GPU to full range.
I actually don't have my Apple TV set in default mode. It's hooked into my receiver, which I have labeled "home theatre" in the input section. Maybe that's the same as the default HDMI mode, but I can always just label that input game console or PC as well and see if I notice banding.

I don't think you can get wide color gamut from a console or Apple TV without sending the TV 10 bit, because they don't do dithering.

If Windows is converting YCbCr to RGB and madVR only outputs RGB... why not just use RGB 8 bit limited instead of YCbCr 4:2:2?

Last edited by tyguy; 23rd November 2019 at 11:47.
Old 23rd November 2019, 12:06   #57857  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 547
Because the LG's processing of RGB input is of lower quality.
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
Old 23rd November 2019, 13:32   #57858  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 85
Quote:
Originally Posted by el Filou View Post
Because the LG's processing of RGB input is of lower quality.
@Asmodian
And does output at 60 Hz, RGB, in PC mode not solve the issues on LG OLEDs?
__________________
R3 2200G / Vega8 / Samsung UE40NU7100
Win10Pro 1909 / 4K RGB 60Hz / AMD19.12.2
MPC-HC 1.8.8 / madVR 0.92.17 / FSW / 10bit@60Hz

Last edited by DMU; 23rd November 2019 at 13:38.
Old 23rd November 2019, 16:42   #57859  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 908
Quote:
Originally Posted by Asmodian View Post
It is easy to test for yourself.

Hi, this issue with LG OLEDs is quite confusing; there is a lot to take in here, and some of it conflicts with information I've been given in the past. Firstly, does it extend to older models? I have a gen 1.0 EF950V and I've never noticed any banding, to be honest. My current setup is as follows:

HDMI 1 selected, left named HDMI 1
Expert 1 profile set to HIGH colour space
AMD RX 580 running at 4:4:4 full RGB, set to 8 bit in the graphics settings
madVR set to 0-255

Everything looks fine to me; however, I do have an issue if I use the actual "PC" HDMI input setting: SDR is fine, but no matter what I do, HDR is massively washed out and the gamma is all messed up, and nothing I do corrects it. I don't use PC mode anyway, though, as I use a tiny bit of smooth motion, which greatly improves bright panning shots without any artefacts or soap opera effect.

Are you suggesting I should be running my panel in PC mode and in YCbCr limited mode? And what should I actually see on that chroma res pattern? I see 4:2:2 clearly, but I can also see 4:4:4, albeit faintly.

I don't see any banding in either of those clips; the gradients are pretty smooth.
__________________
OLED 4k HDR EF950-YAM RX-V685-WIN10 444 RGB 60hz-AMD RX 5700 19.12.2 KODI DS - MAD/LAV 92.17/0.74.1 - 3D MVC / FSE:Off / MADVR 10bit
Old 23rd November 2019, 17:38   #57860  |  Link
Stef2
Registered User
 
Join Date: Jan 2018
Posts: 2
Anamorphic stretch in madvr

I would like to hear from madvr users with a projector and an anamorphic lens.

I just bought such a lens, and when I enable the anamorphic option in madVR (4/3 vertical stretch in my case) the rendering time goes up quite a lot, increasing from a 36 ms average to a 55+ ms average after the stretch...

Of course, that makes my 4K HDR 24p movies unwatchable.

The only way I can get the rendering time back below 40 ms is by completely disabling any HDR processing in madVR, but that deteriorates the projected image a lot, of course. Decreasing luma and chroma upsampling quality by a lot is not enough. Decreasing dithering quality is not enough. Enabling every "trade quality for performance" option on top of all that is not enough...

My GPU is a GTX 1070. I do not mind upgrading it to an RTX 2080 if that is what I need.

Are there any anamorphic lens users around here? What do you observe when enabling this option? Any RTX 2080 user could help me by measuring the difference in rendering time between no stretch and anamorphic stretch enabled in madVR.

Any input is welcome!
Thank you.