Old 13th July 2019, 20:55   #461  |  Link
amayra
Quality Checker
 
amayra's Avatar
 
Join Date: Aug 2013
Posts: 284
Why can't I run madVR with Wine or DXVK?
__________________
I love Doom9
amayra is offline   Reply With Quote
Old 16th July 2019, 13:52   #462  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Those with LG OLEDs might find this post interesting. I was reading through the Spears & Munsil UHD HDR Benchmark Disc - Discussion thread at AVS Forum, and Stacey Spears was explaining how LG's PC mode works. He owns an LG OLED and claims PC mode processes the image at 8 bits. That might explain why only 8-bit output doesn't exhibit banding on LG OLEDs:

https://www.avsforum.com/forum/139-d...l#post58289648
Warner306 is offline   Reply With Quote
Old 17th July 2019, 10:29   #463  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Warner306 View Post
Those with LG OLEDs might find this post interesting. I was reading through the Spears & Munsil UHD HDR Benchmark Disc - Discussion thread at AVS Forum, and Stacey Spears was explaining how LG's PC mode works. He owns an LG OLED and claims PC mode processes the image at 8 bits.
Thanks, but personally I disagree with it and I'd prefer @j82k's opinion.

Quote:
Originally Posted by Warner306 View Post
That might explain why only 8-bit output doesn't exhibit banding on LG OLEDs
But it does!
link, link
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 17th July 2019, 11:19   #464  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
If you are only using 8-bit processing, you are "obviously" going to get banding, even with an 8-bit dithered input signal.
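A minimal numpy sketch of that point (my own illustration, not any real TV or renderer pipeline): a single un-dithered rounding to 8 bits puts visible steps into a smooth gradient, while doing the same processing in float and dithering only once at the output keeps it smooth.

Code:
import numpy as np

rng = np.random.default_rng(0)

# A smooth near-black gradient (think: a dark sky), as float values in 0..1.
grad = np.linspace(0.02, 0.06, 8192)

def tone_tweak(x):
    # Stand-in for any processing step a TV or renderer applies (gamma, gamut, ...).
    return np.clip(x, 0.0, 1.0) ** 1.2

def dither_to_8bit(x):
    # TPDF dither, then quantize to 8-bit codes.
    noise = rng.random(x.size) + rng.random(x.size) - 1.0
    return np.clip(np.round(x * 255.0 + noise), 0, 255).astype(np.uint8)

def longest_plateau(codes):
    # Length of the longest run of identical output codes, i.e. the width of a band.
    changes = np.flatnonzero(np.diff(codes.astype(np.int16)) != 0)
    runs = np.diff(changes)
    return int(runs.max()) if runs.size else codes.size

# (a) "8-bit processing": values are rounded to 8 bits at every step, no dither.
a = np.round(grad * 255.0) / 255.0                      # 8-bit internal precision
a = np.round(tone_tweak(a) * 255.0).astype(np.uint8)    # processed, rounded again

# (b) High-precision processing, dithered once at the very end.
b = dither_to_8bit(tone_tweak(grad))

print("longest flat band, 8-bit processing:", longest_plateau(a))
print("longest flat band, float + dither  :", longest_plateau(b))

With these numbers the 8-bit path produces plateaus hundreds of samples wide, while the dithered path breaks the same gradient into runs of only a few samples.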
huhn is offline   Reply With Quote
Old 17th July 2019, 13:52   #465  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
You would take j82k's opinion over one of the world's leading experts on video processing who talks to engineers from manufacturers like LG all the time? Come on. I don't think he has the precise answer to everything, either, but he owns an LG OLED and has run a battery of his own tests on the display.

If Stacey Spears isn't an expert, who is?

Edit: In that post, he says "the video goes 8-bit" in PC mode, so that doesn't necessarily mean all image processing happens at 8 bits, but some compromise has to be made to get 4:4:4 chroma processing on a chip designed for 4:2:2 image processing. Chroma makes up 2/3 of the signal bandwidth and requires an equivalent amount of space reserved in memory for image processing calculations.
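To put rough numbers on that (my own back-of-the-envelope arithmetic, not from Stacey's post):

Code:
# Per-frame sample counts for a 3840x2160 frame, to put numbers on the
# "chroma is 2/3 of the signal" point.
W, H = 3840, 2160
luma = W * H                                  # Y samples, same for every format

for name, cw, ch in [("4:4:4", 1, 1), ("4:2:2", 2, 1), ("4:2:0", 2, 2)]:
    chroma = 2 * (W // cw) * (H // ch)        # Cb + Cr samples
    total = luma + chroma
    print(f"{name}: chroma = {chroma / total:.0%} of all samples, "
          f"~{total * 10 / 8 / 2**20:.0f} MiB per frame at 10 bits per sample")

Going from 4:2:2 to 4:4:4 doubles the chroma samples the chip has to move and buffer per frame, which is the memory cost being described.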

Last edited by Warner306; 17th July 2019 at 13:57.
Warner306 is offline   Reply With Quote
Old 17th July 2019, 14:21   #466  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
You can do the processing at 8 bits and store the result in 32 bits; you will still get banding, and one such step is enough to ruin the image.

In PC mode most TVs can't do motion interpolation. If that doesn't free up enough processing (well, pretty much only memory), I don't know what will. These TVs have smartphone-class CPUs now; they could do a full RGB pipeline without any issues whatsoever.

There are Cortex CPUs in them, and even your smartphone can do simple 32-bit RGB processing...
huhn is offline   Reply With Quote
Old 17th July 2019, 14:26   #467  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Yes, but Stacey says in that thread that most off-the-shelf TV processors are designed for 4:2:2 image processing. It sounds more like subsampling on TVs is a hardware limitation and not a software limitation. They must be cheaper that way.
Warner306 is offline   Reply With Quote
Old 17th July 2019, 14:42   #468  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
It's simply not worth the development time.

Who does critical colour work on a TV? No one sane, and the rest don't notice banding. Who uses RGB and sees the difference between RGB and 4:2:2? Clearly not the average TV user.

BTW, it is clearly not a hardware limitation: Sony can do everything in RGB/4:4:4, and extremely cheap monitors can too.
huhn is offline   Reply With Quote
Old 17th July 2019, 14:55   #469  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Maybe Sony uses a more expensive processing chip. I don't have all the answers either, but the development-time argument is somewhat weak compared to other explanations, such as the chip spending its processing budget on improving the image elsewhere and having nothing left for 4:4:4 chroma calculations. That doesn't explain why PC mode can't offer perfect 4:4:4 chroma with higher bit-depth processing, but I would guess there isn't enough room in the processing pipeline of some cheaper chips. The development-time argument is also akin to calling TV engineers "lazy" when they are more or less perfecting the same TV model each year with small upgrades. Why would LG put out a somewhat broken PC mode on purpose?
Warner306 is offline   Reply With Quote
Old 17th July 2019, 22:33   #470  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
I wouldn't be surprised if the cause of the increased banding on LG OLEDs in PC mode isn't the 4:4:4 chroma but the low input lag. I also noticed increased banding in non-PC mode when using the game preset, which gives the same 21 ms of input lag as PC mode does.

I noticed this while playing the game 'Observer' on the Nintendo Switch, which has some very dark scenes with lots of near-black gradients. Going back and forth between the game preset and a non-game preset is almost instant, so it's easy to see the difference. Of course, all picture processing was disabled when I did this.
There is probably some less accurate processing going on internally to achieve the lowest possible input lag. If that is the case, it's a shame the TV doesn't offer a 4:4:4 mode with higher input lag and no banding problems.

I'm not saying that what Stacey Spears wrote is wrong, but there is more to it than that. There is the absolutely terrible banding in PC mode with 10-bit output, but then there is also the only slightly worse banding with 8-bit output, which most people probably just don't notice.
j82k is offline   Reply With Quote
Old 18th July 2019, 01:19   #471  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Sorry, but TV manufacturers are not trying to perfect their devices. They are companies; they want to make money.

Just look at these two:
https://www.rtings.com/tv/reviews/samsung/q70-q70r-qled
https://www.rtings.com/monitor/reviews/samsung/chg70

There is a chance that you could run the TV with the monitor's board and the banding would be gone.

And the monitor has less input lag and supports HDR and HFR, so its processing is magnitudes faster and it still does better.

Or take a look at this LG screen, yes LG: https://www.rtings.com/monitor/reviews/lg/27uk650-w

Beware that rtings' banding test is still flawed; only screens with the same input bit depth should be compared.

It's not broken on purpose; it's just good enough in their eyes.
huhn is offline   Reply With Quote
Old 18th July 2019, 12:12   #472  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by j82k View Post
I wouldn't be surprised if the cause of the increased banding on LG OLEDs in PC mode isn't the 4:4:4 chroma but the low input lag. I also noticed increased banding in non-PC mode when using the game preset, which gives the same 21 ms of input lag as PC mode does.
...
There is probably some less accurate processing going on internally to achieve the lowest possible input lag. If that is the case, it's a shame the TV doesn't offer a 4:4:4 mode with higher input lag and no banding problems.
Thanks, interesting. I didn't know about this, since I've never used the normal mode and I don't play games.

Quote:
Originally Posted by Warner306 View Post
You would take j82k's opinion over one of the world's leading experts on video processing who talks to engineers from manufacturers like LG all the time? Come on.
Maybe it's surprising, but I've reached the point in my life where I believe clever guys like us more than experts/specialists/etc.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config

Last edited by chros; 18th July 2019 at 12:16.
chros is offline   Reply With Quote
Old 18th July 2019, 13:02   #473  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by chros View Post
Maybe it's surprising, but I've reached the point in my life where I believe clever guys like us more than experts/specialists/etc.
When you search for his bio, his current job titles are "Sr. Principal Engineer, Software Applications at RED Digital Cinema" and "Chief Color Scientist at SpectraCal."

https://www.linkedin.com/in/stacey-spears-611a249

https://www.ravepubs.com/spectracal-...ilate-scratch/

You should trust a color scientist. I don't know if he still works with the math at this point in his career, but if you read through that thread, you'll find his head is filled with all sorts of useful knowledge about video processing and insider knowledge of various TVs, projectors and Blu-ray players.

In that particular thread, he forwarded one user's concerns about Dolby Vision on Apple TV to Dolby in an email. I've also seen posts on other forums where he talks to people who are shooting current TV shows and movies, telling them how to use their equipment. I would consider him to be a reliable expert.

Last edited by Warner306; 18th July 2019 at 13:04.
Warner306 is offline   Reply With Quote
Old 18th July 2019, 16:30   #474  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Warner306 View Post
I would consider him to be a reliable expert.
OK, cheers.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 18th July 2019, 16:39   #475  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Now that madVR will be a commercial product (https://www.madvrlabs.llc/), I hope to see professional reviewers compare its dynamic tone mapping to that of various displays, or even compare HDR10 with madVR to the Dolby Vision versions of some Blu-rays.
Quote:
Originally Posted by leeperry View Post
I just wonder why they don't record in D65 in the first place, huh? Even with high bit-depth 3DLUTs, is a 5200K>6500K conversion lossless?
I've seen it on EU / US & Korean videos, so there must be a reason for it. I guess they use digital cameras with the Rec. 709 gamut, so why "daylight" 5200K?
That number you see on the camera monitor is the real colour temperature of the lighting used for the scene. For another recent example that springs to mind, if you watch the GoT documentary 'The Last Watch' you'll see 3200K on their monitors for the torch-lit night scenes.
If they 'shot at D6500' for that particular environment, the colours would be all wrong, like when you set your TV to a standard/cold colour temperature, so it's better to record with the real scene lighting as the reference. Mastering is never lossless anyway.
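For what a 5200K-to-D65 conversion actually is mathematically, here's a minimal sketch (my own illustration using the standard Bradford transform and the CIE daylight locus, not what any particular camera or grading tool does):

Code:
import numpy as np

# Standard Bradford cone-response matrix.
BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def daylight_xy(cct):
    # CIE daylight-locus chromaticity, valid for roughly 4000K..7000K.
    x = -4.6070e9 / cct**3 + 2.9678e6 / cct**2 + 0.09911e3 / cct + 0.244063
    y = -3.000 * x**2 + 2.870 * x - 0.275
    return x, y

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def bradford_matrix(src_white, dst_white):
    # 3x3 matrix adapting XYZ colours from one white point to another.
    rho_s = BRADFORD @ src_white
    rho_d = BRADFORD @ dst_white
    return np.linalg.inv(BRADFORD) @ np.diag(rho_d / rho_s) @ BRADFORD

M = bradford_matrix(xy_to_XYZ(*daylight_xy(5200)),   # ~5200K daylight white
                    xy_to_XYZ(0.3127, 0.3290))       # D65
print(np.round(M, 4))

The matrix itself is invertible, so in floating point the adaptation loses nothing; applied to integer, gamut-limited footage it can clip and re-quantize, which is why the conversion isn't strictly lossless in practice.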
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
el Filou is offline   Reply With Quote
Old 18th July 2019, 16:58   #476  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by el Filou View Post
Now that madVR will be a commercial product (https://www.madvrlabs.llc/), I hope to see professional reviewers compare its dynamic tone mapping to that of various displays, or even compare HDR10 with madVR to the Dolby Vision versions of some Blu-rays.
It should be no contest for Dolby Vision. With Dolby Vision you can do trim passes at different levels of peak brightness, where the content is tone mapped by a colorist to look appropriate at each brightness level, so the tone mapping should be far more precise than dynamic HDR10 and done per frame. The gamut mapping used by madVR is also based on similar logic recommended by Dolby.

The newest Dolby Vision standard is trying to become the preferred format over HLG for combined HDR/SDR projects by including trim passes for SDR grades. Dolby has an automatic HDR-to-SDR algorithm that can be applied to the HDR version and hand-tweaked by a colorist to make each scene look as desired. The resulting SDR grade is supposed to look better, not worse, than starting in SDR to begin with. If projectors supported Dolby Vision, those with projectors might actually prefer this HDR-to-SDR grade over the HDR grade.
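To make the "trim pass per peak brightness" idea concrete, here is a deliberately simplified sketch, not Dolby's or madVR's actual math: a generic extended-Reinhard rolloff that compresses the same highlights differently depending on the target display's peak luminance (a 4000-nit master is assumed purely for illustration).

Code:
def tonemap_nits(nits, src_peak=4000.0, dst_peak=600.0):
    # Generic extended-Reinhard rolloff: maps [0, src_peak] nits onto
    # [0, dst_peak] nits, compressing highlights harder as dst_peak drops.
    # Purely illustrative, not Dolby's or madVR's curve.
    L = nits / dst_peak
    Lw = src_peak / dst_peak
    return dst_peak * L * (1.0 + L / (Lw * Lw)) / (1.0 + L)

# The same source values land very differently depending on the display
# the pass is trimmed for:
for target in (100.0, 600.0, 1000.0, 4000.0):
    print(f"{target:6.0f}-nit target: "
          f"200 nits -> {tonemap_nits(200.0, dst_peak=target):7.1f}, "
          f"2000 nits -> {tonemap_nits(2000.0, dst_peak=target):7.1f}")

A Dolby Vision trim pass lets a colorist adjust the result of curves like this per target and per shot instead of trusting one generic formula, which is the precision advantage described above.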

https://www.mysterybox.us/blog/2019/...lmmaking-e75yg

Last edited by Warner306; 18th July 2019 at 17:00.
Warner306 is offline   Reply With Quote
Old 19th July 2019, 08:35   #477  |  Link
Shiroe
Registered User
 
Join Date: May 2019
Posts: 8
Is there a difference between using madVR on Windows 10 and Windows 8.1?

Using LAV Filters Megamix's mid-tier (NGU medium) preset on Windows 10 to watch anime gave me 60 FPS. I switched to Windows 8.1 for reasons, and now the same preset gives me 30 FPS. In case it matters, I am also using an older NVIDIA driver on Windows 8.1, one that predates the RTX cards, also for reasons.

Also, which RTX graphics cards are sufficient for watching 720p and 1080p anime on a Full HD, QHD, or 4K monitor using the madVR settings found here with some minor adjustments, such as higher values for sharpen edges, thin edges, and AdaptiveSharpen and without any frame drops during playback, when seeking, and when entering or exiting fullscreen, etcetera? And which AMD graphics cards are sufficient? Ignoring prices, is NVIDIA or AMD better in terms of support and performance?

Last edited by Shiroe; 19th July 2019 at 09:15.
Shiroe is offline   Reply With Quote
Old 19th July 2019, 09:42   #478  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Windows 8 can't do WFS 10 bit; that's the major difference for now.

If you ignore prices, nothing beats Nvidia.
Polaris performs badly with NGU. Navi is unknown, and for some reason Vega users don't share numbers and only tell you that madVR works with it...
huhn is offline   Reply With Quote
Old 20th July 2019, 04:00   #479  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
I'm currently running build 18941 (WDDM 2.6) with the 431.36 Nvidia driver and I have no banding.
Can someone confirm whether the stable 1903 version is fixed?
huhn is offline   Reply With Quote
Old 21st July 2019, 12:27   #480  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
I asked an Nvidia employee online about it a few days back, and they said there was no update for this issue yet and they're still working with MS. So, no, not fixed yet.
ryrynz is offline   Reply With Quote