12th March 2023, 18:00 | #64021 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
Using the latest Nvidia 531.18 driver with no problems on various players, with and without madVR, including MPC. With prior drivers (many builds), after playing various videos, SDR would randomly stick and not switch to HDR when it should have. Sometimes turning the display and AVR off/on fixed it; sometimes it required a reboot. It hasn't happened with this new driver after many videos that would previously have triggered it by now. Fingers crossed.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W |
15th March 2023, 18:21 | #64025 | Link | |
Registered User
Join Date: Mar 2023
Posts: 5
|
Quote:
Maybe it's related to the colormap? Unfortunately I can't show a screenshot, because it only shows on my monitor. I tried |
|
16th March 2023, 14:54 | #64028 | Link | |
Registered User
Join Date: Mar 2023
Posts: 5
|
Quote:
But based on Sunspark's idea, I tried to take a photo with my phone, though of course the quality isn't great. Check the top of the "t" letter, for example. The top picture is HDR, the bottom is after switching to SDR. In SDR the fonts are better antialiased. With the naked eye the difference is clear; in the photo it's harder to see but still noticeable. https://imgur.com/a/Gj3kX1v Thanks Sunspark, link added instead of attachment. Last edited by Warez; 16th March 2023 at 21:26. Reason: added link |
|
18th March 2023, 08:58 | #64030 | Link |
Registered User
Join Date: Oct 2009
Posts: 930
|
Hi!
Is deinterlacing supposed to work with D3D11 nowadays? It says "deinterlacing on", but I still see a lot of combing near the edges of things. It's a lot less than with deinterlacing off, though. So is it just AMD's HW deinterlacing being garbage? Is it best to use LAV's bobweaver? |
18th March 2023, 09:29 | #64031 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
For presentation it has always worked; it doesn't work with D3D11 decode.
AMD deinterlacing can work when the stars are aligned correctly, at least I got it working on one driver. Else it is utter garbage. @Warez check your levels, maybe they are overexposed and the fine gradient is cut out. Last edited by huhn; 18th March 2023 at 09:33. |
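The levels suggestion can be sanity-checked numerically: if highlights are blown out, pixel values pile up at the limited-range white point and the fine gradient above it is lost. A minimal sketch (hypothetical helper, not anything madVR exposes), assuming an 8-bit limited-range luma plane:

```python
import numpy as np

def clipped_white_fraction(luma, white=235):
    """Fraction of pixels at or above limited-range white (8-bit BT.709)."""
    return float(np.mean(np.asarray(luma) >= white))

# Toy frame: 90% mid-grey, 10% pinned at white -> 10% of pixels clipped.
frame = np.array([128] * 90 + [235] * 10)
print(clipped_white_fraction(frame))  # 0.1
```

A large fraction on content that should have highlight detail would suggest the levels are indeed set wrong somewhere in the chain.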
18th March 2023, 22:14 | #64032 | Link |
Registered User
Join Date: Nov 2015
Posts: 471
|
There is no single best, because it depends on how the interlacing was done in the first place.
What you need to do is load up the file/stream that you're actually going to be watching and try each de-int algorithm one by one to see which works best appearance-wise. Don't pay attention to the names, only the appearance of the final result, namely the least amount of combing, flicker, etc. |
18th March 2023, 23:59 | #64034 | Link | |
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
When playing HDR content with Windows HDR enabled, madVR uses the Windows HDR APIs instead of the NVidia HDR APIs. As far as I know there should be no difference between the two on Windows 11. On Windows 10 however, last time I checked, the Windows HDR APIs did not properly pass through HDR metadata (e.g. maxCLL), resulting in the TV selecting a suboptimal tone mapping curve. When playing SDR content with Windows HDR enabled, madVR will output SDR, which is then automatically converted to HDR by Windows. I have verified that Windows does this SDR-to-HDR conversion correctly - i.e. it accurately maps the SDR gamut within the HDR "container". However, it's worth noting that Windows assumes an SDR gamma of 2.2 when doing that conversion, which is arguably not great - a BT.1886 (2.4) gamma would be more appropriate. Also, the way your TV displays the SDR gamut when operating in HDR mode might be different from the way your TV displays "native" SDR. (Ironically, on my LG G1 I prefer this SDR-in-HDR mode, because it doesn't suffer from the infamous black crush issues of the LG SDR mode.) In any case, all these issues can always be worked around by configuring a 3DLUT in madVR, if you are so inclined. Last edited by e-t172; 23rd March 2023 at 11:47. |
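The size of that gamma mismatch is easy to quantify. Using a pure power-law approximation (ignoring BT.1886's black-level lift term), the same dark signal value decodes noticeably brighter under the 2.2 assumption than under 2.4:

```python
def decode_power(v, gamma):
    """Decode a normalized SDR signal value with a pure power-law gamma."""
    return v ** gamma

v = 0.2                              # a dark-ish signal value
lum_22 = decode_power(v, 2.2)        # what Windows assumes
lum_24 = decode_power(v, 2.4)        # what BT.1886-style playback targets
print(round(lum_22 / lum_24, 2))     # 1.38: shadows ~38% brighter under 2.2
```

That shadow lift is exactly the kind of error a 3DLUT in madVR can absorb.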
|
19th March 2023, 01:36 | #64035 | Link |
Registered User
Join Date: Mar 2009
Posts: 3,650
|
I used to have HDR on in Windows but then found the colours were all washed out for SDR content on my Panasonic OLED. Pretty weird that the mapping isn't accurate, but oh well - the transitioning is seamless here anyway.
|
19th March 2023, 23:45 | #64037 | Link |
Registered User
Join Date: Nov 2015
Posts: 471
|
Out of curiosity, does anyone here downscale 4K to HD in order to get the full chroma that comes with it, instead of upscaling chroma at HD with one of the scaling algorithms?
This video is interesting to contemplate: https://youtu.be/kIf9h2Gkm_U According to that video, if the source is 4K, one gets free 4:4:4 chroma in addition to the luma when downscaling to HD. Where madVR is concerned, you can downscale using DXVA2 on the GPU, or do it in software with the pixel shaders such as cubic. I wonder how noticeable the difference is, upscaling chroma vs. downscaling from 4K? I took a quick look; the HUD says it is still upscaling the chroma. Last edited by Sunspark; 20th March 2023 at 00:01. |
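The "free 4:4:4" idea can be illustrated with plane shapes alone: in a 4:2:0 4K source the chroma planes are already 1080p-sized, so after a 2x luma downscale every output pixel has its own chroma sample. A rough sketch (crude box filter for illustration; chroma siting is a separate issue, as the reply below this post points out):

```python
import numpy as np

def box_downscale_2x(p):
    """Average each 2x2 block - a crude box filter, for illustration only."""
    h, w = p.shape
    return p.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

luma_4k = np.zeros((2160, 3840))    # 4K luma plane
chroma  = np.zeros((1080, 1920))    # 4:2:0 chroma plane: already 1080p-sized

luma_1080 = box_downscale_2x(luma_4k)
# Chroma needs no upscaling: one sample per output pixel, i.e. 4:4:4 at 1080p.
print(luma_1080.shape == chroma.shape)  # True
```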
20th March 2023, 00:23 | #64039 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Depends on the TV.
Level settings don't have to be identical between two display modes like SDR and HDR; they have all kinds of names. Quote:
You cannot downscale the 4K luma to FHD and match the chroma channel that easily; it still needs an X shift, and that's a scaling operation changing every chroma pixel. The downscaling DXVA2 and whatever browsers do is so terrible that the luma channel takes massive damage. You unchecked "scale chroma separately from luma if it saves performance", so it is still scaling chroma first, else it wouldn't. Assuming a lossless encode and the same scaling algorithm, downscaling 4K to 1080p instead of getting a 1080p source would always be better - too bad they don't do that. |
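The "X shift" comes from 4:2:0 chroma siting: chroma samples are horizontally co-sited with the left luma sample, so aligning them with the downscaled luma grid needs a fractional-pixel resample that touches every chroma pixel. A hypothetical sketch of such a shift using linear interpolation:

```python
import numpy as np

def shift_row(row, dx):
    """Resample a 1-D chroma row by a fractional offset dx (linear interp)."""
    x = np.clip(np.arange(len(row)) + dx, 0, len(row) - 1)
    i = np.floor(x).astype(int)
    f = x - i
    j = np.minimum(i + 1, len(row) - 1)
    return (1 - f) * row[i] + f * row[j]

row = np.array([0.0, 1.0, 0.0, 1.0])
print(shift_row(row, 0.25))  # every sample changes - it's not a plane copy
```

Because every output sample is a blend of its neighbours, this is a real scaling pass, not a free relabeling - which is why the chroma can't simply be reused untouched.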
|
20th March 2023, 23:08 | #64040 | Link |
Registered User
Join Date: Aug 2016
Posts: 609
|
Chroma subsampling is another annoying problem that we're saddled with for generations. It's a terrible thing that is baked into every video, just like interlacing used to be baked into every video, and it creates problems. Of course under ideal conditions it shouldn't be noticeable, but I sold a Panasonic OLED because its chroma upscaling was broken and looked bad even at 3.5m viewing distance - I could see the chroma jaggies at 3.5m, not pixel peeping. I've encountered many older TV shows with the wrong chroma position baked into the source. I can fix this with an Avisynth script, but it's a pain and changes from season to season, or episode to episode. Even one of my local TV stations has the wrong chroma position. Many TV channels use the wrong colourimetry as well.
Unless you get the chroma position exactly correct, it may not be "true 1080p" supersampled from 4K. It's kind of interesting how we can spot chroma position errors with our eyes, yet it's very difficult to come up with an algorithm to detect them in the source. From an algorithm's perspective, there is no way to know whether that chroma is supposed to be hanging over the edge or whether it's a bad position. But when we look at it, we can tell it's not supposed to be like that. |
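The detection difficulty can be made concrete. A naive heuristic is to test a few candidate shifts and keep the one whose chroma edges line up best with the luma edges - but, as the post says, this cannot distinguish a genuine overhang from a siting error. A toy sketch (hypothetical function names, integer shifts only):

```python
import numpy as np

def estimate_chroma_shift(luma_edges, chroma_edges, max_shift=2):
    """Pick the horizontal chroma shift whose edge map correlates best with
    the luma edge map. A crude heuristic: it assumes chroma edges *should*
    coincide with luma edges, which real content need not satisfy."""
    best, best_score = 0, -np.inf
    for dx in range(-max_shift, max_shift + 1):
        score = float(np.sum(np.roll(chroma_edges, dx, axis=1) * luma_edges))
        if score > best_score:
            best, best_score = dx, score
    return best

# Toy frame: a vertical edge; chroma edge deliberately offset by one pixel.
luma = np.zeros((4, 16)); luma[:, 8] = 1.0
chroma = np.roll(luma, -1, axis=1)
print(estimate_chroma_shift(luma, chroma))  # recovers the offset: 1
```

On a synthetic edge this works; on real sources, chroma legitimately bleeding past luma edges defeats it, which is exactly the ambiguity described above.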
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |