Old 11th September 2018, 13:06   #52381  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 38
I recently upgraded from a 1050 to a 1080 Ti and I've started to see some odd numbers in the rendering statistics. No matter what settings I choose, max rendering times spike to 50ms+ at times, but without affecting average times or dropping frames. The GPU queue also jumps up and down more often than I expected, though setting it to 16 and presentation frames to 8 prevents any impact from that. Also, if I turn on smooth motion, I almost always get consistent presentation glitches every minute or so, again regardless of settings. I know the GPU is only at about 60% at these times, so I don't understand all this activity. I'm on Windows 10 1803 playing 4K 23p with the latest Nvidia drivers, going through a Denon X3400H to an LG OLED B7. One other piece of strange behavior with both Nvidia cards: I can only get a 23.980Hz presentation rate at 4K, while 1080p gets a perfect 23.976Hz, and no amount of measuring changes this. Does anyone with a 1080 Ti see this type of behavior?
blu3wh0 is offline   Reply With Quote
Old 11th September 2018, 16:01   #52382  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 830
Quote:
Originally Posted by blu3wh0 View Post
I recently upgraded from a 1050 to a 1080 Ti
Did you definitely create a custom resolution in madVR for 2160p?
mclingo is offline   Reply With Quote
Old 11th September 2018, 16:29   #52383  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 38
I did. I initially tested it with the 1050 and gave up after I realized it could only do 8-bit. The numbers it computed were good, and I set up a custom resolution manually with those numbers, but it always stuck to a 23.980Hz composition rate (roughly 4 minutes between repeated frames), and only at 4K. I didn't have to do anything for 1080p, since Nvidia actually uses 23.976Hz at that resolution.
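As a quick sanity check on that "roughly 4 minutes" figure: with a mismatched refresh rate, one frame has to be repeated (or dropped) about every 1 / |display Hz - video fps| seconds. A small sketch using the rates reported above:

```python
# Back-of-the-envelope check: with a mismatched refresh rate, one frame must be
# repeated (or dropped) roughly every 1 / |display_hz - video_fps| seconds.
video_fps = 24000 / 1001      # ~23.976 fps film rate
display_hz = 23.980           # composition rate reported at 4K above

drift_per_second = abs(display_hz - video_fps)   # frames of drift per second
seconds_per_repeat = 1.0 / drift_per_second

print(f"video {video_fps:.6f} fps vs display {display_hz:.3f} Hz")
print(f"-> one repeated frame roughly every {seconds_per_repeat:.0f} s "
      f"(~{seconds_per_repeat / 60:.1f} min)")
```

That works out to one repeated frame every ~250 seconds, which matches the ~4 minute interval seen in practice.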
blu3wh0 is offline   Reply With Quote
Old 11th September 2018, 16:58   #52384  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 830
Something isn't right there then; the 1050s can do 10-bit, although there have been some driver issues getting 10-bit in certain situations. I don't know enough about Nvidia cards, though. In fact, I didn't think Nvidia fully supported 23.976 at any resolution. I know there were improvements with later drivers, but I'm fairly sure you still had to create custom resolutions in madVR to get decent timings compared to AMD cards.
mclingo is offline   Reply With Quote
Old 11th September 2018, 17:20   #52385  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 38
No, Nvidia can only do custom resolutions at 8-bit, but standard modes at 8/12-bit, which is what I use. They fixed the 1080p timing at 23.976Hz a few drivers back, but didn't fix 4K, which is still stuck at 23.980Hz for me.
blu3wh0 is offline   Reply With Quote
Old 11th September 2018, 17:36   #52386  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 603
There are some instructions a few pages back on how to use madVR to optimize the timings and then take those numbers into CRU to create a custom resolution. With the CRU-created custom res you can switch between 8-bit and 12-bit; I did this and can confirm switching bit depths works. However, I can't for the life of me make madVR switch to this custom res: madVR switches to the native 23p res, which, like you say, is more like 23.980Hz.

Another problem is that even after creating the custom res with CRU, switching to it in the Nvidia control panel and then switching to 12-bit, if you reboot and go back into the control panel you'll see it has reverted to 8-bit. Man, Nvidia drivers suck. I'm seriously thinking about switching to AMD (I currently have a 1050 Ti 4GB).

I'm not sure why a CRU-created custom res can switch between bit depths but a madVR custom res cannot. I guess madshi is the only one who could answer that.
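For anyone following that measure-in-madVR-then-enter-into-CRU route, the arithmetic behind an "exact" 23.976Hz mode is just pixel clock = horizontal total × vertical total × refresh rate. A rough sketch with assumed 4K timing totals (the totals madVR measures for a given display will differ):

```python
from fractions import Fraction

# Illustrative horizontal/vertical totals for a 3840x2160 mode (the standard
# CTA-861 totals); the ones madVR measures for a given display may differ.
h_total = 5500
v_total = 2250

target_hz = Fraction(24000, 1001)            # exact 23.976... Hz

pixel_clock_hz = h_total * v_total * target_hz
print(f"ideal pixel clock: {float(pixel_clock_hz) / 1e6:.4f} MHz")

# EDID detailed timings store the pixel clock in 10 kHz steps, so the achievable
# refresh rate is quantized; this shows how close the rounded clock gets.
rounded_clock_hz = round(float(pixel_clock_hz) / 10_000) * 10_000
actual_hz = rounded_clock_hz / (h_total * v_total)
print(f"rounded clock: {rounded_clock_hz / 1e6:.2f} MHz -> {actual_hz:.6f} Hz")
```

Even the rounded clock leaves a tiny residual error, which is why a small amount of drift (and the occasional repeated frame) can remain after creating a custom resolution.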
iSeries is online now   Reply With Quote
Old 11th September 2018, 18:39   #52387  |  Link
Stereodude
Registered User
 
Join Date: Dec 2002
Location: Region 0
Posts: 1,132
Quote:
Originally Posted by nevcairiel View Post
It's curious that people try to come up with new use cases when we already had a neural network algorithm in madVR for image doubling: NNEDI3. Image scaling should be a breeze for the Tensor Cores, considering Nvidia even uses them for a gaming feature called DLSS (Deep Learning Super Sampling). Their algorithm is not going to be quite as high quality as a madVR scaler might be, but it also needs to run on top of 3D rendering and whatnot, so it has to be ultra fast.

A new neural-network-based scaler with more complexity/quality than NNEDI3, compensating for the performance cost of the added complexity by using Tensor Cores, sounds like a rather logical step.
madshi posted on AVSForum that NGU Sharp is neural-network based.
Quote:
This is actually very close to how madVR's "NGU Sharp" algorithm was designed: it tries to undo/revert a 4K -> 2K downscale in the best possible way. There's zero artificial sharpening going on. The algo just looks at the 2K downscale and then takes a best guess at how the original 4K image might have looked, by throwing lots and lots of GFLOPS at the task. The core part of the whole algo is a neural network (AI) which was carefully trained to "guess" the original 4K image, given only the 2K image. Training such a neural network works by feeding it both the downscaled 2K and the original 4K image; the training automatically analyzes what the neural network does and how much its output differs from the original 4K image, and then applies small corrections to the neural network to get nearer to the ideal result. This training is done hundreds of thousands of times, over and over again.

Sadly, if a video wasn't actually downscaled from 4K -> 2K but is a native 2K source, the algorithm doesn't produce results as good as otherwise, but it's usually still noticeably better than conventional upscaling algorithms.
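As a purely illustrative aside, the training loop described in that quote is the standard supervised super-resolution recipe: downscale the reference image, feed the result to the network, measure how far its guess is from the original, and apply small corrections. A minimal PyTorch-flavoured sketch of that idea (a toy network, nothing to do with NGU's actual design):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy 2x super-resolution network: a few conv layers plus a pixel shuffle.
# This only sketches the training scheme described above, not NGU itself.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3 * 4, 3, padding=1),   # 3 channels * (2x2) upscale factor
    nn.PixelShuffle(2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(200):
    # Stand-in for a batch of "4K" reference patches (here: random data).
    hi_res = torch.rand(8, 3, 64, 64)
    # Create the "2K" input by downscaling the reference, as in the description.
    lo_res = F.interpolate(hi_res, scale_factor=0.5, mode='bicubic',
                           align_corners=False)

    guess = model(lo_res)                 # network's guess at the original
    loss = F.l1_loss(guess, hi_res)       # how far the guess is from the truth

    optimizer.zero_grad()
    loss.backward()                       # the "small corrections" step
    optimizer.step()
```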
Stereodude is offline   Reply With Quote
Old 11th September 2018, 19:03   #52388  |  Link
tij
Registered User
 
Join Date: Jun 2018
Posts: 7
Quote:
Originally Posted by blu3wh0 View Post
I did. I initially tested it with the 1050 and gave up after I realized it could only do 8-bit. The numbers it computed were good, and I set up a custom resolution manually with those numbers, but it always stuck to a 23.980Hz composition rate (roughly 4 minutes between repeated frames), and only at 4K. I didn't have to do anything for 1080p, since Nvidia actually uses 23.976Hz at that resolution.
I have a 1070 (with a 2016 LG OLED E6) and it all used to work: 3D, 10-bit, HDR. I think the Windows 10 1803 update broke it (I went back to known-working Nvidia drivers but it's still the same):

1. Fullscreen windowed D3D11 only outputs 8-bit (HDR and SDR).
2. FSE D3D11 outputs 10-bit (HDR and SDR), but...
3. The composition rate is now odd: 23.98Hz for windowed and 60Hz for FSE (which didn't report a composition rate before).
4. If you turn Windows HDR on, fullscreen windowed D3D11 will output 10-bit, but that makes 3D unusable.

However, the display rate was 23.97 in all scenarios.

I think 1803 changed how the Desktop Window Manager (DWM) behaves when an app accesses the screen directly through DXGI (Intel and AMD) or NVAPI (Nvidia). Since madVR uses DXGI and NVAPI, that pretty much guarantees (in theory) that madVR accesses the screen directly, so the reported composition rate is irrelevant. My guess is that DWM now keeps desktop composition active in the background even when it is bypassed via DXGI/NVAPI, probably so it can switch back to it as quickly as possible when needed.

This is all my best educated guess, lol.

So now I run everything in FSE (I used to run fullscreen windowed for slightly faster response).
tij is offline   Reply With Quote
Old 11th September 2018, 19:36   #52389  |  Link
kostik
Registered User
 
Join Date: Jul 2007
Posts: 128
Quote:
Originally Posted by tij View Post
I have a 1070 (with a 2016 LG OLED E6) and it all used to work: 3D, 10-bit, HDR. I think the Windows 10 1803 update broke it (I went back to known-working Nvidia drivers but it's still the same):

1. Fullscreen windowed D3D11 only outputs 8-bit (HDR and SDR).
2. FSE D3D11 outputs 10-bit (HDR and SDR), but...

4. If you turn Windows HDR on, fullscreen windowed D3D11 will output 10-bit, but that makes 3D unusable.

However, the display rate was 23.97 in all scenarios.
I think madshi has fixed some of these issues in his test builds, and they should work once the final build is out.

Try this build for example:
http://madshi.net/madVRhdrRestoreDetails20.zip
kostik is offline   Reply With Quote
Old 12th September 2018, 07:27   #52390  |  Link
tij
Registered User
 
Join Date: Jun 2018
Posts: 7
Quote:
Originally Posted by kostik View Post
I think madshi has fixed some of these issues in his test builds, and they should work once the final build is out.

Try this build for example:
http://madshi.net/madVRhdrRestoreDetails20.zip
Thx ... great work by madshi as usual ... that fixed 10-bit output in fullscreen windowed for me.

Also, additional display info in the OSD (bit depth, full range) would be welcome.

The composition rate is still there in FSE, but I don't think madshi can do anything about that; it's just the way Windows works post-1803 (I read somewhere that DX12 has no exclusive fullscreen at all now).
tij is offline   Reply With Quote
Old 12th September 2018, 08:41   #52391  |  Link
waldnebel
Registered User
 
Join Date: Sep 2017
Location: Ancient Germanic Forest
Posts: 13
Is the new HDR-to-SDR conversion feature of madVR also an option if you have an HDR-capable display with a fairly low maximum brightness of, say, around 300 nits? Might HDR video look better converted to SDR?
waldnebel is offline   Reply With Quote
Old 12th September 2018, 08:41   #52392  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,829
FSE in Windows 10 is simulated.

10-bit windowed fullscreen was buggy on Nvidia cards, so madshi blocked it for them; Nvidia has since fixed it, so he simply removed the block.

Source bit depth and range information are already shown in the OSD.
huhn is offline   Reply With Quote
Old 12th September 2018, 10:28   #52393  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 718
Quote:
Originally Posted by kostik View Post
I think madshi has fixed some of these issues in his test builds, and they should work once the final build is out.

Try this build for example:
http://madshi.net/madVRhdrRestoreDetails20.zip
Please do not post direct links to test builds in this thread, especially the latest one.

There seem to be many issues with test build 20, and linking it here might cause a lot of issues and confusion. Could you please delete the link (or at least provide a link to the post where madshi posted it, so that people know where to comment if they have issues with it)?

There is a reason why Madshi is doing this experimental work in a separate thread, and hasn't posted a public build yet.

If anyone has any issue with a test build, please post in the appropriate thread in the other forum, not here.
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 12th September 2018 at 10:43.
Manni is offline   Reply With Quote
Old 12th September 2018, 10:29   #52394  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 830
Quote:
Originally Posted by waldnebel View Post
Is the new HDR-to-SDR conversion feature of madVR also an option if you have an HDR-capable display with a fairly low maximum brightness of, say, around 300 nits? Might HDR video look better converted to SDR?
You shouldn't need to do this. My ageing 2015 HDR OLED has a very low peak brightness and it's generally fine with HDR content; things look better than the original SDR Blu-rays. There's only one exception I've found so far: Star Trek (2009) looks awful to me. It doesn't map well on my display at all, skin looks bleached out, highlights are at times non-existent, and it's all a bit yellow. I've also tried using the HDR-to-SDR conversion instead and it still looks bad; it's either just me or a bad conversion.
mclingo is offline   Reply With Quote
Old 12th September 2018, 11:55   #52395  |  Link
Axelpowa
Registered User
 
Join Date: Jan 2018
Posts: 16
Quote:
Originally Posted by huhn View Post
Check your GPU power setting and make sure it is not set to Optimal.

After you've done that, can you post a screenshot of the OSD?
Hi,

I managed to solve the issue: unchecking the enhancements did it. Now I can use NGU Sharp for chroma and Lanczos for downscaling. I can also use my JVC's MPC again and I get better sharpness.

I would like to use Jinc, but it uses a lot of GPU and I get frame drops.

Regards!
Axelpowa is offline   Reply With Quote
Old 12th September 2018, 15:42   #52396  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,115
Quote:
Originally Posted by waldnebel View Post
Is the new HDR-to-SDR conversion feature of madVR also an option if you have an HDR-capable display with a fairly low maximum brightness of, say, around 300 nits? Might HDR video look better converted to SDR?
It could make sense if you want to watch both SDR and HDR at the same brightness, likely at 300 nits. There is the issue of sharing a color gamut. If your display has a setting to automatically detect the input gamut and display it without a conversion (typically, Auto), it could be a good option. Otherwise, you would have to choose either BT.709 or BT.2020/DCI-P3 as the shared gamut between HDR and SDR. A 3D LUT could also solve this issue.

It also depends on the quality of the tone mapping provided by your display. At 300 nits, you lose a lot of detail through compression that madVR could restore with its highlight recovery setting. Also, some displays can't tone map all pixels with the correct hue. At least, with madVR you always get the correct hue. The image could also be potentially brighter. It depends on the display.

It is definitely worth trying. I would guess madVR would do a better job of tone mapping than a display that dim.

Here are a couple of images that show how highlight recovery strength works:

highlight recovery strength: none
highlight recovery strength: medium

This is also an image I made from the latest build that I think looks pretty attractive:

1,000 nits BT.2020 -> 150 nits BT.709 (390 target nits) (highlight recovery strength: medium)
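To make the trade-off concrete, here is a deliberately simple luminance roll-off in the spirit of compressing a 1,000-nit signal into a 300-nit display, keeping hue by applying one gain to all three channels. It uses an extended-Reinhard curve picked for illustration only; it is not madVR's tone-mapping algorithm:

```python
def tone_map_pixel(rgb_nits, peak_in=1000.0, peak_out=300.0):
    """Compress a linear-light RGB pixel (values in nits) from peak_in to peak_out.

    Extended-Reinhard roll-off on the brightest channel, with the same gain
    applied to R, G and B so the hue stays put. Purely illustrative; this is
    not madVR's curve and ignores gamut and highlight recovery entirely.
    """
    m = max(rgb_nits)
    if m <= 0.0:
        return rgb_nits
    x = m / peak_out                        # luminance in units of the output peak
    x_max = peak_in / peak_out              # where the input peak lands on that scale
    y = x * (1.0 + x / (x_max * x_max)) / (1.0 + x)   # ~linear near black, rolls off highlights
    gain = (y * peak_out) / m
    return tuple(c * gain for c in rgb_nits)

for nits in (10, 100, 300, 600, 1000):
    mapped = tone_map_pixel((nits, nits, nits))
    print(f"{nits:4d} nits -> {mapped[0]:6.1f} nits")
```

The point of the sketch is only that any 300-nit target has to compress highlights heavily, which is where the quality of the curve (and features like highlight recovery) starts to matter.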

Last edited by Warner306; 12th September 2018 at 16:10.
Warner306 is offline   Reply With Quote
Old 12th September 2018, 19:41   #52397  |  Link
Damien147
Registered User
 
Join Date: Mar 2011
Posts: 273
What's better for madVR: 10-bit 4:2:2 or 8-bit 4:4:4?
Damien147 is offline   Reply With Quote
Old 12th September 2018, 21:25   #52398  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 528
Depends on your display. On most, 8-bit 4:4:4 will be better as long as you use RGB Full (not YCbCr).
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
el Filou is offline   Reply With Quote
Old 12th September 2018, 21:51   #52399  |  Link
Damien147
Registered User
 
Join Date: Mar 2011
Posts: 273
Yes, RGB Full. Thank you.
Damien147 is offline   Reply With Quote
Old 12th September 2018, 22:23   #52400  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,663
8-bit YCbCr 4:4:4 is not bad either; it is much better than YCbCr 4:2:2 and only very slightly worse than RGB (RGB cannot be subsampled).

4:2:2 means two of the three image planes are at half horizontal resolution. Human color perception is weird enough that we don't notice all of the information loss, but the signal does have a lower spatial resolution. Blurring together the color of every two horizontally adjacent pixels just to gain some extra steps between color values is not helpful, and preserving the dithering at full resolution is important. The exception is when your display uses 4:2:2 internally; then it often isn't any worse to have the GPU do it first.
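The resolution/bandwidth trade-off is easy to put numbers on: 4:4:4 (or RGB) carries three samples per pixel, 4:2:2 carries two, and 4:2:0 carries one and a half. A quick sketch of the raw active-video bandwidth for the formats discussed here (blanking and link overhead ignored):

```python
# Samples per pixel for the pixel formats mentioned above.
# 4:2:2 halves the chroma horizontally; 4:2:0 halves it both ways.
FORMATS = {
    "RGB / YCbCr 4:4:4": 3.0,   # every pixel carries all three components
    "YCbCr 4:2:2":       2.0,   # Cb/Cr at half horizontal resolution
    "YCbCr 4:2:0":       1.5,   # Cb/Cr at quarter resolution overall
}

width, height, fps = 3840, 2160, 24000 / 1001   # 4K 23.976p example

for name, samples_per_px in FORMATS.items():
    for bits in (8, 10, 12):
        gbps = width * height * fps * samples_per_px * bits / 1e9
        print(f"{name:18s} {bits:2d}-bit: {samples_per_px} samples/px, "
              f"~{gbps:5.2f} Gbit/s of active video")
```

So 8-bit 4:4:4 and 12-bit 4:2:2 cost roughly the same bandwidth; the difference is whether you spend it on spatial resolution or on extra steps per sample.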
__________________
madVR options explained
Asmodian is offline   Reply With Quote