Old 19th May 2015, 14:31   #30201  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,247
Quote:
Originally Posted by leeperry View Post
NNEDI3 would appear to be a black box whose secrets are known only by tritical himself, and he hasn't released anything in a while AFAIK, so I wouldn't set my hopes too high.
Apparently he was working on an NNEDI4? He's been contacted recently, so it might not be terribly hard to improve it.
Looks like the waifu code is available for anyone who wants to use it, judging from the license. I wonder if Shiandow could adapt NNEDI3 with waifu2x's improved neural guesswork.
Old 19th May 2015, 14:46   #30202  |  Link
The 8472
Registered User
 
Join Date: Jan 2014
Posts: 51
Quote:
Originally Posted by ryrynz View Post
I've edited my post; take a look. NNEDI3 is superior; don't confuse line sharpness with line construction.
Hum, ok, maybe my jargon is off.

Compare any of the nnedi3 images vs. the waifu2 I've posted and look at the scarf. scarf, stairs, nnedi

What waifu2x seems to do better is avoiding the stair-step effect on fine lines.

What would you call that? It's obviously more than just dumb smoothing/sharpening.

The stairs that nnedi "preserves" are what stick out like a sore thumb to me, because they're an artifact of the rasterization of finer lineart.
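The "stairs" being discussed are easiest to see in miniature: a one-pixel diagonal line upscaled by pixel duplication turns into hard 2x2 steps, which is exactly the artifact a smoothing upscaler trades away. A toy sketch (illustrative only, not any of the actual scalers):

```python
# Illustrative only: nearest-neighbour 2x upscaling of a thin diagonal
# line, showing why "stairs" appear on rasterized fine lineart.

def make_diagonal(n):
    # n x n image with a one-pixel-wide diagonal line
    return [[1 if x == y else 0 for x in range(n)] for y in range(n)]

def nearest_2x(img):
    # Each source pixel becomes a hard 2x2 block, so edges turn into steps.
    out = []
    for row in img:
        doubled = [v for v in row for _ in (0, 1)]
        out.append(doubled)
        out.append(list(doubled))
    return out

up = nearest_2x(make_diagonal(4))
for row in up:
    print("".join("#" if v else "." for v in row))
```

The printed pattern is a staircase of 2x2 blocks; a smoothing upscaler would instead spread intermediate values across the step corners.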

Last edited by The 8472; 19th May 2015 at 15:17.
Old 19th May 2015, 15:13   #30203  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,058
waifu2x is clearly better in terms of detail, at the cost of a lot of aliasing.

But the source has a ton of aliasing too, so it's hard to judge.
Old 19th May 2015, 15:33   #30204  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,247
Quote:
Originally Posted by The 8472 View Post
What waifu2x seems to do better is avoiding the stair-step effect on fine lines.
Probably shouldn't continue much further with this discussion here, but those words link to the same file.

Quote:
Originally Posted by huhn View Post
waifu2x is clearly better in terms of detail, at the cost of a lot of aliasing.
But the source has a ton of aliasing too, so it's hard to judge.
Fine details, judging from what 8472 says. I guess we could open a new thread for more comparisons.
Old 19th May 2015, 15:39   #30205  |  Link
XMonarchY
Registered User
 
Join Date: Jan 2014
Posts: 489
I was right about 8-bit + FRC NOT being the same as true 10-bit. nVidia just added an option to select color depth in its latest 352.86 drivers, and it will not let me select anything other than 8-bit on my 8-bit + FRC monitor (Eizo Foris FG2421, using DisplayPort). A true 10-bit monitor would be allowed a 10-bit setting in nVidia CP. Using the 10-bit setting in madVR is a bad idea for 8-bit + FRC monitors: it will either not work or make rendering quality worse than if you selected 8-bit. Bit depth and HDMI Full/Limited color range can be selected under "Change resolution" in nVidia CP in the 352.86 drivers; just scroll down and you will see the new options.

I have this odd problem. I can use Direct3D 11 with the Sync sub-option underneath it on my 120Hz FG2421 monitor without ANY frame drops or problems. However, when I use my HDTV at 23Hz, I get constant frame drops with Direct3D 11, with or without the Sync sub-option. Why is that? Is it because ReClock works harder at 23Hz, or what? I thought 120Hz would be the one with problems, because it requires faster rendering.
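For what it's worth, the 23 Hz drops are consistent with a clock mismatch: if the display refresh rate and the video frame rate differ by some delta, a frame has to be dropped or repeated roughly every 1/delta seconds. A back-of-the-envelope sketch; the clock values here are assumptions, not measurements of this setup:

```python
# Back-of-the-envelope: how often a frame drop/repeat occurs when the
# display refresh rate and the video frame rate drift apart.
# The clock values below are illustrative assumptions, not measured.

def seconds_per_drop(refresh_hz, video_fps):
    delta = abs(refresh_hz - video_fps)
    if delta == 0:
        return float("inf")  # perfectly locked clocks: no drops ever
    return 1.0 / delta

# 23.976 fps content on a display actually running at 23.000 Hz:
print(seconds_per_drop(23.000, 23.976))  # roughly one drop per second

# Same content on a display at 23.971 Hz (a close match):
print(seconds_per_drop(23.971, 23.976))  # minutes between drops
```

A 120 Hz mode repeating each 23.976 fps frame five times hides small mismatches much better, which would explain why the 120 Hz monitor looks clean while the 23 Hz output stutters.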
__________________
8700K @ 5Ghz | ASUS Z370 Hero X | Corsair 16GB @ 3200Mhz | RTX 2080 Ti @ 2100Mhz | Samsung 970 NVMe 250GB | WD Black 2TB | Corsair AX 850W | LG 32GK850G-B @ 165Hz | Xonar DGX | Windows 10 LTSC 1809

Last edited by XMonarchY; 19th May 2015 at 15:58.
Old 19th May 2015, 15:50   #30206  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,058
Quote:
Originally Posted by XMonarchY View Post
I was right about 8-bit + FRC NOT being the same as true 10-bit. nVidia just added an option to select color depth in its latest 352.86 drivers, and it will not let me select anything other than 8-bit on my 8-bit + FRC monitor (Eizo Foris FG2421). A true 10-bit monitor would be allowed a 10-bit setting in nVidia CP. Using the 10-bit setting in madVR is a bad idea for 8-bit + FRC monitors: it will either not work or make rendering quality worse than if you selected 8-bit.

P.S. Bit depth and HDMI Full/Limited color range can be selected under "Change resolution" in nVidia CP. Just scroll down and you will see the new options.
What connector are you using?
And if your device doesn't accept 10-bit input, it's kind of pointless to send 10-bit to the driver.

But not every 8-bit+FRC panel rejects 10- or 12-bit input.
Old 19th May 2015, 15:57   #30207  |  Link
XMonarchY
Registered User
 
Join Date: Jan 2014
Posts: 489
Quote:
Originally Posted by huhn View Post
What connector are you using?
And if your device doesn't accept 10-bit input, it's kind of pointless to send 10-bit to the driver.

But not every 8-bit+FRC panel rejects 10- or 12-bit input.
I use DisplayPort. I guess it's just my monitor that does not accept 10-bit. I wonder if there is a way to force it... I mean, there is DEFINITELY dithering going on on my monitor; I can see it easily.
__________________
8700K @ 5Ghz | ASUS Z370 Hero X | Corsair 16GB @ 3200Mhz | RTX 2080 Ti @ 2100Mhz | Samsung 970 NVMe 250GB | WD Black 2TB | Corsair AX 850W | LG 32GK850G-B @ 165Hz | Xonar DGX | Windows 10 LTSC 1809
Old 19th May 2015, 16:03   #30208  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 333
Dithering: error diffusion vs. ordered dithering, and waifu2x vs. NNEDI3

Quote:
Originally Posted by Ver Greeneyes View Post
Using it on your 8-bit monitor will give you worse quality, since it will basically bypass madVR's dithering and fall back on the dithering of your display or GPU.
Quote:
Originally Posted by James Freeman View Post
Are you sure?
I think switching to 10-bit in madVR does not disable dithering, but it does make the dithering less strong.
If madVR had a 16-bit option with dithering enabled, on an 8-bit monitor the output would look completely undithered.
Madshi wrote about it in another post. The way he currently has things set up, if 10-bit is active madVR switches the dithering mode from Error Diffusion (if one of the Error Diffusion options is selected) to Ordered Dithering. He wrote that at that bit depth no significant visual improvement can be seen from Error Diffusion on 10-bit screens, so the resources it consumed were wasted and could be better utilized elsewhere (I agree).

The old test here (for those not familiar with it) is to go into madVR display settings and change the bit-depth for the monitor to anything from 1 to 4 bit (which will make the dithering effects visible), and then toggle through the different dithering patterns to find which effect(s) you like the look of best. The difference between Error Diffusion and Ordered Dithering is pretty insignificant. When you change your madVR display setting back to 8 or 10-bit depth the patterns should completely disappear (making the choice even more insignificant).
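The low-bit-depth test above can be sketched numerically: quantize a smooth gradient to a few bits with and without a small ordered-dither matrix. This only mimics the idea; madVR's actual dithering algorithms are more sophisticated, so treat every detail below as an illustrative assumption:

```python
# Illustrative sketch of the low-bit-depth test: quantize a gradient
# to n bits with and without a tiny 2x2 ordered-dither (Bayer) matrix.

BAYER_2X2 = [[0.0, 0.5],
             [0.75, 0.25]]  # normalized ordered-dither thresholds

def quantize(value, bits):
    # value in [0, 1] snapped to the nearest of 2**bits levels
    levels = (1 << bits) - 1
    return round(value * levels) / levels

def quantize_dithered(value, bits, x, y):
    # Round up or down depending on the position-dependent threshold,
    # so neighbouring pixels alternate between the two nearest levels.
    levels = (1 << bits) - 1
    threshold = BAYER_2X2[y % 2][x % 2]
    scaled = value * levels
    base = int(scaled)
    frac = scaled - base
    q = base + (1 if frac > threshold else 0)
    return min(q, levels) / levels

# A 1-D gradient quantized to 2 bits: plain quantization bands the
# gradient into 4 flat runs, while dithering fakes in-between shades.
grad = [i / 15 for i in range(16)]
plain = [quantize(v, 2) for v in grad]
dith = [quantize_dithered(v, 2, x, 0) for x, v in enumerate(grad)]
print(sorted(set(plain)))  # the 4 available output levels
```

At 2 bits the banding is obvious; raising `bits` toward 8 or 10 shrinks the step size until neither pattern is visible, which is exactly why the choice of dither mode stops mattering at high bit depths.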

Quote:
Originally Posted by The 8472 View Post
There's a new superresolution upscaler for still images: https://github.com/nagadomi/waifu2x/ based on http://arxiv.org/abs/1501.00092

There is also a web service running that implementation, which has been trained on anime-style images and provides very good results on that type of material, especially on fine lines that are barely a pixel wide and fairly aliased. It also seems to introduce fewer fake details on flat surfaces than nnedi.
If you run it on real-life images you'll get oil paintings.
Specializing in a specific kind of content seems to have a pretty big impact.

Source image
new upscaler w/ noise reduction
new upscaler w/o noise reduction
NNEDI3 32 w/ SuperRes
NNEDI3 64 w/o SuperRes
Quote:
Originally Posted by ryrynz View Post
So NNEDI3 blows waifu2x away in the most important area (line construction); check out the left-side hair clip (nearest-neighbour upscale? eww) and the hair outlines.
Sure, waifu2x used with a blur is cleaner and sharper, but it's totally overdone.
I agree with 8472. I like the look of waifu2x a lot better than NNEDI3.

Quote:
Originally Posted by XMonarchY View Post
I use DisplayPort. I guess it's just my monitor that does not accept 10-bit. I wonder if there is a way to force it... I mean, there is DEFINITELY dithering going on on my monitor; I can see it easily.
Someone already mentioned it, but it would not do any good to force your system to output 10-bit if your monitor/screen/TV is limited to 8-bit. As previously stated, it can make your image worse.
Perhaps your source video is the problem. If the source is a lower bit depth than the screen you're displaying it on, you're going to get a worse picture than if the source matched your screen's bit depth. A 10-bit source is ideal for 10-bit mode on a 10-bit screen. If you think you can see dithering with madVR set to 8- or 10-bit for your screen, do the bit-changing test I posted above: toggle through each setting from 1-bit all the way to 10-bit in madVR and see how the dithering effects become less and less noticeable. Once you get to 5 or 6-bit you shouldn't be able to see the dithering pattern any longer. Still think you're seeing dithering? It's far more likely that you've got a bad/noisy/grainy/garbage-filled source, or that your scaling choices are adding artifacts to the video.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.

Last edited by Anime Viewer; 19th May 2015 at 16:22. Reason: added reply to XMonarchY
Old 19th May 2015, 16:13   #30209  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,247
Quote:
Originally Posted by Anime Viewer View Post
I agree with 8472. I like the look of waifu2x a lot better than NNEDI3.
There's only one good thing about waifu2x IMO, and that's its line detection code. Bumping NNEDI3 up to 256 neurons and adding a sharpener and denoiser of your choice would produce a superior image overall.

You really can't compare 64 neurons without any sharpening to an image that's undergone sharpening, with or without denoising. Of course some people are going to prefer an over-sharpened image; I'm not one of them.

I've started a thread about it here.

Last edited by ryrynz; 19th May 2015 at 16:24.
Old 19th May 2015, 16:19   #30210  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 752
Quote:
Originally Posted by ryrynz View Post
I wonder if Shiandow could adapt NNEDI3 with waifu2x's improved neural guesswork.
As far as I can tell they're not using better "guesswork"; they're just using far, far more neurons. They also look at a larger region of the image to calculate a single pixel (hence why it's better at removing aliasing). I estimate it's at least 20x slower than NNEDI3 with 256 neurons, and even that is a very generous estimate; it could easily be 100x.
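A rough cost model makes that estimate concrete: treat per-pixel cost as proportional to neurons times input-window size (times layers, for a deep net). The window sizes and layer count below are assumptions for illustration, not waifu2x's real architecture:

```python
# Very rough cost model behind "far more neurons, larger region":
# per-pixel cost ~ neurons * input-window size * layers.
# All the concrete numbers here are assumptions for illustration.

def relative_cost(neurons, window_pixels, layers=1):
    return neurons * window_pixels * layers

# NNEDI3 with 256 neurons looking at a small local window:
nnedi3 = relative_cost(neurons=256, window_pixels=8 * 6)

# A hypothetical deeper net looking at a much larger region per pixel:
deep = relative_cost(neurons=64, window_pixels=17 * 17, layers=7)

print(deep / nnedi3)  # an order-of-magnitude-style ratio
```

Even with these modest assumed numbers the ratio lands around an order of magnitude, so 20x to 100x for the real thing is plausible.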
Old 19th May 2015, 16:39   #30211  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,247
Quote:
Originally Posted by Shiandow View Post
I estimate it's at least 20x slower than NNEDI3 with 256 neurons, and even that is a very generous estimate; it could easily be 100x.
That pretty much ends the comparison discussion right there. It does make me wonder what NNEDI3 with 512 neurons would look like, though.
Old 19th May 2015, 16:43   #30212  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 252
I succeeded in getting 10-bit working, but my render and present queues only fill to 2 and 1.

Without "present a frame for every VSync" I had tons of presentation errors; with it, none.

Any idea how to fix the half-filled render and present queues?

Is watching movies in 10-bit of any use? What would it be good for, if not movies?
Old 19th May 2015, 17:28   #30213  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by XMonarchY View Post
I was right about 8-bit + FRC NOT being the same as true 10-bit. A true 10-bit monitor would be allowed a 10-bit setting in nVidia CP. Using the 10-bit setting in madVR is a bad idea for 8-bit + FRC monitors.
You are absolutely WRONG about everything.

I have a Dell U2410, which is 8-bit+FRC, and it can receive a 10-bit signal easily; in fact, that is what I use now in Windows 7 by default.
I can select between 8- and 10-bit in the nVidia control panel; I use DisplayPort.

Using the 10-bit setting in madVR is a GREAT idea for 8-bit+FRC monitors,
for the simple reason that FRC dithers at the refresh rate of the display, while madVR is tied to the frame rate of the video being played (until, of course, madshi decides to change that, if ever).
And because madVR does not dither values that land on exact 10-bit integers, while the 8-bit+FRC display does, there is no double dithering; everything falls into place like magic.

Yes, if you have an 8-bit+FRC or true 10-bit display that accepts a 10-bit signal, please use madVR in 10-bit with dithering, FSE and D3D11; you should get a better/cleaner picture (although the difference is practically invisible).
Also keep Windows in 10-bit so that the switch to FSE is faster.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 19th May 2015 at 17:39.
Old 19th May 2015, 17:35   #30214  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,058
Quote:
Originally Posted by James Freeman View Post
You are absolutely WRONG about everything.

I have a Dell U2410, which is 8-bit+FRC, and it can receive a 10-bit signal easily; in fact, that is what I use now in Windows 7 by default.
I can select between 8- and 10-bit in the nVidia control panel; I use DisplayPort.

Using the 10-bit setting in madVR is a GREAT idea for 8-bit+FRC monitors,
for the simple reason that FRC dithers at the refresh rate of the display, while madVR is tied to the frame rate of the video being played (until, of course, madshi decides to change that, if ever).
But if he can't output 10-bit to this screen, it's doing nothing positive.
Old 19th May 2015, 17:38   #30215  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by huhn View Post
But if he can't output 10-bit to this screen, it's doing nothing positive.
Correct.
If the screen only accepts 8-bit, as reported by the nVidia control panel, it may very well be that the screen is actually not 8-bit+FRC but plain 8-bit.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
Old 19th May 2015, 17:40   #30216  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,058
Quote:
Originally Posted by James Freeman View Post
Correct.
If the screen only accepts 8-bit, as reported by the nVidia control panel, it may very well be that the screen is actually not 8-bit+FRC but plain 8-bit.
He is using a 120 Hz screen, so it could be a limitation of DisplayPort 1.2; it might work with DVI.
But the Eizo is known to be 8-bit+FRC.
Old 19th May 2015, 17:41   #30217  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,859
Quote:
Originally Posted by James Freeman View Post
Correct.
If the screen only accepts 8-bit, as reported by the nVidia control panel, it may very well be that the screen is actually not 8-bit+FRC but plain 8-bit.
Screens also do some color processing, so it might be 8-bit+FRC and still not accept 10-bit input. It's a gaming screen, after all.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 19th May 2015, 17:51   #30218  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 752
Quote:
Originally Posted by James Freeman View Post
Using the 10-bit setting in madVR is a GREAT idea for 8-bit+FRC monitors,
for the simple reason that FRC dithers at the refresh rate of the display, while madVR is tied to the frame rate of the video being played (until, of course, madshi decides to change that, if ever).
And because madVR does not dither values that land on exact 10-bit integers, while the 8-bit+FRC display does, there is no double dithering; everything falls into place like magic.
I'm pretty sure madVR dithers at the refresh rate of the display if you enable smooth motion. Also, if you output 10-bit then madVR dithers from 16 bits to 10, which is then dithered from 10 bits to 8 on your monitor; this would be worse than directly dithering to 8 bits. Although it's unlikely that you'll be able to actually see any difference.
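That claim is easy to check numerically: chaining two dithered quantizations (16 bits down to 10, then 10 down to 8) accumulates slightly more error than a single 16-to-8 step. Plain white-noise dither stands in for the real algorithms here, which is an assumption; madVR and the panel's FRC use better patterns:

```python
import random

# Numerical check of the double-dithering point: two chained dithered
# quantizations add more error than one direct step. White-noise
# dither is an illustrative stand-in for the real algorithms.

random.seed(0)

def dither_to(value, bits):
    # value in [0, 1]; dither-quantize onto a 2**bits-level grid
    levels = (1 << bits) - 1
    noisy = value * levels + random.random() - 0.5
    q = min(max(round(noisy), 0), levels)
    return q / levels

samples = [random.random() for _ in range(50000)]

err_direct = sum(abs(v - dither_to(v, 8)) for v in samples) / len(samples)
err_chained = sum(abs(v - dither_to(dither_to(v, 10), 8))
                  for v in samples) / len(samples)

print(err_direct, err_chained)
```

The chained path comes out with a slightly higher mean error, though the difference is only a few percent of an 8-bit step, which matches the "unlikely you'll actually see any difference" caveat.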
Old 19th May 2015, 17:52   #30219  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 333
Quote:
Originally Posted by x7007 View Post
Is watching movies in 10-bit of any use? What would it be good for, if not movies?
The usefulness of running in 10-bit depends on a number of factors. 10-bit provides access to potentially more displayable colors. If you don't notice more colors, or an increase in the different shades of colors in the videos you are watching, then it will not be useful to you. If your source provides 8-bit instead of 10-bit, the result will be inferior to what you would have seen with a 10-bit source displayed at 10-bit. If your screen only supports up to 8-bit, then again you will not see the benefit of 10-bit.
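The "more colors" point can be quantified: each extra bit per channel doubles the shades per channel, and the total RGB color count grows with the cube of that:

```python
# Quantifying "potentially more colors": shades per channel and total
# RGB combinations at 8-bit vs 10-bit per component.

def color_counts(bits):
    shades = 1 << bits      # levels per channel
    total = shades ** 3     # full RGB combinations
    return shades, total

print(color_counts(8))   # (256, 16777216)    ~16.8 million colors
print(color_counts(10))  # (1024, 1073741824) ~1.07 billion colors
```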

If you're not noticing a difference between 8-bit and 10-bit mode, or you like taking screenshots of full-screen video (which doesn't work in exclusive mode), then you might as well stick with the 8-bit setting.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Old 19th May 2015, 17:57   #30220  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by Shiandow View Post
I'm pretty sure madVR dithers at the refresh rate of the display if you enable smooth motion. Also, if you output 10-bit then madVR dithers from 16 bits to 10, which is then dithered from 10 bits to 8 on your monitor; this would be worse than directly dithering to 8 bits. Although it's unlikely that you'll be able to actually see any difference.
Good to know about smooth motion dithering at refresh speed, thanks.
So when smooth motion is enabled, madVR's dithering can be considered better quality than FRC.

Doesn't madVR leave the steps that don't need dithering undithered?
Can we say the same for FRC?
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 19th May 2015 at 18:00.