Old 19th May 2015, 15:03   #30201  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 339
Dithering: error diffusion vs. ordered dithering, and waifu2x vs. NNEDI3

Quote:
Originally Posted by Ver Greeneyes View Post
Using it on your 8-bit monitor will give you worse quality since it will basically bypass madVR's dithering to fall back on the dithering of your display or GPU.
Quote:
Originally Posted by James Freeman View Post
Are you sure?
I think switching to 10bit in madVR does not disable dithering, but it does make the dithering less strong.
If madVR had a 16bit option with dithering enabled, the output on an 8bit monitor would look completely undithered.
Madshi wrote about this in another post. The way he currently has it set up, if 10-bit output is active madVR switches the dithering mode from Error Diffusion (if one of the Error Diffusion options is selected) to Ordered Dithering. He wrote that at that bit depth no significant visual improvement can be seen from Error Diffusion on 10-bit screens, so the resources it uses are wasted and could be better spent elsewhere (I agree).

The old test (for those not familiar with it) is to go into the madVR display settings, change the bit depth for the monitor to anything from 1 to 4 bit (which makes the dithering effects visible), and then toggle through the different dithering modes to find which one(s) you like best. The difference between Error Diffusion and Ordered Dithering is pretty insignificant. When you change the madVR display setting back to 8 or 10-bit the patterns should completely disappear (making the choice even less significant).
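
If anyone wants to play with the two approaches outside of madVR, here is a rough stand-alone sketch in Python (generic textbook Bayer and Floyd-Steinberg, not madVR's actual implementation) that quantizes a smooth gradient down to 2-bit so the two pattern styles become as obvious as they do in the 1-4 bit test:

Code:
import numpy as np

def ordered_dither(img, bits):
    """Quantize a [0,1] image to 2**bits levels with a 4x4 Bayer threshold matrix."""
    bayer4 = np.array([[ 0,  8,  2, 10],
                       [12,  4, 14,  6],
                       [ 3, 11,  1,  9],
                       [15,  7, 13,  5]]) / 16.0
    levels = 2 ** bits - 1
    h, w = img.shape
    threshold = np.tile(bayer4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return np.floor(img * levels + threshold) / levels

def error_diffusion(img, bits):
    """Quantize a [0,1] image to 2**bits levels with Floyd-Steinberg error diffusion."""
    levels = 2 ** bits - 1
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            q = min(max(round(old * levels), 0), levels)   # nearest representable level
            out[y, x] = q / levels
            err = old - q / levels
            if x + 1 < w:               out[y,     x + 1] += err * 7 / 16
            if y + 1 < h and x > 0:     out[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:               out[y + 1, x    ] += err * 5 / 16
            if y + 1 < h and x + 1 < w: out[y + 1, x + 1] += err * 1 / 16
    return out

# Horizontal gradient, like a banding test pattern, crushed to 2-bit.
grad = np.tile(np.linspace(0, 1, 512), (128, 1))
ordered  = ordered_dither(grad, bits=2)    # fixed repeating pattern
diffused = error_diffusion(grad, bits=2)   # noise-like grain, no fixed pattern
print(abs(ordered - grad).mean(), abs(diffused - grad).mean())

Ordered dithering leaves a fixed repeating pattern while error diffusion looks like random grain; at 8 or 10-bit both are far below anything the eye can pick out, which is the whole point of the test.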

Quote:
Originally Posted by The 8472 View Post
There's a new superresolution upscaler for still images: https://github.com/nagadomi/waifu2x/ based on http://arxiv.org/abs/1501.00092

There is also a webservice running that implementation; it has been trained on anime-style images and gives very good results on that type of material, especially on fine lines that are barely a pixel wide and fairly aliased. It also seems to introduce fewer fake details on flat surfaces than NNEDI3 does.
If you run it on real life images you'll get oil paintings.
Specializing in a specific kind of content seems to have a pretty big impact.

Source image
new upscaler w/ noise reduction
new upscaler w/o noise reduction
NNEDI3 32 w/ SuperRes
NNEDI3 64 w/o SuperRes
Quote:
Originally Posted by ryrynz View Post
So NNEDI3 blows waifu2x away in the most important area (line construction); check out the left-side hair clip (nearest-neighbour upscale? eww) and the hair outlines.
Sure, waifu2x used with a blur is cleaner and sharper, but it's totally overdone.
I agree with 8472. I like the look of waifu2x a lot better than NNEDI3.

Quote:
Originally Posted by XMonarchY View Post
I use DisplayPort. I guess it's just my monitor that does not accept 10bit. I wonder if there is a way to force it... I mean there is DEFINITELY dithering going on on my monitor. I can see it easily.
Someone already mentioned it, but forcing your system to output 10-bit does no good if your monitor/screen/TV is limited to 8-bit. As previously stated, it can make your image worse.
Perhaps your source video is the problem. If the source is a lower bit depth than the screen you're displaying it on, you're going to get a worse picture than you would if the source matched your screen's bit depth. A 10-bit source is the ideal case for 10-bit mode on a 10-bit screen. If you think you can see dithering with madVR set to 8 or 10-bit for your screen, do the bit-changing test I posted above: toggle through each setting from 1-bit all the way up to 10-bit in madVR and watch the dithering effects become less and less noticeable. Once you get to 5 or 6-bit you shouldn't be able to see the dithering pattern any longer. Still think you're seeing dithering? It's far more likely that you have a bad/noisy/grainy/garbage-filled source, or that your scaling choices are adding artifacts to the video.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.

Last edited by Anime Viewer; 19th May 2015 at 15:22. Reason: added reply to XMonarchY
Anime Viewer is offline   Reply With Quote
Old 19th May 2015, 15:13   #30202  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by Anime Viewer View Post
I agree with 8472. I like the look of waifu2x a lot better than NNEDI3.
There's only one good thing about waifu2x IMO, and that's its line detection code. Bumping NNEDI3 up to 256 neurons and adding a sharpener and denoiser of your choice would produce a superior image overall.

You really can't compare 64 neurons without any sharpening to an image that's undergone sharpening, with or without denoising. Of course some people are going to prefer an oversharpened image... I'm not one of them.

I've started a thread about it here.

Last edited by ryrynz; 19th May 2015 at 15:24.
ryrynz is offline   Reply With Quote
Old 19th May 2015, 15:19   #30203  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
Quote:
Originally Posted by ryrynz View Post
I wonder if shiandow could adapt NNEDI3 with waifu2x's improved neural guesswork.
As far as I can tell they're not using better "guesswork"; they're just using far, far more neurons. They also look at a larger region of the image to calculate a single pixel (hence why it's better at removing aliasing). I estimate it's at least 20x slower than NNEDI3 with 256 neurons, but even that is a very generous estimate; it could easily be 100x.
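
For a rough back-of-the-envelope sanity check on that, here is the kind of multiply-count arithmetic I mean; every number in it (a generic 7-layer 3x3 CNN loosely in the spirit of waifu2x, a 48-tap window for NNEDI3) is an assumption for illustration, not the real architecture of either:

Code:
# Rough multiply-adds per output pixel; all sizes below are illustrative assumptions.
cnn_channels = [1, 32, 32, 64, 64, 128, 128, 1]   # assumed channel counts per layer
kernel = 3 * 3                                    # assumed 3x3 convolutions
cnn_macs = sum(kernel * cin * cout
               for cin, cout in zip(cnn_channels, cnn_channels[1:]))

nnedi3_window  = 8 * 6      # assumed local input window per predicted pixel
nnedi3_neurons = 256
nnedi3_macs = nnedi3_window * nnedi3_neurons      # hidden layer only; output mixing ignored

print(f"CNN-style:  ~{cnn_macs:,} multiply-adds per pixel")
print(f"NNEDI3-256: ~{nnedi3_macs:,} multiply-adds per pixel")
print(f"ratio: ~{cnn_macs / nnedi3_macs:.0f}x")

Raw multiply counts ignore memory traffic, activation functions and how well each network maps onto GPU shaders, so treat this as an order-of-magnitude guess, not a benchmark.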
Shiandow is offline   Reply With Quote
Old 19th May 2015, 15:39   #30204  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by Shiandow View Post
I estimate it's at least 20x slower than NNEDI3 with 256 neurons, but even that is a very generous estimate; it could easily be 100x.
That pretty much ends the comparison discussion right there. It does make me wonder what NNEDI3 with 512 neurons would look like, though.
ryrynz is offline   Reply With Quote
Old 19th May 2015, 15:43   #30205  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
I succeeded in getting 10bit working, but my render and present queues only fill to 2 and 1.

Without "present a frame for every VSync" I had tons of presentation errors; with it, none.

Any idea how to fix the half-filled render and present queues?

Is watching movies in 10bit actually useful? What would it be good for, if not movies?
x7007 is offline   Reply With Quote
Old 19th May 2015, 16:28   #30206  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by XMonarchY View Post
I was right about 8bit + FRC NOT being the same as true 10bit. A true 10bit monitor would allow a 10bit setting in the nVidia CP. Using the 10bit setting in madVR is a bad idea for 8bit + FRC monitors.
You are absolutely WRONG about everything.

I have a Dell U2410, which is 8bit+FRC, and it can receive a 10bit signal just fine; in fact that is what I use now in Windows 7 by default.
I can select between 8 and 10bit in the nvidia control panel; I use DisplayPort.

Using the 10bit setting in madVR is a GREAT idea for 8bit + FRC monitors.
For the simple reason that FRC dithers at the refresh rate of the display, while madVR is tied to the frame rate of the video being played (until, of course, madshi decides to change that, if ever).
And because madVR does not dither values that land on exact 10bit integers while the 8bit+FRC display does, there is no double dithering and everything falls into place like magic.

Yes, if you have an 8bit+FRC or true 10bit display that accepts a 10bit signal, please use madVR in 10bit + dithering with FSE and D3D11; you should get a better/cleaner picture (although the difference is practically invisible).
Also keep Windows in 10bit so that the switch to FSE is faster.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 19th May 2015 at 16:39.
James Freeman is offline   Reply With Quote
Old 19th May 2015, 16:35   #30207  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,905
Quote:
Originally Posted by James Freeman View Post
You are absolutely WRONG about everything.

I have a Dell U2410, which is 8bit+FRC, and it can receive a 10bit signal just fine; in fact that is what I use now in Windows 7 by default.
I can select between 8 and 10bit in the nvidia control panel; I use DisplayPort.

Using the 10bit setting in madVR is a GREAT idea for 8bit + FRC monitors.
For the simple reason that FRC dithers at the refresh rate of the display, while madVR is tied to the frame rate of the video being played (until, of course, madshi decides to change that, if ever).
But if he can't output 10 bit to this screen, it does nothing positive.
huhn is offline   Reply With Quote
Old 19th May 2015, 16:38   #30208  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by huhn View Post
But if he can't output 10 bit to this screen, it does nothing positive.
Correct.
If the screen only accepts 8bit, as reported by the nvidia control panel, it may very well be that the screen is actually not 8bit+FRC but plain 8bit.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
James Freeman is offline   Reply With Quote
Old 19th May 2015, 16:40   #30209  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,905
Quote:
Originally Posted by James Freeman View Post
Correct.
If the screen only accepts 8bit, as reported by the nvidia control panel, it may very well be that the screen is actually not 8bit+FRC but plain 8bit.
He is using a 120 Hz screen, so it could be a limitation of DP 1.2; this might work with DVI.
But the Eizo is known to be 8 bit + FRC.
huhn is offline   Reply With Quote
Old 19th May 2015, 16:41   #30210  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by James Freeman View Post
Correct.
If the screen only accepts 8bit, as reported by the nvidia control panel, it may very well be that the screen is actually not 8bit+FRC but plain 8bit.
Screens also do some color processing, so it might be 8+FRC and still not accept 10-bit input. It's a gaming screen, after all.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 19th May 2015, 16:51   #30211  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
Quote:
Originally Posted by James Freeman View Post
Using the 10bit setting in madVR is a GREAT idea for 8bit + FRC monitors.
For the simple reason that FRC dithers at the refresh rate of the display, while madVR is tied to the frame rate of the video being played (until, of course, madshi decides to change that, if ever).
And because madVR does not dither values that land on exact 10bit integers while the 8bit+FRC display does, there is no double dithering and everything falls into place like magic.
I'm pretty sure MadVR dithers at the refresh rate of the display if you enable smooth motion. Also, if you output 10 bit then MadVR should dither from 16 bits to 10, which is then dithered from 10 bits to 8 by your monitor; this would be worse than dithering directly to 8 bits. Although it's unlikely that you'll be able to actually see any difference.
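
Here is a quick stand-alone simulation of that point (plain random dithering on a synthetic ramp, which says nothing about madVR's or the monitor's actual algorithms), comparing a direct dither to 8 bits against a dither to 10 bits followed by a second dither to 8:

Code:
import numpy as np

rng = np.random.default_rng(0)

def dither_quantize(x, bits):
    """Quantize values in [0,1] to 2**bits - 1 levels with random dithering."""
    levels = 2 ** bits - 1
    return np.floor(x * levels + rng.random(x.shape)) / levels

signal  = np.linspace(0, 1, 1_000_000)                      # smooth high-precision ramp
direct  = dither_quantize(signal, 8)                        # one-step dither to 8 bits
cascade = dither_quantize(dither_quantize(signal, 10), 8)   # 10 bits, then the display's 8

print("direct  RMS error:", np.sqrt(np.mean((direct  - signal) ** 2)))
print("cascade RMS error:", np.sqrt(np.mean((cascade - signal) ** 2)))

The cascaded version ends up with slightly higher RMS error than the direct one, which matches the "worse, but unlikely to be visible" conclusion.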
Shiandow is offline   Reply With Quote
Old 19th May 2015, 16:52   #30212  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by x7007 View Post
Is watching movies in 10bit actually useful? What would it be good for, if not movies?
How useful running in 10-bit is to you depends on a number of factors. 10-bit gives access to potentially more colors, i.e. finer shades, on screen. If you don't notice more colors, or more distinct shades of colors, in the videos you are watching, then it will not be useful to you. If your source is 8-bit instead of 10-bit, the result will be inferior to a 10-bit source displayed at 10-bit. And if your screen only supports up to 8-bit, then again you will not see the benefit of 10-bit.

If you're not noticing a difference between 8-bit and 10-bit mode, or you like taking screenshots of full-screen video (which doesn't work in exclusive mode), then you might as well stick with the 8-bit setting.
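
For what it's worth, the raw arithmetic behind "potentially more colors" (nothing madVR-specific):

Code:
# Levels per channel and total RGB combinations at 8 vs 10 bits per component.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels/channel, {levels ** 3:,} RGB combinations")
# 8-bit:  256 levels/channel, 16,777,216 RGB combinations
# 10-bit: 1024 levels/channel, 1,073,741,824 RGB combinations

Whether any of those extra shades actually reach your eyes still depends on the source, the renderer and the panel, as described above.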
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Anime Viewer is offline   Reply With Quote
Old 19th May 2015, 16:57   #30213  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by Shiandow View Post
I'm pretty sure MadVR dithers at the refresh rate of the display if you enable smooth motion. Also, if you output 10 bit then MadVR should dither from 16 bits to 10, which is then dithered from 10 bits to 8 by your monitor; this would be worse than dithering directly to 8 bits. Although it's unlikely that you'll be able to actually see any difference.
Good to know about smooth motion dithering at refresh speed, thanks.
So when smooth motion is enabled, the dithering done by madVR can be considered better quality than FRC's.

Doesn't madVR leave the steps that don't need dithering undithered?
Can we say the same for FRC?
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 19th May 2015 at 17:00.
James Freeman is offline   Reply With Quote
Old 19th May 2015, 17:16   #30214  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Technically the FRC dithering could be happening at a higher rate than the overall refresh rate, since it just has to toggle between two adjacent colors. One disadvantage of dithering on the monitor is that it probably doesn't take into account nonlinearities in the display - e.g. for 10.5 it'll probably just alternate evenly between 10 and 11, even if the correct ratio would be something like 25% to 75% (which madVR could account for with its linear light scaling). Admittedly, the difference is likely to be small.
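
A small sketch of what that correction would look like, assuming a plain gamma 2.2 response and the hypothetical 10.5 code from above (real panels and madVR's actual curves will differ):

Code:
# How long FRC should sit on each of two adjacent codes to hit a target in linear light,
# assuming an idealized gamma 2.2 display (illustration only).
GAMMA = 2.2

def to_linear(code, maxcode=255):
    return (code / maxcode) ** GAMMA

target = to_linear(10.5)                  # desired linear-light output for "code 10.5"
lo, hi = to_linear(10), to_linear(11)     # the two codes FRC can toggle between
duty_hi = (target - lo) / (hi - lo)       # fraction of frames spent on the higher code

print(f"time on code 11: {duty_hi:.1%}, time on code 10: {1 - duty_hi:.1%}")

With these assumptions the split comes out close to, but not exactly, 50/50, which fits the "difference is likely to be small" caveat.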
Ver Greeneyes is offline   Reply With Quote
Old 19th May 2015, 17:22   #30215  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,905
Quote:
Originally Posted by James Freeman View Post
Good to know about smooth motion dithering at refresh speed, thanks.
So when smooth motion is enabled, the dithering done by madVR can be considered better quality than FRC's.

Doesn't madVR leave the steps that don't need dithering undithered?
Can we say the same for FRC?
It doesn't dither at refresh-rate speed; it dithers at output speed, which could be 48 FPS at 60 Hz with a 24p source.
huhn is offline   Reply With Quote
Old 19th May 2015, 17:36   #30216  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by Ver Greeneyes View Post
Technically the FRC dithering could be happening at a higher rate than the overall refresh rate, since it just has to toggle between two adjacent colors.
Reading this article: http://robotics.ee.uwa.edu.au/eyejr/...nformation.pdf

Quote:
Frame Rate Control is achieved by turning pixels on and off over several frame periods. With sufficient frame refreshing frequency, our human eyes will average out the darkness of a pixel so that the individual pixel will show as gray.
Quote:
Besides, a sufficiently high frequency is necessary in order to achieve a smooth display.
Quote:
Originally Posted by TFTCentral
This works by combining four colour frames as a sequence in time, resulting in perceived mixture.
Man, I sure hope you are right and FRC uses at least four times the refresh rate of the screen to generate the shades.
If not, and FRC only runs at the refresh rate of the screen, it is by far the worst method and should be used strictly for static images.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 19th May 2015 at 17:59.
James Freeman is offline   Reply With Quote
Old 19th May 2015, 17:50   #30217  |  Link
tickled_pink
Registered User
 
Join Date: Dec 2011
Posts: 12
Regarding debanding - I can't say one is better than the other. Just when I think one is better, I switch to a different source and find that my preference has changed. My personal preference is "less is more", so I generally prefer the lowest setting. I'd rather see a bit of banding than a washed-out scene. My "vote" goes to the more efficient algo, which appears to be MadVR's (according to GPU-Z).
tickled_pink is offline   Reply With Quote
Old 19th May 2015, 18:18   #30218  |  Link
cvrkuth
Registered User
 
Join Date: Apr 2011
Posts: 54
PotPlayer freeze on pause

I use PotPlayer 1.6.54133 64-bit with madVR v0.88.8 (64-bit) on Windows 8.1 (64-bit).
I noticed that when I press pause the video freezes; when I press pause again, the video stays frozen but the sound continues playing.

This occurs only with 64-bit madVR in D3D11 FSE mode at 10bit (D3D9 FSE 8bit works fine).
__________________
Intel Core i7-4790 CPU @ 3.60GHz, RAM 32 GB Dual-Channel DDR3 @ 665MHz (9-9-9-24),
Panasonic TX-P42G20E, NVIDIA GeForce GTX 970, Win 10 Pro x64,
PotPlayer 1.7.16291 64-bit, madVR v0.92.17
cvrkuth is offline   Reply With Quote
Old 19th May 2015, 18:25   #30219  |  Link
Barnahadnagy
Registered User
 
Join Date: Apr 2014
Posts: 13
Debanding

After some tests over the previous days, I came to the conclusion that I have no conclusive opinion. MadVR's algo does a fine job in most cases, but there are some where Shiandow's is better. The same could be said for detail preservation, though MadVR's "high" setting felt a bit worse overall in this regard. Overall, Shiandow's algo matches MadVR's in its latest iteration (at the beginning I preferred MadVR's, but it has improved a lot - good job, Shiandow!). If anything, the "high" preset should be tested against Shiandow's more, but I'm afraid this is all I could do.
Barnahadnagy is offline   Reply With Quote
Old 19th May 2015, 18:39   #30220  |  Link
vivan
/人 ◕ ‿‿ ◕ 人\
 
Join Date: May 2011
Location: Russia
Posts: 643
Quote:
Originally Posted by James Freeman View Post
Man, I sure hope you are right and FRC uses at least four times the refresh rate of the screen to generate the shades.
If not, and FRC only runs at the refresh rate of the screen, it is by far the worst method and should be used strictly for static images.
I have a 6-bit + FRC IPS display; the dithering noise is easily noticeable, so I'm pretty sure it runs at 60 Hz. But I can't spot a pattern. Comparing it with a true 8-bit IPS (and madVR set to 6-bit), I would say the dithering is far from dynamic ordered - more like dynamic random.
And the banding there is terrible; 6-bit output in madVR is the only way to fight it.

But FRC running faster than the refresh rate might be true for those TV panels with inflated 200+ Hz ratings.
vivan is offline   Reply With Quote