Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 10th March 2016, 06:49   #36781  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by bcec View Post
is superxbr + sr4 preferable over nnedi3@64 + sr3 when doing 1080p->2160p upscaling?

I seem to have slight preference to the latter, but I am not sure if it is placebo, or if there are indeed noticeable difference making the latter a better option?
I prefer SuperRes (3) at a 2x scaling factor. I don't like an oversharpened image, but find I can improve the image further by adding crispen edges (1.0), as well. This will make higher SuperRes values look better.

super-xbr and NNEDI3 look similar with SuperRes, but super-xbr will enhance any existing ringing, while NNEDI3 will not. If you can't see ringing, or if it doesn't bother you, use super-xbr.
Old 10th March 2016, 08:10   #36782  |  Link
RyuzakiL
Registered User
 
Join Date: May 2015
Posts: 36
AMD Crimson 16.3 + Win10x64 + MadVR

Has anyone tried the latest Crimson driver 16.3 on Win10 x64 with madVR?

And does madVR work properly on Win10 x64, or is Win8.1 still the only version where madVR works perfectly?
Old 10th March 2016, 08:51   #36783  |  Link
ryrynz
Registered User
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by RyuzakiL View Post
Has anyone tried the latest Crimson driver 16.3 on Win10 x64 with madVR?

And does madVR work properly on Win10 x64, or is Win8.1 still the only version where madVR works perfectly?
Why not test it yourself?

Windows 10 works perfectly for me using a GTX 960 with latest everything.
Old 10th March 2016, 09:27   #36784  |  Link
rxracer
Registered User
 
Join Date: Jul 2015
Posts: 3
Sorry if this has been asked before. 1840 pages was daunting, and I couldn't find an answer searching.

I have 2 displays. If I set the viewing display to secondary, I get dropped frames and glitches. If I'm doing nothing on the PC, it drops a frame and produces a glitch every 7 seconds on the dot (I timed it; bizarre). If I start using the PC on the primary monitor, the video drops frames and glitches rapidly.

Is any of this a known issue? I've changed and disabled every single setting in madVR, to no avail. I use LAV Filters and Zoom Player. Thinking about it now, I'll try with MPC-HC. I hope it isn't the player because I really don't like any others.
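For what it's worth, a glitch on a fixed cadence like that often points to a small mismatch between the display refresh rate and the video frame rate: one frame must be repeated or dropped every 1/Δ seconds, where Δ is the difference in Hz. A rough sketch of the arithmetic (the rates here are hypothetical, not taken from this report):

```python
def glitch_interval_s(video_fps: float, refresh_hz: float) -> float:
    """Seconds between forced frame repeats/drops when the two clocks differ."""
    return 1.0 / abs(refresh_hz - video_fps)

# 23.976 fps content on a display actually running 24.000 Hz:
print(round(glitch_interval_s(23.976, 24.0), 1))  # -> 41.7 (one hiccup every ~42 s)

# A glitch every ~7 s would correspond to a mismatch of roughly:
print(round(1.0 / 7.0, 3))  # -> 0.143 Hz
```

madVR's OSD (Ctrl+J) shows the measured refresh rate and its estimate of how often a frame repeat or drop is due, which should confirm or rule this out.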
Old 10th March 2016, 10:08   #36785  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by ryrynz View Post
Why not test it yourself?

Windows 10 works perfectly for me using a GTX 960 with latest everything.
There are problems with deinterlacing on Windows 10.

Of course, this is not a madVR error.
Old 10th March 2016, 11:49   #36786  |  Link
retrue
Registered User
 
Join Date: Apr 2015
Posts: 12
I am using the latest development versions of MPC-HC and LAV Filters.
I updated yesterday from madVR 0.9010 to 0.9015. When I try to play videos I get sound but no image, neither in windowed mode nor in fullscreen (curiously, with the latest version of PotPlayer + madVR I get an image in fullscreen mode but not in windowed mode). This happens with videos at 1080p using the H.264 codec; it doesn't happen with videos at 480p with the same codec.
The problem appeared with version 0.9011. I have gone back to madVR 0.9010 and the problem has disappeared.
Old 10th March 2016, 13:36   #36787  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Are you using image doubling set to "always"?

You may just be running out of VRAM.
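As a rough sketch of why image doubling can exhaust VRAM: each queued frame at the doubled resolution is large, and the renderer keeps several queues of them. The numbers below are illustrative assumptions (16-bit, 4-channel frames), not measured madVR figures:

```python
def frame_mib(width: int, height: int,
              channels: int = 4, bytes_per_channel: int = 2) -> float:
    """Approximate size of one uncompressed 16-bit, 4-channel frame in MiB."""
    return width * height * channels * bytes_per_channel / 2**20

per_1080p = frame_mib(1920, 1080)   # ~15.8 MiB per source frame
per_2160p = frame_mib(3840, 2160)   # ~63.3 MiB per frame after doubling 1080p
queue_gib = 16 * per_2160p / 1024   # a 16-frame queue of doubled frames

# Roughly 1 GiB for one queue alone, before NNEDI3's own working buffers:
print(round(per_2160p, 1), round(queue_gib, 2))  # -> 63.3 0.99
```

With large GPU queues and high-neuron doubling, numbers like these add up quickly on a 2-4 GB card.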
Old 10th March 2016, 14:04   #36788  |  Link
retrue
Registered User
 
Join Date: Apr 2015
Posts: 12
Quote:
Originally Posted by huhn View Post
Are you using image doubling set to "always"?

You may just be running out of VRAM.
Image doubling: double luma (64 neurons) and double chroma (32 neurons), "always if upscaling is needed".
But with madVR 0.9010 I don't have problems.
Old 10th March 2016, 14:07   #36789  |  Link
adhara
Registered User
 
Join Date: Jan 2013
Posts: 41
Hi,

When doing a 1080p to 2160p upscale with madVR, I get video judder (every 15-20 seconds) on my UHD TV (2160p/60 Hz and HDCP 2.2).
The TV correctly reports the input signal as 2160p.
I tested several refresh rates: 23.976 Hz, 24 Hz, ... Same issue.

No dropped/delayed frames are reported by madVR. My graphics card is an Nvidia GTX 970.

How come?

When playing a 2160p video directly on the TV (from a USB stick, for example), there is no problem.
Old 10th March 2016, 14:34   #36790  |  Link
fedpul
Registered User
 
Join Date: Feb 2014
Posts: 94
Quote:
Originally Posted by fedpul View Post
Hi, I can confirm that when supersampling 720p, the difference between SSIM 1D and SSIM 2D is 2-3 ms. I have a GTX 970, but I can also see that when supersampling 1080p the change in rendering times is kind of abnormal: there is a jump of 30-40 ms between 1D and 2D, and the same happens with Jinc. I'm not sure if this is normal or not.

I'm also trying to detect what is causing black screens in 90.15. I'm almost sure that is NNEDI in Chroma Upscaling but I need to do some more testing.
Quoting myself, I have done more tests.

Hi madshi, I kept getting black screens with 90.15: with chroma NNEDI enabled and doubling disabled, with chroma Jinc and doubling NNEDI, and obviously with both using NNEDI. I still need to check without using NNEDI at all. I always use NNEDI 32 neurons for the tests. Black screens are really random; the first one occurs after more than 50 minutes of playback, so testing requires a lot of time.
Old 10th March 2016, 15:13   #36791  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Non-framepacked 3D output is broken in 0.90.15.

SBS only shows the right eye, TB only the bottom half, line interleaved is entirely black, and column interleaved behaves just like SBS (which seems odd in itself). On top of that, if I tick the swap-eyes checkbox, the output becomes entirely black all the time.
When I pause playback the image shows up properly, but it goes away again when I resume, at least for SBS and TB; interleaved seems to always stay dead.

On the same note, it would be nice if "auto" 3D mode would not switch the resolution if the OS doesn't even offer a stereo 3d mode in the control panel.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 10th March 2016 at 15:23.
Old 10th March 2016, 15:42   #36792  |  Link
RainyDog
Registered User
 
Join Date: May 2009
Posts: 184
Quote:
Originally Posted by aufkrawall View Post
Sometimes my GTX 980 gets stuck in the wrong powerstate when opening videos, it doesn't run with full clock then and drops lots of frames. I suspect this is a randomly triggered bug by NNEDI3 in the OpenCL driver, a bit annoying.
Has someone else observed anything similar? Happens with the current driver (364.51) and some previous ones as well. Win 10 x64.
Add MPC-HC, or whichever video player you use, to the 3D application list in the Nvidia control panel and set its power management mode to prefer maximum performance.
Old 10th March 2016, 15:53   #36793  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
I don't think that this would help since the use of OpenCL/CUDA forces the card into a specific powerstate anyway.
Old 10th March 2016, 17:44   #36794  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by markanini View Post
Enhance details: 1.0 is amazing for resolution
Kinda late to the party and I dunno what voodoo magic is behind "enhance details" but it really looks outstanding @0.7 with NEDI+2X SR@1+FRC+SSIM 2D 25% on SD
Old 10th March 2016, 17:45   #36795  |  Link
Werewolfy
Registered User
 
Join Date: Feb 2013
Posts: 137
Quote:
Originally Posted by madshi View Post
Are you using DXVA deinterlacing or forced film mode? Does the problem occur with both? And which queue is the first one which is getting into trouble (less than half full)?
I use forced film mode. It seems that the decoder queue and IVTC queue limit the other queues; they never fill completely. The moment I get dropped frames is when the render queue drops down.
I tested DXVA and it's not really better. The decoder queue goes higher than before, but I also get dropped frames when the render queue drops down.

Here are some videos of my tests; I hope they will be useful and make it easier for you to understand.

http://www.filedropper.com/testdeinterlacing1_1

(It's the first time I'm using this file hosting website; I don't know how good it is.)

Quote:
Originally Posted by madshi View Post
Would be great if the new NVidia drivers improved the situation there, of course!!
Unfortunately, on my side it still doesn't work as intended even with the new Nvidia drivers, whether glitch handling mode is enabled or not.

Quote:
Originally Posted by Uoppi View Post
Maybe try upping the GPU queue size even more?

I need to use a GPU queue of at least 20 to avoid dropped frames with interlaced NTSC DVDs. Any less gives me occasional and "unexpected" dropped frames. After extensive testing, 20-20-6 works for me perfectly.

(My setup is Win10, factory-overclocked GTX 960 and a dual core Ivy Bridge Pentium, using D3D11 windowed mode.)
A higher GPU queue doesn't help because it's limited by the decoder queue, which never fills. In my videos, you can see that I even used 128-24-15 without success.
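The observation above, that the slowest stage caps the whole chain no matter how large the downstream queues are, can be sketched with a toy model (the stage rates below are made-up numbers, not measurements):

```python
# Toy pipeline model: each stage can process at most `rate` frames per second.
# Queues between stages only absorb bursts; sustained throughput is capped by
# the slowest stage, so growing the GPU/render queues cannot fix a
# decoder-side bottleneck.
stage_rates_fps = {"decode": 20.0, "upload": 60.0, "render": 60.0}

sustained_fps = min(stage_rates_fps.values())

# A 24 fps source would then drop about 4 frames per second regardless of
# queue sizes:
drops_per_s = max(0.0, 24.0 - sustained_fps)
print(sustained_fps, drops_per_s)  # -> 20.0 4.0
```

This matches the symptom described: the queues downstream of the bottleneck never fill, while the render queue periodically starves.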
__________________
Windows 8.1 and 10 x64 - Intel Core i5-4670K (4.2 GHz) - 8 GB DDR3 - MSI Geforce GTX 1080 8 GB - Sony KD-55A1 - Denon AVR-X3600H

Last edited by Werewolfy; 10th March 2016 at 20:43.
Old 10th March 2016, 18:22   #36796  |  Link
Georgel
Visual Novel Dev.
 
Join Date: Nov 2015
Location: Bucharest
Posts: 200
Quote:
Originally Posted by leeperry View Post
Kinda late to the party and I dunno what voodoo magic is behind "enhance details" but it really looks outstanding @0.7 with NEDI+2X SR@1+FRC+SSIM 2D 25% on SD
It does work nicely with real life videos.

It does not work well with animated videos, introducing grain and what could be perceived as noise.
Old 10th March 2016, 21:03   #36797  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Quote:
Originally Posted by huhn View Post
There are problems with deinterlacing on Windows 10.

Of course, this is not a madVR error.
I was just about to say Windows 10 + GTX 960 has worked quite nicely for me too.

Are you referring to some DXVA deinterlacing quality issues? At least deinterlacing seems to work glitch-free to my eyes, so I'm just curious what problems you are referring to.
Old 10th March 2016, 22:49   #36798  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
Quote:
Originally Posted by ryrynz View Post
Only 2 ms difference between 1D and 2D? I think something's broken; you should maybe look into resetting your settings, as my GTX 960 has about 8 ms render times when downscaling 4K content and 16-18 ms when using 2D (with AR and LL).
It depends on how much downscaling is required, but yes, there is always some performance cost over 1D. However, specifically when supersampling, 2D adds 40 ms to rendering times over 1D. Using 'Everest' as a test:

Example 1 (1080p supersampling):
Chroma - jinc
Doubling - superxbr 100
Enhancements - crispen edges 1, enhance detail 1, superres 3
Downscaling - ssim 1d
=30ms rendering time

Example 2 (1080p supersampling):
Chroma - jinc
Doubling - superxbr 100
Enhancements - crispen edges 1, enhance detail 1, superres 3
Downscaling - ssim 2d
=70ms rendering time

As you can see, the only difference between the two examples is 1D vs 2D, with 2D adding 40 ms to rendering times. Surely that can't be right? Like I said, in every other downscaling (or doubling-then-downscaling) scenario, 2D is perfectly usable.
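For context, whether a given rendering time is usable comes down to the frame budget of 1000/fps milliseconds per frame. A quick check (the fps and render times are chosen for illustration):

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame before the next one is due."""
    return 1000.0 / fps

for fps in (23.976, 60.0):
    budget = frame_budget_ms(fps)
    for render_ms in (30.0, 70.0):
        verdict = "OK" if render_ms < budget else "drops frames"
        print(f"{fps} fps: budget {budget:.1f} ms, render {render_ms:.0f} ms -> {verdict}")

# 23.976 fps gives a ~41.7 ms budget: 30 ms fits, 70 ms drops frames.
# 60 fps gives only ~16.7 ms: both 30 ms and 70 ms drop frames.
```

So the 30 ms SSIM 1D chain fits comfortably inside a 24p frame budget, while the 70 ms SSIM 2D chain cannot keep up at any common frame rate.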
Old 10th March 2016, 23:13   #36799  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Uoppi View Post
I was just about to say Windows 10 + GTX 960 has worked quite nicely for me too.

Are you referring to some DXVA deinterlacing quality issues? At least deinterlacing seems to work glitch-free to my eyes, so I'm just curious what problems you are referring to.
AMD Crimson deinterlacing is simply broken; it is using a low-quality bob. I haven't tested the newest beta yet.

And Nvidia deinterlacing was kind of "broken" the last time I checked, with issues on telecine/hybrid content. The high-quality CUVID deinterlacing is not usable at all under Windows 10.
Old 10th March 2016, 23:14   #36800  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by aufkrawall View Post
I don't think that this would help since the use of OpenCL/CUDA forces the card into a specific powerstate anyway.
And that powerstate is the highest possible, so it can help here.