Old 5th March 2014, 22:17   #24321  |  Link
JonnyRedHed
Registered User
 
Join Date: Nov 2013
Location: Wales
Posts: 37
Buggy on current NV drivers, ah I see. Thank you both.

Downgrade to driver 327.23? Ah, that's not so easy when your main computer is your gaming rig as well as your video/HDMI player.

I'll have to wait for a fix then. Has Nvidia acknowledged the problem yet and given a time frame for fixing NNEDI3?
Old 5th March 2014, 22:19   #24322  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by JonnyRedHed View Post
Has Nvidia acknowledged the problem yet and given a time frame for fixing NNEDI3?
Acknowledged yes, but you will never get a time frame out of big companies like that.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 5th March 2014, 22:21   #24323  |  Link
JonnyRedHed
Registered User
 
Join Date: Nov 2013
Location: Wales
Posts: 37
I see, thank you.
Old 5th March 2014, 22:32   #24324  |  Link
MistahBonzai
Registered User
 
Join Date: Mar 2013
Posts: 101
Quote:
Originally Posted by James Freeman View Post
Try it and thank me later.
I updated the code, so be sure to copy the 1.4.1 version
Thanks - I had been using 1.3.10 (I also have 1.3.12, but it is flawed and crashes MPC-HC). 1.4.1 works just fine - I have it enabled as a "post resize shader". BTW: anyone expecting obvious razor-sharp results needs to look elsewhere - this shader is very subtle with the defaults when applied to good-quality 720p being upsized to 1080p.
Old 5th March 2014, 23:17   #24325  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
MadVR, LAV and XySubFilter updates all at once, it's like they plan these things
Old 5th March 2014, 23:54   #24326  |  Link
Q-the-STORM
Registered User
 
Join Date: Sep 2012
Posts: 174
any way to manually install an old nvidia driver on a new card?
Got a GTX 780 Ti, but it came out after 327.23, which means the driver is refusing to install...
Old 6th March 2014, 00:15   #24327  |  Link
MistahBonzai
Registered User
 
Join Date: Mar 2013
Posts: 101
Quote:
Originally Posted by QBhd View Post
.
.
.

I have tinkered with a calibration setting of pure power curve of 2.5 gamma and pure power curve 2.35 in the gamma correction... but I wasn't happy with the results. I have also tried other combos, and currently have both set at disabled (again not happy with the results). I don't have access to calibration gear, but thought maybe I could get better results if I understood what my TV's gamma is (pure pc, bt.709, etc. etc.) and what the target should be for gamma correction.

I have included screenshots of the two settings in question if there is any confusion about what I am talking about.

Thanks in advance,

QB
It's my understanding - and I might be wrong here - that if you are not actually using an external calibration file (yCMS or 3DLUT), you don't "disable the GPU gamma ramps". I have done extensive display-level correction using test patterns, but have not used yCMS or 3DLUT.

Under "Color & Gamma" I don't "enable gamma processing", because I haven't selected "disable the GPU gamma ramps" under "Calibration".

It seems to me that with both selected you may well be working one against the other... no?
Old 6th March 2014, 00:16   #24328  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Quote:
Originally Posted by Q-the-STORM View Post
any way to manually install an old nvidia driver on a new card?
Got a GTX 780 Ti, but it came out after 327.23, which means the driver is refusing to install...
NVIDIA drivers use a unified architecture, so any driver which at least supports Kepler (and maybe even prior) should work on a GTX 780 Ti with an inf mod.

Here I've uploaded a modified nv_disp.inf (English) & nv_dispi.inf (International) x64 which adds support for GTX 780 Ti to 327.23.

Overwrite the inf file in C:\NVIDIA\DisplayDriver\327.23\Win8_WinVista_Win7_64\[International|English]\Display.Driver\ and install as normal using the Setup.exe one directory up. Inf mods such as this strip WHQL certification, so standard rules apply for installing non-WHQL drivers on 64-bit Windows.
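For reference, here is a minimal sketch (in Python) of what an inf mod like this boils down to: add one hardware-ID line and one display-name line to the driver's inf. The device ID (DEV_100A for the GTX 780 Ti), the section header and the file encoding below are assumptions for illustration only; mirror an existing Kepler entry in your own nv_dispi.inf rather than trusting these literals.

Code:
# Illustrative sketch only, not the modified inf posted above. The section name,
# device ID (DEV_100A) and encoding are assumptions -- copy them from an entry
# that already exists in your nv_dispi.inf.
from pathlib import Path

inf_path = Path(r"C:\NVIDIA\DisplayDriver\327.23\Win8_WinVista_Win7_64"
                r"\International\Display.Driver\nv_dispi.inf")
text = inf_path.read_text(encoding="utf-8", errors="ignore")

# 1) Map the new hardware ID to an install section that an existing Kepler card uses.
device_line = "%NVIDIA_DEV.100A% = Section001, PCI\\VEN_10DE&DEV_100A"
text = text.replace("[NVIDIA_Devices.NTamd64.6.1]",
                    "[NVIDIA_Devices.NTamd64.6.1]\n" + device_line, 1)

# 2) Give the new hardware ID a friendly name in the [Strings] section.
text = text.replace("[Strings]",
                    '[Strings]\nNVIDIA_DEV.100A = "NVIDIA GeForce GTX 780 Ti"', 1)

inf_path.write_text(text, encoding="utf-8")
print("Patched", inf_path)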

Last edited by cyberbeing; 6th March 2014 at 00:23.
Old 6th March 2014, 00:19   #24329  |  Link
JustinChase
Registered User
 
Join Date: Jan 2007
Posts: 33
Quote:
Originally Posted by Boltron View Post
I really would love to use a sharpen filter but I use JRiver and can't find a way to do it. Madshi did mention once or twice that he could add a means to include a filter ...
Quote:
Originally Posted by madshi View Post
Maybe sooner or later, but not in the next build.
Perhaps nevcairiel can help with this, considering his new position with JRiver. Just thinking out loud.

Regardless, thanks Madshi for all that you do for us, and how well you manage all the various opinions and personalities that participate in this thread. I'm constantly amazed at your calmness and rationality!!

Keep up the good work!
Old 6th March 2014, 00:29   #24330  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by pirlouy View Post
Congratulations on dithering options design. You made it comprehensible for casual users.
Thanks!

Quote:
Originally Posted by James Freeman View Post
Bug #167 "dithering produced dithering noise on pure black areas" still exists.
This bug was only apparent when 3DLUT or yCMS were active, unlike the dithered whites (even without 3DLUT) with an ATI card, which is a different (real) bug altogether.
Bug #167 also occurred without 3DLUT/yCMS, but only with ordered dithering. That part is fixed.

Quote:
Originally Posted by 6233638 View Post
Does this build sync dithering to the video framerate now to prevent flickering? I didn't see it mentioned in the release notes.
Nope. That would noticeably increase rendering times, and the flickering only seems to occur with rather low bitdepth settings.

Quote:
Originally Posted by 6233638 View Post
I know that you're not taking feature requests, but would it be possible to have a keyboard shortcut toggle "disable GPU gamma ramps" ?
No, sorry. I'm afraid of side effects. The whole gamma ramp stuff is quite complicated.

Quote:
Originally Posted by flashmozzg View Post
Is DC done by dx9<->dx10/11 interop? If so, how high is the overhead?
Yes. The overhead is relatively small. Very very small compared to dx9<->opencl interop (with AMD GPUs).

Quote:
Originally Posted by annovif View Post
Madshi, can I ask why you prefer the 7xxx series over the new series?
Quote:
Originally Posted by Gagorian View Post
The 'new' cards, i.e. R7 250X, R7 260(X), R7 265, R9 270(X) and R9 280(X), are rebranded 7xxx series GPUs.
^

I only meant to say you should *not* get a 6xxx generation AMD GPU or older. Get a 7xxx generation GPU or newer.

Quote:
Originally Posted by Shiandow View Post
I just succeeded in recreating the 'blinking' effect by just replacing the video with noise. As far as I can tell it is a limitation of the screen. In my case I suspect my screen takes too long to get to full brightness so the frame that is shown longer gets brighter.
Yes, that could be an explanation. It's a weird effect, to be sure. Fortunately this is really only a problem with very low bitdepths, where the dithering dots differ a lot from one frame to the next. The higher the bitdepth, the fewer changes the screen has to make.

Quote:
Originally Posted by DigitalLF View Post
Well, the new projector is a Sony VPL-VW500ES 4K projector... so for me to get the best picture, what card should I use?
As I said, it all depends. There is no simple answer. I would have to ask you a hundred questions about which algorithms you want to use and which formats and framerates you need support for before I could give you a good answer. And I don't have the time for that.

Quote:
Originally Posted by DigitalLF View Post
most source material is 1080p24
Yes, most. But not all. Take any interlaced source: after deinterlacing it becomes 60p, and after scaling to 4K it becomes 4Kp60, which is very demanding on the GPU.
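As a rough back-of-the-envelope illustration (output pixel rates only, not madVR measurements), that jump is about a factor of ten in pixels per second:

Code:
# Illustrative output pixel rates; assumes scaling to 3840x2160 (UHD).
def pixel_rate(width, height, fps):
    return width * height * fps

p1080_24 = pixel_rate(1920, 1080, 24)  # ~50 Mpx/s
uhd_60 = pixel_rate(3840, 2160, 60)    # ~498 Mpx/s
print(f"1080p24: {p1080_24 / 1e6:.0f} Mpx/s")
print(f"4Kp60:   {uhd_60 / 1e6:.0f} Mpx/s ({uhd_60 / p1080_24:.1f}x more)")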

Quote:
Originally Posted by leeperry View Post
It's often better to be one generation late with computer parts, much higher bang/bucks most of the time
Maybe, but stay away from AMD 6xxx and older, because 7xxx has brought several important improvements (much faster copyback speeds, better compute performance, just to name 2).

Quote:
Originally Posted by jaju123 View Post
I have 2x AMD R9 290s and I can't even use NNEDI doubling. Too many dropped frames...
That's too broad a statement. With which source resolution? With which framerate? Only luma doubling or also chroma doubling? With how many neurons? Which algorithms for the other scaling operations? All of this will have an effect. FWIW, I can do luma doubling for 24p content just fine with my HD7770, which should be much slower than your GPU. But this is only with 24p content. For 60p luma doubling my GPU is too slow.

Quote:
Originally Posted by ryrynz View Post
MadVR, LAV and XySubFilter updates all at once, it's like they plan these things
The madVR and XySubFilter updates were actually planned to come together. The new LAV release was not coordinated, though.

Quote:
Originally Posted by JustinChase View Post
thanks Madshi for all that you do for us, and how well you manage all the various opinions and personalities that participate in this thread. I'm constantly amazed at your calmness and rationality!!
Thanks!
Old 6th March 2014, 00:54   #24331  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by madshi View Post
Nope. That would noticeably increase rendering times, and the flickering only seems to occur with rather low bitdepth settings.
Quite a few people mentioned that they saw this effect in 8-bit when the dither pattern was changing every frame - perhaps it would be a worthwhile "trade performance for quality" option, rather than using static dither patterns as a low quality workaround?
Quote:
Originally Posted by madshi View Post
No, sorry. I'm afraid of side effects. The whole gamma ramp stuff is quite complicated.
Well, I can toggle it on and off without any issues right now; the problem is that I have to bring up the config window to do it. I was just hoping for a quick way to change the setting. But I suppose it might introduce unforeseen problems that you don't want to deal with right now.
Old 6th March 2014, 01:17   #24332  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by 6233638 View Post
Quite a few people mentioned that they saw this effect in 8-bit when the dither pattern was changing every frame
As far as I understand what they saw in 8-bit was not flickering, but a higher noise floor. The higher noise floor is independent of whether the movie framerate matches the display refreshrate or not. I can see the higher noise floor even if framerate and refreshrate match. The flickering only occurs if there's a framerate/refreshrate mismatch. So these 2 things have different causes. Which means that changing the dither pattern for every VSync refresh at 8-bit would neither help reduce flickering (because flickering doesn't exist at 8-bit in the first place) nor would it reduce the noise floor. So it would serve no purpose.
Old 6th March 2014, 01:32   #24333  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
Oh, so the dither pattern didn't actually change every frame when smooth motion was enabled? Well, that solves the mystery of why enabling smooth motion didn't actually help to prevent 'blinking'. Anyway, I agree that this 'flickering' probably doesn't occur at 8 bits, and if it does, then I can't say for certain whether changing the dither pattern every vsync will actually help. Although it would make all frames equally wrong.
Old 6th March 2014, 01:37   #24334  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 162
Here's the result of the R9 290X:

This is the result of upscaling to 1440p with an H.265-encoded video; rendering time is 10-30% longer with Blu-ray, ouch! If you are upscaling to 4K, expect the time to be even longer. I also tried to "max out" madVR with 256 neurons in both luma and chroma doubling; as you can see, it answered my question of why madshi decided to stop at 256 and not go higher. :P

If graphics card performance doubles every two years, then in 5-6 years you can upscale to 1440p with 256 neurons without dropping frames; for 4K, maybe in 7-9 years? Maybe in eight years the high end will be 8K instead of 4K, who knows?
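That projection is just compound doubling. A quick sketch of the arithmetic (the 6x shortfall used below is a placeholder for illustration, not a measurement from this thread):

Code:
# Years until GPU performance grows by a given factor, assuming it doubles
# every two years. The required_speedup values are placeholders, not measurements.
import math

def years_needed(required_speedup, doubling_period_years=2.0):
    return doubling_period_years * math.log2(required_speedup)

print(f"{years_needed(6):.1f} years")         # ~5.2, if 1440p doubling misses the frame budget by ~6x today
print(f"{years_needed(6 * 2.25):.1f} years")  # ~7.5, 4K has ~2.25x the pixels of 1440p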

Last edited by seiyafan; 6th March 2014 at 02:01.
Old 6th March 2014, 02:30   #24335  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by madshi View Post
* fixed: #170: Overlay mode sometimes unnecessarily cleared GPU gamma ramps
Nice new version! Thanks a lot for this fix. And everything else too of course.
Old 6th March 2014, 02:40   #24336  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 162
Does "use OpenCL to process DXVA NV12 surfaces(Intel, AMD)" do anything? In my test it reduced rendering time from 38.36ms to 38.24ms.
Old 6th March 2014, 02:52   #24337  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Impending Doom
Old 6th March 2014, 03:16   #24338  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by seiyafan View Post
Just want to double check, in MPC under LAV video decoder, what's a good hardware decoder to use? Is it none?
There are quite a number of things to take into consideration when deciding whether to use a LAV hardware video decoder or choose None.

As has been mentioned before on this board, using LAV hardware acceleration can lead to faster rendering times.

Do you like or use Smooth Motion with madVR? During my testing of the different LAV video hardware decoder options, I found that I'd get dropped frames somewhere along the way (in some cases not until 6 or 7 minutes of video had played) if I had Smooth Motion enabled in madVR and any of the hardware decoding options selected in LAV. I've concluded that my video playback is smooth enough without Smooth Motion enabled on my system, and that I'm not a fan of the blur effect that Smooth Motion adds to some moving objects when it's enabled. If you like using Smooth Motion, then you should probably set the LAV hardware decoder to None.

The next thing to consider is the power of your CPU and GPU. The more powerful your GPU, the more benefit you may see from selecting a hardware decoder. If you have a strong GPU and a weak CPU, then using hardware acceleration may be the better choice. On the other hand, if you have a strong CPU but a weak GPU, you may be better off with None selected, since software decoding will be used and more of the load will be passed to the CPU.

Some Nvidia users seem to prefer one of the DXVA2 options over NVIDIA CUVID because they say CUVID keeps the GPU running at full power for the whole time it is in use, while DXVA2 lets GPU usage fluctuate. If you're using a laptop/notebook and viewing your videos on battery, then you'll probably want to run one of the DXVA options (or None) instead of CUVID. If you have a noisy fan or cooling system, then likewise you may want to choose DXVA2 or None. If you're watching your videos while plugged into an electrical outlet and you have a quiet fan/cooling system, then why not let the GPU kick up to full power.

In my testing, watching one video with CUVID set as the LAV Video hardware decoder, I had a consistent 5.51ms rendering and 1.51ms present time. With the DXVA2 and None options I saw it steadily creep up from 5.56ms to an ever-increasing number as playback continued (22+ms two and a half minutes in with DXVA2, and 16+ms at the same point with None selected ... a bug in the readings perhaps...). If your system uses an AMD or Intel GPU, then you don't even need to consider CUVID.

Like most people on this board you'll probably want to experiment and see what gives you good speed while also providing stability and freedom from glitches during video playback. What works best on one system may not be the best settings on others.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Old 6th March 2014, 03:22   #24339  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
Just wondering if maybe we should have 'Delay playback until render queues are full' enabled by default; not having it enabled shows multiple dropped frames on playback of the next file in the playlist for me.
Old 6th March 2014, 04:00   #24340  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 162
Are most of you using MPC-HC or BE? I just tried BE and couldn't figure out how to get MadVR to work.