Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Go Back   Doom9's Forum > Hardware & Software > Software players

Old 28th November 2016, 14:44   #40881  |  Link
paranoya7
Registered User
 
Join Date: Mar 2013
Posts: 5
v0.91.3 and v0.91.2 can't find any scaling algorithms now: "These settings are unknown to this version of the settings browser." What is the problem, and what should I do? Going back to v0.91.1 brings the settings back.
Old 28th November 2016, 14:52   #40882  |  Link
FreeFall
Registered User
 
Join Date: May 2009
Posts: 72
paranoya7,

Try re-installing madVR and then run the restore default settings.bat file and see if that fixes the problem.
Old 28th November 2016, 14:54   #40883  |  Link
burfadel
Registered User
 
Join Date: Aug 2006
Posts: 2,229
Quote:
Originally Posted by HillieSan View Post
I have only D3D11 presentation and V-sync enabled. You can also enable exclusive full screen mode. I disabled all 'quality for performance' options (may not be necessary). I use DXVA copy-back or none in LAV Video. DXVA native was buggy.

I had instability problems after updating madVR, and I had to run 'install.bat' (in admin mode) once after every update of madVR.
Really? Sounds like in some respects yours is as bad as mine. My old R9-280X can handle these settings and NGU, the RX 480 definitely should! I do tweak it for quality that I like.

You definitely shouldn't need anything enabled under the performance options, and you should be able to use NGU, DX11 dither, image enhancements such as thin edges, enhance detail, adaptive sharpen, anti-ringing, add grain (under upscaling refinement), and smooth motion on the 480, since a 280X can handle it fine!

It's not so much the performance now (massively improved, but still not ideal in v0.91.3); it's the instability. Everything else works fine.
Old 28th November 2016, 15:16   #40884  |  Link
jkauff
Registered User
 
Join Date: Oct 2012
Location: Akron, OH
Posts: 491
@madshi, I've been testing 0.91.3 this morning, and just like with 0.91.1, I can use higher settings with D3D9 than with D3D11 enabled.

GPU is GTX 1060 6GB running on Win 10 build 1607.

Doesn't bother me to use D3D9, but I'm curious why there's a performance difference.


EDIT: I'm a UX Designer, and I think the changes you've made to the Settings UI are very good and quite well thought out. If you wanted, you could even add an "Easy" mode for beginners, with an Image Enhancement page with only Low, Medium, High, and Very High choices. The defaults are that good (assuming use of NGU).

Last edited by jkauff; 28th November 2016 at 15:41.
Old 28th November 2016, 15:25   #40885  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by flossy_cake View Post
I can repeatably witness it smooth with my own eyes. I cannot deny my own senses otherwise I'd have to be radically sceptical about everything. Do you believe it when people witness smooth motion on the variable gsync framerate demos such as the windmill/pendulum demo or do you think they are all just stupid and just can't detect judder? Because that is one hell of a conspiracy if you believe it. Gsync is for people who are allergic to judder, it's the complete opposite of everything you are making it out to be. It's the solution, not the problem
You are once again and repeatedly ignoring the fact that those demos are rendered in real time; they are not recorded videos.
Videos do not work like real-time rendered 3D scenes, which is the entire point several of us have been trying to get across to you for several posts.

If a video was recorded at 24p, then its frames are exactly 41.67ms apart. Exactly that. No other interval is acceptable. Not 40ms, and not 43ms, and certainly not 35 or 50ms.
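That interval arithmetic is easy to check. A tiny Python sketch (purely illustrative, not madVR code; the function name is mine) computes the only acceptable frame spacing for common video rates:

```python
# Each frame of a fixed-rate video must be displayed exactly 1/fps
# seconds after the previous one; any other interval shows as judder.
def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (23.976, 24.0, 25.0, 29.97):
    print(f"{fps:>7} fps -> {frame_interval_ms(fps):.2f} ms per frame")
```

For 24p this gives 41.67 ms, the figure quoted above.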

G-SYNC and FreeSync are awesome for 3D rendering in games and the like; the increase in smoothness is extremely noticeable. I have such a screen myself. However, video playback is not a 3D game - that is the entire and only point everyone has been trying to make.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 28th November 2016 at 15:30.
Old 28th November 2016, 15:38   #40886  |  Link
Backflash
Registered User
 
Join Date: Jan 2016
Posts: 52
Quote:
Originally Posted by leeperry View Post
Well, I'd like to use Jinc AR for chroma if I have chroma NGU disabled and prefer to beef up luma NGU.....Not possible anymore either? But you said that NGU-low is just as fast as Jinc so I'll have a go at all this ASAP
Chroma settings are under chroma upscaling, and you can set Jinc there, no problem.
The chroma option under image upscaling is for chroma doubling.

I will say this again, it took me a while to understand as well.

The only thing that changed is the doubling settings; they're simply condensed into one menu now, and they haven't changed that much - you still have the same options there as before.
LL for SSIM is one of the two things you can't set for double-resolution downscaling.
The other is super-xbr in doubling. No idea why we would need super-xbr for doubling anymore with the new NGU low, though; maybe I haven't tested enough SD sources.
There is still a lot of confusion about how to adjust other settings for NGU, because it conflicts with a lot of refinements, and people think it's worse than the established algorithms when it's at the very least equal PQ-wise if set up properly.
Old 28th November 2016, 15:42   #40887  |  Link
pose
Registered User
 
Join Date: Jul 2014
Posts: 69
madshi, some really needed changes with the new build! I like the new interface a lot! NGU works smoother on my RX470 now.
Old 28th November 2016, 15:50   #40888  |  Link
MariaX9
Registered User
 
Join Date: May 2016
Posts: 27
I also like the new GUI way more.
Old 28th November 2016, 15:50   #40889  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Uff, it's impossible now to do any kind of NGU quadrupling with 2x the same NGU settings.
I can't test quality like this; this is really bad.
Old 28th November 2016, 16:02   #40890  |  Link
Backflash
Registered User
 
Join Date: Jan 2016
Posts: 52
Quote:
Originally Posted by aufkrawall View Post
Uff, it's impossible now to do any kind of NGU quadrupling with 2x the same NGU settings.
I can't test quality like this; this is really bad.
Wait, it didn't work in previous builds anyway because it's not finished, as far as I remember, and you can still do that for NNEDI3.
Old 28th November 2016, 16:04   #40891  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 605
Quote:
Originally Posted by nevcairiel View Post
G-SYNC and FreeSync are awesome for 3D rendering in games and the like, the increase in smoothness is extremely noticeable. I have a screen myself. However, video playback is not a 3d game - that is the entire and only point everyone has been trying to make.
I see no reason why it should make a difference; either way we are just presenting a final 2D raster at a certain time interval.

I'm out of this thread because it's giving me cancer. Bye.
Old 28th November 2016, 16:06   #40892  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by Backflash View Post
Wait, it didn't work in previous builds anyway because it's not finished, as far as I remember, and you can still do that for NNEDI3.
It worked in previous builds.
I was talking about the classic quadruple (aka 2x double), not the new one-step quadruple madshi is currently working on for NGU.

Last edited by aufkrawall; 28th November 2016 at 16:08.
Old 28th November 2016, 16:15   #40893  |  Link
plasma
Registered User
 
Join Date: Nov 2016
Posts: 15
I am in favor of the simplifications to the GUI. My account may be new, but I have been using madVR for more than a year.
Old 28th November 2016, 16:19   #40894  |  Link
plasma
Registered User
 
Join Date: Nov 2016
Posts: 15
Quote:
Originally Posted by flossy_cake View Post
I see no reason why it should make a difference, both ways we are just presenting a final 2D raster at a certain time interval.

I'm out of this thread because it's giving me cancer. Bye.
The cognitive dissonance is astounding. How dare you use gaming hardware such as FreeSync monitors; madVR is for HTPCs. But requiring high-end gaming graphics cards is totally fine.
Old 28th November 2016, 16:21   #40895  |  Link
Betroz
Is this for real?
 
Betroz's Avatar
 
Join Date: Mar 2016
Location: Norway
Posts: 168
Newest madVR settings

@ madshi

I have tried the newest madVR version, and I miss the ability to set the upscaling settings like before. With the max available NGU settings, it doesn't fully utilize my GTX 1080 card anymore. These are my settings (with chroma upscaling set to NGU Very High):



Here is an OSD screenshot of a 480p, 29 fps clip:



Max 8.34ms rendering... I could have used much higher settings, but with the newest madVR, that is not possible.
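For reference, the headroom implied by those OSD numbers is simple arithmetic: rendering just has to finish within one frame interval. A quick sketch (function name is mine, purely illustrative):

```python
# Spare GPU time per frame = frame interval minus measured rendering time.
def headroom_ms(fps: float, render_ms: float) -> float:
    return 1000.0 / fps - render_ms

# The 480p clip above: 29 fps, 8.34 ms max rendering time.
print(f"{headroom_ms(29.0, 8.34):.2f} ms unused per frame")
```

Roughly 26 ms of unused budget per frame, which is why higher settings would still fit.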

Here is an OSD screenshot of a 720p, 23 fps clip:



Same here. Am I missing something? My image quality was better with madVR v0.91.1 and my (higher) settings there.
__________________
My HTPC : i9 10900K | nVidia RTX 4070 Super | TV : Samsung 75Q9FN QLED
Old 28th November 2016, 16:30   #40896  |  Link
pose
Registered User
 
Join Date: Jul 2014
Posts: 69
Betroz, you are an example of why the changes were made. Smh...
Old 28th November 2016, 16:32   #40897  |  Link
Betroz
Is this for real?
 
Betroz's Avatar
 
Join Date: Mar 2016
Location: Norway
Posts: 168
Quote:
Originally Posted by pose View Post
Betroz, you are an example of why the changes were made. Smh...
Uhhhh explain please...
__________________
My HTPC : i9 10900K | nVidia RTX 4070 Super | TV : Samsung 75Q9FN QLED
Old 28th November 2016, 16:35   #40898  |  Link
fedpul
Registered User
 
Join Date: Feb 2014
Posts: 94
Hi, I want to report a bug: I can't select chroma quality Very High when using luma NGU High. (I was just testing different profiles and discovered it.) It looks almost the same as automatic.

I also would like to say that I really like the new GUI; I find it more intuitive than before and easier to use for noobs or starters. But as usual, you need to know what you are doing!

Last edited by fedpul; 28th November 2016 at 16:37. Reason: Commenting about the new GUI.
Old 28th November 2016, 16:43   #40899  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by flossy_cake View Post
Apparently the Windows high precision timer uses a dedicated hardware clock called HPET.
https://en.wikipedia.org/wiki/High_P...on_Event_Timer
You may need to enable it at the Windows command line with the "bcdedit /set useplatformclock true" command.
That's not enough, though. You don't just need a hardware timer, you also need a solution that will quickly take the appropriate action when that timer fires (in this case, triggering a screen refresh). HPET will not do that for you, as it knows nothing about video. Sure, you could use HPET to wake up a software thread that does the refresh (using a present call, I presume), but then you're at the mercy of Windows thread scheduling delays, which translates to jitter. HPET is great for measuring time, but it's not very helpful for triggering actions, because it doesn't fix the thread scheduling problem (and other sources of delays such as in the GPU driver).

This issue does not apply to traditional fixed VSync because the GPU is in charge of refreshing the screen on every VSync interval, and that refresh is both timed and acted upon entirely in hardware by the GPU output circuitry itself, with no software components involved. This guarantees that screen refreshes will occur on time and with negligible jitter.
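The thread-scheduling jitter described above is easy to demonstrate yourself. This little Python sketch (my own illustration, nothing to do with madVR internals) asks the OS to wake a thread once per 24p frame interval and records how late each wakeup actually arrives; that lateness is exactly the jitter a software-timed "refresh" would suffer:

```python
# Measure OS wakeup lateness against a fixed 24p frame clock.
import time

FRAME = 1.0 / 24.0                      # one 24p frame interval, in seconds
lateness_ms = []
deadline = time.perf_counter()
for _ in range(24):                     # one second's worth of frames
    deadline += FRAME
    time.sleep(max(0.0, deadline - time.perf_counter()))
    lateness_ms.append((time.perf_counter() - deadline) * 1000.0)

print(f"worst wakeup lateness: {max(lateness_ms):.3f} ms")
```

On a typical desktop Windows or Linux box the worst case is easily a millisecond or more, which a hardware-timed VSync never exhibits.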

Quote:
Originally Posted by plasma View Post
The cognitive dissonance is astounding. How dare you use gaming hw such as freesync monitors. Madvr is for HTPC. But requiring high end gaming graphics cards is totally fine
Again, G-Sync/FreeSync would be great for video playback in theory. In practice however, it is not viable for video playback unless GPU manufacturers provide an API for presenting frames in advance with hardware-enforced timestamps in G-Sync/Freesync mode (see madshi's FAQ). This is not about a lack of interest, it's about technical feasibility.

Also, not everyone uses madVR with "high end graphics cards". There are people, such as me, who use madVR mostly for its reliable, smooth playback and features such as dithering, 3DLUT support and Smooth Motion, and have little interest in extreme power-hungry upscaling algorithms. Just because most of the discussion in this thread is about upscaling does not mean everyone cares about upscaling.
Old 28th November 2016, 17:05   #40900  |  Link
Crimson Wolf
Registered User
 
Join Date: Dec 2014
Posts: 51
@flossy_cake

It seems perfectly clear to me why madVR can't use adaptive sync tech (FreeSync/G-SYNC). You seem to assume madVR can generate in-between frames in a video file, which is impossible. It's like setting pre-rendered frames to infinite. If you record a ball flying through the air on video and examine the frames individually, you see it in a certain position in each frame as it moves across the screen. If a frame is delayed by half a frame interval, you can't generate where the ball should be at that half-frame delay. For a game, it's easy, since it "knows" where the ball should be at ANY TIME.
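The ball analogy can be made concrete with a toy sketch (all names and numbers here are hypothetical): a game evaluates a continuous motion model at any time t, while a recorded video only has samples at fixed timestamps, so a half-frame-late refresh has no correct frame to show.

```python
FPS = 24

def game_ball_x(t: float) -> float:
    # A game's continuous model: position is computable for ANY time t.
    return 100.0 * t                    # hypothetical 100 px/s motion

# A recorded video only has samples at n/FPS -- nothing in between.
video = {round(n / FPS, 6): game_ball_x(n / FPS) for n in range(48)}

t_late = 1.5 / FPS                      # a refresh arriving half a frame late
print(game_ball_x(t_late))              # the game knows the position
print(video.get(round(t_late, 6)))      # the video has no such frame: None
```

The dictionary lookup failing at t = 1.5/FPS is the whole argument in one line: the in-between frame simply does not exist in the recording.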

Unless everyone here who has been trying to explain this to you is wrong about the way adaptive sync works, there's no simple and easy way to implement support in madVR.

EDIT: "bcdedit /set useplatformclock true" is bad advice. Just leave it to Windows to choose which timer it should use.

Last edited by Crimson Wolf; 28th November 2016 at 17:08.

Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling


Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2024, vBulletin Solutions Inc.