Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
9th January 2015, 14:35 | #27983 | Link |
Registered User
Join Date: Jul 2008
Posts: 54
|
Is it just me, or is there a ~2s lag whenever I open a 1080p clip with a bitrate higher than 3 Mbps, like this? https://www.youtube.com/watch?v=w_H2T6ZaJfo
I show my madVR settings at the end of the clip. Is the problem with my PC spec or HDD, or is it just how madVR behaves? |
9th January 2015, 16:26 | #27985 | Link |
Registered User
Join Date: Jul 2008
Posts: 54
|
EVR CP improved it somewhat, but it still feels a bit slow. And yes, I am using an AMD card
https://www.youtube.com/watch?v=9qvweN_uFmU |
9th January 2015, 16:49 | #27987 | Link |
Registered User
Join Date: Jul 2008
Posts: 54
|
How do I disable DXVA? Did you see that there is noticeable jagging/lag/stutter at the beginning with madVR? I also just got a BSOD opening that video the first time, which I haven't had in like a year.
I also just checked by playing it from my SSD drive; the jagging/lag/stutter is still present. |
9th January 2015, 18:33 | #27988 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
this should do the trick: http://abload.de/img/dxva1mujm.png
I don't get problems opening a BD at 37+ Mbit, but I have more than decent hardware. |
9th January 2015, 19:12 | #27989 | Link | |
Registered User
Join Date: Dec 2014
Posts: 34
|
Quote:
Also, I have some weird behaviour I just noticed: I have this in madVR: http://i58.tinypic.com/dqr89e.jpg. When I start a .wtv with 25 fps content, the display runs at 59.9 Hz; then I press Alt+Tab to go to Windows Media Center (Win 8.1), Alt+Tab again back to MPC-HC, and this time the display is at 50 Hz and the image is more saturated. What should I do? |
|
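The refresh-rate switching described in the quote above is driven by a display-mode switcher trying to match the display refresh to the content frame rate. A minimal sketch of that matching logic (the function and its tie-breaking rule are my own illustration, not madVR's actual code):

```python
def pick_refresh(content_fps, allowed_hz):
    """Pick the refresh rate that divides most evenly by the content
    frame rate, so each source frame is shown a whole number of times."""
    def judder(hz):
        repeats = hz / content_fps
        # distance from a whole-number repeat count = visible judder
        return abs(repeats - round(repeats))
    return min(allowed_hz, key=judder)

# 25 fps material should land on 50 Hz, not on 59.94/60 Hz
print(pick_refresh(25.0, [50.0, 59.94, 60.0]))  # -> 50.0
```

With a mode list that includes both 50 Hz and 59.9 Hz, 25 fps content should always select 50 Hz; the 59.9 Hz result in the quote suggests the switch happens at the wrong moment, not that the matching itself is wrong.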
9th January 2015, 20:21 | #27992 | Link | |
Registered User
Join Date: Dec 2008
Posts: 496
|
Quote:
What would probably work is an offline tool that replicates madVR's functionality and settings 100% and calculates an "automatic_profile.mvr", which the actual madVR renderer could then read, giving you perfect settings for your current config. If you change the graphics card, output resolution or other stuff, you would just run it again, so you would have different automatic profiles. "Advanced mode" would still let you access all power features manually, like it is right now. That would certainly make some users happy; others would still prefer manual access at all times, so I'm not sure the time invested in this would be worth it. I guess most people that use madVR are fairly advanced users and know what they are doing, but it would certainly make your first steps easier. |
|
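The offline profiling tool proposed above is hypothetical; no such tool or documented `.mvr` profile format exists. Purely to illustrate the idea, here is a sketch that maps a crude GPU benchmark score to invented setting names and writes them out (every name and threshold below is made up):

```python
def build_profile(gpu_score):
    """Map a crude GPU benchmark score to hypothetical quality settings.
    The scaler names mimic madVR's options; the thresholds are invented."""
    if gpu_score >= 80:
        chroma, image = "NNEDI3", "Jinc"
    elif gpu_score >= 40:
        chroma, image = "Jinc", "Lanczos"
    else:
        chroma, image = "Bicubic", "Bilinear"
    return {"chromaUpscaling": chroma, "imageUpscaling": image}

def write_profile(path, settings):
    """Serialize the settings as simple key=value lines."""
    with open(path, "w") as f:
        for key, value in settings.items():
            f.write(f"{key}={value}\n")

# e.g. write_profile("automatic_profile.mvr", build_profile(gpu_score=55))
```

Re-running the benchmark after a hardware change would naturally produce a different profile, which is exactly the workflow the quote describes.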
9th January 2015, 20:47 | #27993 | Link | |
Registered User
Join Date: Dec 2008
Posts: 496
|
Quote:
However: it seems that not only is FreeSync more flexible, but according to various CES articles, the first FreeSync monitors are limited to a minimum of 40 Hz, or else they will flicker. I'm not sure if that's because the pixel matrix / panel driver firmware just isn't able to go lower, but that's the only info I came across; more in-depth coverage still seems rare. At least theoretically, 9-240 Hz seems perfect for our needs. |
|
9th January 2015, 21:03 | #27994 | Link |
Registered User
Join Date: Jan 2008
Posts: 589
|
The problem madshi was referring to, I believe, is the lack of explicit G/FreeSync support in APIs (i.e. Direct3D). Frame presentation APIs were never designed with variable refresh rate in mind, and it's not clear how they can be used to achieve precise and arbitrary (custom) timing of frame presentation. It seems the only known way to do this right now is to "pretend to be slow", i.e. behave like a game that can't keep up, but madshi seems to be reluctant to go this way for various (valid) technical reasons.
For that reason, I see the fact that recent Windows versions seem to explicitly use G/FreeSync for video playback as a good sign, because it might mean that such APIs (or some form of clarifying documentation) are about to be released. Last edited by e-t172; 9th January 2015 at 21:09. |
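The "pretend to be slow" approach can be pictured as a presentation loop that ignores the fixed vsync grid and simply presents each frame at its ideal timestamp, letting a variable-refresh display start a scanout whenever a frame arrives. A timestamps-only simulation of that scheduling (no real Direct3D calls; this is a sketch of the timing idea, not madVR's implementation):

```python
def schedule_presents(content_fps, n_frames, start=0.0):
    """Ideal presentation timestamps for n_frames of content.
    With VRR, presenting exactly at these times means each scanout
    begins when the frame is ready, with no fixed refresh grid."""
    frame_time = 1.0 / content_fps
    return [start + i * frame_time for i in range(n_frames)]

stamps = schedule_presents(23.976, 4)
# every gap is exactly one source frame time (~41.7 ms),
# independent of any fixed display refresh rate
gaps = [b - a for a, b in zip(stamps, stamps[1:])]
```

The hard part, as the post says, is that Direct3D gives no direct way to say "scan this frame out at time T"; sleeping until T and then presenting immediately only approximates it.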
9th January 2015, 22:14 | #27995 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
|
Quote:
The reason for the minimum refresh rate is that the screen doesn't actually redraw a frame when no new frame comes in, and the old frame then "decays"; depending on the screen, this allows a minimum of 30 to 40 fps. Of course, a theoretical upper limit of 240 fps is also just a paper stat, since we have no screens for that, and at higher resolutions we're also bandwidth-limited on DP. GSYNC has its own advantages over FreeSync that AMD is of course not going to mention. Not necessarily important for video playback, but AMD doesn't have a framerate limiter when FreeSync is active, which causes tearing when your 3D application/game renders faster than the maximum screen refresh. Apparently this is even a design decision on AMD's side and not just a missing feature. And the fact that it's limited to something like 2 AMD GPUs and no released screens... well, we'll see. NVIDIA could always implement FreeSync if the market goes that way, but AMD will never implement GSYNC, so NVIDIA is in a somewhat better position here.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders Last edited by nevcairiel; 9th January 2015 at 22:20. |
|
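The minimum-refresh limitation described above can, at least in principle, be worked around by repeating each frame until the effective refresh rate lands inside the panel's VRR window — the approach later shipped in drivers as "low framerate compensation". A sketch of the arithmetic (the repeat strategy here is illustrative; neither madVR nor AMD had documented this at the time):

```python
def vrr_repeat_count(content_fps, min_hz, max_hz):
    """Smallest whole-number frame repeat that lifts the effective
    refresh rate into the panel's supported VRR window, or None if
    no multiple of the content rate fits the window."""
    n = 1
    while content_fps * n <= max_hz:
        if content_fps * n >= min_hz:
            return n
        n += 1
    return None

# 24 fps on a 40-240 Hz panel: show each frame twice -> 48 Hz refresh
print(vrr_repeat_count(24.0, 40.0, 240.0))  # -> 2
```

Because each source frame is still shown for exactly its intended duration (just split across two or more scanouts), cadence stays perfect; the panel simply never has to hold a frame long enough for it to decay.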
10th January 2015, 07:23 | #27996 | Link |
Registered User
Join Date: Apr 2009
Posts: 1,019
|
While it has yet to be confirmed, I've seen talk that Adaptive Sync/FreeSync may have two frames of latency just like V-Sync, while G-Sync is polled at 1000Hz for a fixed overhead of 1ms.
And with G-Sync being separate hardware, it guarantees that you have the lowest latency possible. Being able to work with the display's existing processing seems detrimental from a gaming perspective. But Adaptive Sync actually has a chance (though slim) of ending up in televisions rather than just being a thing for PC gaming monitors. I'd prefer that NVIDIA gave us the choice of using G-Sync or Adaptive Sync displays. If G-Sync is better, surely that is what users will buy. If Adaptive Sync is equally good at a lower price, that's good for everyone. It's not like NVIDIA is in the business of selling displays. Last edited by 6233638; 10th January 2015 at 07:29. |
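The latency figures quoted above are easy to put side by side. A back-of-the-envelope calculation (the two-frame V-Sync figure and the 1000 Hz polling figure come from the discussion itself, not from measurements):

```python
def vsync_latency_ms(refresh_hz, frames_buffered=2):
    """Worst-case added latency of classic double-buffered V-Sync:
    up to frames_buffered whole refresh periods."""
    return frames_buffered * 1000.0 / refresh_hz

def polling_latency_ms(poll_hz=1000):
    """Worst-case wait until the next poll of the G-Sync module."""
    return 1000.0 / poll_hz

print(vsync_latency_ms(60))      # ~33.3 ms worst case at 60 Hz
print(polling_latency_ms(1000))  # 1.0 ms worst case at 1000 Hz polling
```

If the rumoured two-frame figure for Adaptive Sync were true, the gap versus a 1 ms polling overhead would indeed matter for gaming far more than for video playback.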
10th January 2015, 13:27 | #27997 | Link | |
Registered User
Join Date: Dec 2014
Posts: 34
|
Quote:
I'm using an AMD APU. |
|
10th January 2015, 14:05 | #27998 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
Quote:
And is your TV using PC mode? There is a huge chance your TV can't do PC mode at any refresh rate other than 60 Hz (most Samsungs can't, for whatever reason), so this may be triggering the saturation issue. If that's the case and you want to keep PC mode, think about using 60 Hz only, together with madVR's smooth motion. |
|
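The "smooth motion" suggested here blends adjacent source frames on a fixed 60 Hz output instead of switching refresh rates. madVR's actual algorithm isn't public, so the linear blend weights below are only an illustration of the general idea:

```python
def blend_weights(src_fps, out_hz, out_index):
    """For output refresh #out_index, return ((frame_a, weight_a),
    (frame_b, weight_b)): the two source frames to mix and how much
    of each, based on where the output tick lands between them."""
    pos = out_index * src_fps / out_hz   # position on the source timeline
    a = int(pos)
    frac = pos - a
    return (a, 1.0 - frac), (a + 1, frac)

# 25 fps on a 60 Hz display: output tick 1 lands ~41.7% of the way
# between source frames 0 and 1, so both contribute to the output
print(blend_weights(25.0, 60.0, 1))
```

Blending trades the judder of an uneven 25-on-60 cadence for a slight softening of motion, which is why it is usually offered as an option rather than a default.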
10th January 2015, 14:06 | #27999 | Link | |
Registered User
Join Date: Sep 2013
Posts: 919
|
NVIDIA news
Driver 347.09 Quote:
How cool is that? Took 'em 10 years, but still... thanks. I'll test and report back in a few moments. EDIT: IT WORKS!!! No restarts, no registry hacks, no nothing! It just works in a second.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 10th January 2015 at 14:19. |
|
10th January 2015, 14:09 | #28000 | Link |
Registered User
Join Date: Oct 2012
Posts: 118
|
Anyone, please kindly help: how do I fix the error? Thanks. Also, guys, did you notice some delay when opening MPC, around 2-3 seconds?
http://www54.zippyshare.com/v/27530580/file.html Last edited by khanmein; 10th January 2015 at 15:51. |
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |