Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.
13th November 2013, 21:39 | #20881 | Link
Registered User
Join Date: May 2013
Posts: 77
13th November 2013, 22:41 | #20882 | Link
Registered User
Join Date: Dec 2002
Posts: 5,565
13th November 2013, 22:56 | #20883 | Link
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406

Games are unaffected only when they load their own LUT into the video card. I have no idea why some games do this, and I hate them.
13th November 2013, 23:14 | #20884 | Link
Registered User
Join Date: May 2012
Posts: 447

Yeah, it's annoying. I use Monitor Calibration Wizard to keep my LUT loaded (games that do reset the LUT generally do it only once, or only when you alt-tab into them), but what I really want is an overlay that loads a madVR 3DLUT for all applications. They all assume something like sRGB anyway, so why not?

Last edited by Ver Greeneyes; 13th November 2013 at 23:20.
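For context, tools like Monitor Calibration Wizard keep calibration alive by rewriting the card's per-channel 256-entry gamma ramp (on Windows, via the GDI SetDeviceGammaRamp API, which takes 16-bit output values). Here is a minimal sketch of building one channel of such a ramp; the correction exponent and helper name are illustrative, not taken from any poster's setup:

```python
def build_gamma_ramp(correction=1.0, size=256):
    """One channel of a gamma ramp: 16-bit output values indexed
    by 8-bit input level, applying a pure power-curve correction.
    correction = target_gamma / display_gamma, e.g. 2.20 / 2.35."""
    ramp = []
    for i in range(size):
        v = (i / (size - 1)) ** correction
        ramp.append(min(65535, round(v * 65535)))
    return ramp

# A game that loads its own LUT simply overwrites this ramp,
# which is why tools have to keep re-applying the calibration.
ramp = build_gamma_ramp(2.20 / 2.35)
```

An identity ramp (correction=1.0) maps every 8-bit level i to i * 257, which is exactly what a game's "reset" restores.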
13th November 2013, 23:32 | #20885 | Link
Registered Developer
Join Date: Sep 2006
Posts: 9,140

Do you have a couple of low resolution samples where debanding in the source resolution is noticeably worse than debanding after external scaling? Maybe that could convince me to look into this.

What does speed have to do with anything? The topic at hand was totally unrelated to that.

The same reason why I have to hide the MPC-HC "pause" OSD message: if I ask the ISR to rerender an older frame, the ISR misbehaves. And the "fade in/out" detection of the debanding algorithm does require the first few frames of the fade in/out to be rerendered. There are different workarounds for this problem available; see the other posts.
13th November 2013, 23:57 | #20887 | Link
Kid for Today
Join Date: Aug 2004
Posts: 3,477

FWIW, this PS script can do it, using 6 different zones at that.

Last edited by leeperry; 14th November 2013 at 00:01.
13th November 2013, 23:58 | #20888 | Link
Registered User
Join Date: May 2008
Posts: 1,840

__________________
PC: FX-8320 GTS250 HTPC: G1610 GTX650 PotPlayer/MPC-BE LAVFilters MadVR-Bicubic75AR/Lanczos4AR/Lanczos4AR LumaSharpen -Strength0.9-Pattern3-Clamp0.1-OffsetBias2.0

Last edited by turbojet; 14th November 2013 at 00:25.
14th November 2013, 00:23 | #20889 | Link
Registered User
Join Date: Jan 2008
Posts: 589

Yes. Yes please. Unfortunately, we all know that's not gonna happen; it could be implemented in the driver, but I don't think NVidia/ATI care. Theoretically it's not that hard for application developers to Do The Right Thing themselves; there are lots of libraries available to do ICC transformations (such as the easy-to-use LittleCMS), and even a Windows API. It would be great if the developers of the main game engines (e.g. Unreal Engine, CryEngine, Source, Frostbite...) could generate a transformation using these libraries and then store the 3DLUT on the GPU for real-time correction of the game colors (like madVR), but good luck convincing them to spend time implementing that.

Last edited by e-t172; 14th November 2013 at 00:25.
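For context, the 3DLUT correction described here (and performed by madVR) amounts to a per-pixel trilinear lookup into a small color lattice. A toy sketch with the lattice stored as a plain dict; real implementations keep it in a GPU volume texture, and all names here are made up:

```python
def sample_3dlut(lut, n, r, g, b):
    """Trilinearly interpolate an n*n*n 3DLUT.
    lut[(ri, gi, bi)] -> (r', g', b'), all channels in 0..1."""
    def axis(x):
        # clamp, scale to lattice units, split into index + fraction
        x = min(max(x, 0.0), 1.0) * (n - 1)
        i = min(int(x), n - 2)
        return i, x - i
    (ri, rf), (gi, gf), (bi, bf) = axis(r), axis(g), axis(b)
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                # weight of this lattice-cell corner
                w = ((rf if dr else 1 - rf) *
                     (gf if dg else 1 - gf) *
                     (bf if db else 1 - bf))
                corner = lut[(ri + dr, gi + dg, bi + db)]
                for c in range(3):
                    out[c] += w * corner[c]
    return tuple(out)

# An identity lattice leaves colors untouched:
n = 17
identity = {(i, j, k): (i / (n - 1), j / (n - 1), k / (n - 1))
            for i in range(n) for j in range(n) for k in range(n)}
```

Building a non-identity lattice is where a CMS like LittleCMS would come in: each lattice point is pushed through the ICC transform once, offline, and the per-pixel work at runtime is just this interpolation.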
14th November 2013, 02:38 | #20891 | Link
Registered User
Join Date: Sep 2006
Posts: 212

madshi, I have since narrowed down the culprit to the Avisynth+ dll, which I started using for realtime script sharpening right around the time I started using the new deband in madVR. After replacing the Avisynth+ dll with the original 2.6 Alpha 5 dll, the problem hasn't recurred in two days. Guess I will let the Avisynth+ thread know about this problem. Cheers
14th November 2013, 03:09 | #20892 | Link
Registered User
Join Date: Oct 2011
Posts: 204
14th November 2013, 06:42 | #20893 | Link
Registered User
Join Date: May 2012
Posts: 447
14th November 2013, 09:49 | #20894 | Link
Registered Developer
Join Date: Sep 2006
Posts: 9,140

I do, but I only consider a fade a fade when it lasts at least 5 frames. So it's only after the 5th frame of a fade that I know that this really is a fade. In the moment when that 5th frame runs through the fade detection algorithm, the previous 4 frames are usually already rendered. madVR is heavily multi-threaded; everything works in parallel. The fade detection can't introduce a 5 frame delay, so it lets every frame pass until a fade is detected. At that point in time it has to go back and rerender the last 4 frames.

Of course it's not ideal to have to go back and rerender past frames. But it's the best solution I came up with; every other solution had worse disadvantages than this one.

The feedback I was hoping to get is whether I should use "medium" or "high" as the default option for debanding during fade ins/outs.

He did contact me in the past, at least one time, maybe twice. Don't remember.

Last edited by madshi; 14th November 2013 at 09:57.
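madshi's 5-frame rule can be sketched as a tiny state machine: frames pass through untouched, and only when the fifth consecutive fade-like frame arrives does the detector report the four earlier frames that now need rerendering. A toy model, not madVR's actual code (all names are hypothetical):

```python
FADE_MIN_FRAMES = 5  # a fade only counts once it lasts this long

class FadeDetector:
    def __init__(self):
        self.run = 0          # consecutive fade-like frames seen
        self.in_fade = False

    def feed(self, index, looks_like_fade):
        """Returns the already-rendered frame indices that must be
        rerendered with fade-tuned debanding (usually empty)."""
        if not looks_like_fade:
            self.run = 0
            self.in_fade = False
            return []
        self.run += 1
        if not self.in_fade and self.run == FADE_MIN_FRAMES:
            self.in_fade = True
            # the 4 earlier frames of the fade already went out untouched
            return list(range(index - (FADE_MIN_FRAMES - 1), index))
        return []
```

Because frames are never held back, the pipeline stays full; the one-time cost is the burst of 4 rerenders the moment a fade is confirmed, which matches the 11/16 queue dip described below.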
14th November 2013, 11:04 | #20897 | Link
Registered User
Join Date: Dec 2002
Posts: 5,565

It's not compromised if your system is fast enough.

Last edited by sneaker_ger; 14th November 2013 at 11:06.
14th November 2013, 11:41 | #20898 | Link
Registered Developer
Join Date: Sep 2006
Posts: 9,140

What I could do is introduce a separate queue just for fade detection. The fade queue would then be 5 frames longer than the rendering queue: e.g. the fade queue would be 21 frames long and the render queue 16 frames long. In that case the 5 frame delay would still allow the render queue to fill completely. But this design would require the upload queue (and the decoder queue) to be 5 frames longer, too, which would (together with the new fade queue) increase the GPU RAM consumption.

With the current solution the render queue is always nicely filled, except in the very moment when the fifth frame of a fade is detected. In that moment the render queue temporarily goes down to 11/16, but fills up quickly again. All this without any increased GPU RAM consumption.

I'm not sure if it makes sense to discuss this in detail. Trust me, I've thought all possibilities through and chosen the best possible implementation. The one and only problem with this implementation is that it causes problems with the ISR. But that's really a bug in the ISR, and I'm not willing to use an inferior render design just to work around an ISR bug. If you insist on using the ISR, either disable the fade detection, or disable the ISR queue, or ask the MPC-HC/BE devs to fix the ISR bug. Or make the switch to XySubFilter. Many options for you.

The speed of the system has got nothing to do with it.
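The GPU RAM cost madshi alludes to can be put in rough numbers. A back-of-the-envelope sketch, assuming 1080p frames held at 6 bytes per pixel (16-bit-per-channel RGB); the frame format is an assumption, and madVR's internal formats may differ:

```python
def frame_bytes(width, height, bytes_per_pixel=6):
    # assumed internal format: 16-bit per channel RGB = 6 bytes/pixel
    return width * height * bytes_per_pixel

EXTRA_FRAMES = 5  # fade queue at 21 vs. render queue at 16

# The alternative design grows three queues by 5 frames each:
# the new fade queue, the upload queue, and the decoder queue.
extra = 3 * EXTRA_FRAMES * frame_bytes(1920, 1080)
print(f"extra GPU RAM: {extra / 2**20:.1f} MiB")
```

At 1080p that works out to something on the order of 180 MiB of additional GPU RAM, which makes the "rerender 4 frames on demand" tradeoff easier to understand.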
14th November 2013, 12:00 | #20899 | Link
Registered User
Join Date: Dec 2002
Posts: 5,565

I was indeed thinking about just adding an additional queue for the fade and in turn increasing its length. I didn't think about the RAM; I thought you just didn't feel like adding yet another queue or didn't want to make it too long. I wasn't affected, just wondering.

Well, I thought queues are there to ensure fluid playback, i.e. to provide some safety net/buffer for avoiding dropped frames. Resetting a queue fully or partly, or reprocessing frames, means decreasing that safety, and results in more dropouts on average, even if most users aren't affected and it can be worked around or turned off easily. Maybe I'm missing something.
14th November 2013, 12:36 | #20900 | Link
Registered User
Join Date: Oct 2012
Posts: 70

Hi everyone,

I currently use the sRGB profile on my monitor and the default settings in Windows Color Management (no ICC profile). I've set "this display is already calibrated" (while it's not) with BT.709 and pure power curve 2.20 in the calibration tab. However, some guides about calibration advise setting the pure power curve to 2.40 with sRGB. Is that true or not? Another question: should I keep my current settings, or change to "disable calibration controls for this display" in the calibration tab and enable gamma processing in the color & gamma tab? Thanks for your help.

__________________
iiyama prolite xb2483hsu 1080p60 Gamma=2.25 - Intel Core i3-2100 3.10GHz - AMD Radeon HD 6850, RGB 4:4:4 Full range - MPC-HC + XYSubFilter + madVR

Last edited by Vyral; 14th November 2013 at 12:38.
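On the 2.20-vs-2.40 question above: the sRGB curve defined in IEC 61966-2-1 is neither a pure 2.2 nor a pure 2.4 power curve. It is piecewise, with a linear toe near black and a 2.4-exponent segment above it, and at mid-gray it tracks roughly gamma 2.2. A quick sketch comparing the exact sRGB EOTF with pure power curves (the formula is the standard one; which option madVR's calibration tab expects is a separate question):

```python
def srgb_eotf(v):
    """Decode an sRGB-encoded value v (0..1) to linear light."""
    if v <= 0.04045:
        return v / 12.92          # linear toe near black
    return ((v + 0.055) / 1.055) ** 2.4

def power_eotf(v, gamma):
    return v ** gamma

# At mid-gray, sRGB decodes close to a pure 2.2 power curve:
for v in (0.25, 0.5, 0.75):
    print(v, srgb_eotf(v), power_eotf(v, 2.2), power_eotf(v, 2.4))
```

This is why "gamma 2.2" and "sRGB" are often treated as interchangeable even though they diverge noticeably in the shadows.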