24th July 2013, 21:12 | #19701 | Link |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
Is it possible to use the GPU on the dedicated graphics card, alone or combined with the iGPU, for all of the processing done by madVR etc., regardless of whether the monitor is connected to the motherboard's DisplayPort/HDMI/DVI outputs or the graphics card's?
EDIT: Nevermind, found the answer: http://www.lucidlogix.com/product-virtu-mvp.shtml http://www.lucidlogix.com/eshop.shtml#compare Now the question is, will this virtual GPU work well with madVR? I'm not sure if the "HyperPerformance" and "Virtual Vsync" will work with madVR. Last edited by dansrfe; 24th July 2013 at 21:52. |
24th July 2013, 23:30 | #19702 | Link |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
If I already have a 16-bit ICC profile generated by an i1 Display Pro, is it possible to extract the 3DLUT from it? Or do I need to do some other steps to get the 3DLUT?
Last edited by dansrfe; 24th July 2013 at 23:39. |
24th July 2013, 23:44 | #19703 | Link | |
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
Last edited by e-t172; 24th July 2013 at 23:46. |
|
25th July 2013, 00:18 | #19704 | Link | |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
Quote:
|
|
25th July 2013, 03:36 | #19705 | Link | |
Registered User
Join Date: Oct 2012
Location: Akron, OH
Posts: 491
|
Quote:
You can use Eric Gur's QuickSync decoder in LAV and ffdshow without Virtu when the iGPU is not connected to the display. You need a motherboard that allows you to enable dual GPUs in the BIOS, and then you create a "fake" display for the iGPU in Windows Display Resolution page. You can follow Eric's instructions in the QuickSync thread in the Video Encoding section of the board. |
|
25th July 2013, 03:58 | #19706 | Link |
Registered User
Join Date: Jan 2007
Posts: 13
|
Has anyone had any issues with madNvLevelsTweaker and the latest Nvidia drivers (I'm using 320.49)? I see no change in any output after running this tool. Are there specific registry edits I can check to see whether the tweaker is working on my system?
BTB/WTW clips when I set madVR to 0-255. I can see BTB/WTW when I set madVR to 16-235, but then I see a lot of banding in the gray ramp from madTestPatternSource. From what I've read, I'm guessing the GPU is still sending out 16-235: setting madVR to 16-235 results in the full 0-255 output being scaled, so I see the full signal but also get banding, plus the nagging feeling that this isn't set up correctly. I've tested this with a Panasonic plasma (set to full range) connected via HDMI and with a PC monitor on a straight DVI-to-DVI connection. I was expecting the DVI-to-DVI connection to pass BTB/WTW with madVR set to full range, but even that is not working correctly. I've also read about editing the INI file during driver install to get full range; I'll probably try that tomorrow, and/or look into getting an ATI card. Last edited by ginhead; 25th July 2013 at 04:40. |
25th July 2013, 04:31 | #19707 | Link |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
madshi, I wonder if it's possible to have madVR run a quick query for other discrete GPUs and parallelize processing across multiple GPUs? Software SLI, essentially.
If the monitor is connected directly to the motherboard, madVR could bypass the iGPU and use any and all discrete graphics cards connected to the motherboard, assuming of course that SLI isn't already enabled. I know this would probably be at the bottom of your list, or not on it at all even if it were possible; just thought I'd ask. Last edited by dansrfe; 25th July 2013 at 04:33. |
25th July 2013, 04:39 | #19708 | Link | |
Registered User
Join Date: Jan 2007
Posts: 13
|
Quote:
I did just test my HD4000 with the DVI-to-DVI monitor setup. Same result. It would be nice if I could find some combination that worked as expected. |
|
25th July 2013, 04:51 | #19709 | Link | |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
Quote:
The levels tweaker should work with the latest stable driver; I just checked. Did you reboot after applying the tweak? |
|
25th July 2013, 08:34 | #19710 | Link | |
Registered User
Join Date: Oct 2012
Posts: 70
|
Quote:
__________________
iiyama prolite xb2483hsu 1080p60 Gamma=2.25 - Intel Core i3-2100 3.10GHz - AMD Radeon HD 6850, RGB 4:4:4 Full range - MPC-HC + XYSubFilter + madVR |
|
25th July 2013, 11:18 | #19711 | Link | |
Registered User
Join Date: Dec 2012
Posts: 40
|
Quote:
If madVR is set to 0-255, 16-235 is expanded to 0-255 and BTB/WTW are clipped. Here is madshi's configuration recommendation. |
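The expansion described above can be sketched in a few lines of Python. This is illustrative only, not madVR's actual code: it just shows why any BTB/WTW value lands outside 0-255 after the stretch and gets clipped.

```python
def expand_limited_to_full(y):
    """Expand a limited-range (16-235) 8-bit luma value to full range (0-255).

    Values below 16 (BTB) and above 235 (WTW) map outside 0-255
    and are clipped, which is why they disappear after expansion.
    """
    expanded = round((y - 16) * 255 / (235 - 16))
    return max(0, min(255, expanded))  # clip to the 8-bit range

# In-range values are stretched across the full range:
print(expand_limited_to_full(16))   # video black -> 0
print(expand_limited_to_full(235))  # video white -> 255

# BTB/WTW values are clipped and lost:
print(expand_limited_to_full(8))    # blacker-than-black -> 0
print(expand_limited_to_full(240))  # whiter-than-white  -> 255
```

This is why, if you want to inspect BTB/WTW with a test pattern, the expansion has to be avoided somewhere in the chain.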
|
25th July 2013, 19:10 | #19712 | Link | |
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
|
|
25th July 2013, 20:08 | #19713 | Link |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
The madVR - ArgyllCMS thread doesn't actually mention whether or not to apply the ICM file within Windows as well. Is there a different procedure for that, or can we just apply the profile generated in step 7?
In combination with "disable GPU gamma ramps" it should work correctly that way, right? |
25th July 2013, 20:50 | #19714 | Link | |
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
It doesn't really matter whether you configure your device to use an ICC profile in the Windows Color Management control panel applet, because that only applies to applications which include a CMM (Color Management Module) and choose to query the Windows configuration for the current monitor's profile so they can do their transformations. Web browsers and image-editing software are typically in this category. Sadly, madVR is not one of them; hopefully madshi will implement it at some point in the future.

Now, and this is where it gets really confusing, this configuration is also used by small tools typically called loaders (usually bundled with color calibration software, though there is a free one called xcalib) that run discreetly on Windows startup. Their sole purpose is to take the vcgt (gamma tables) information from the profile currently configured in Windows and apply it in the GPU. These loaders don't look at the profiling information at all, and in that sense they are the exact opposite of the applications above.

So, should you load GPU gamma tables or not? If your profile contains vcgt information (most of them do), then yes, because the rest of the profile, meaning the profiling information, describes the behavior of the device with the vcgt applied. If you use the profile without loading its gamma tables, the result will be wrong.

If you are generating a profile purely for use with madVR, then you should generate a profile without any vcgt information (Argyll has options for that), because that removes one processing step which is likely better done by madVR, since it does proper dithering, which may or may not be the case for your GPU's gamma-table implementation. However, if you are also using non-ICC-aware applications (such as games), then you might want to keep the calibration provided by the gamma tables.
There is also a way to use Argyll to cleanly remove vcgt information from a profile and end up with a profile that describes the device without the vcgt applied; I believe it is explained in the Argyll documentation. Last edited by e-t172; 25th July 2013 at 20:53. |
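What a loader actually does with the vcgt tag can be sketched as a plain per-channel 1D lookup table. The Python below is purely illustrative; the gamma values and table size are assumptions, and real loaders read the vcgt tag from the profile and hand 16-bit tables to the GPU via the Windows SetDeviceGammaRamp API rather than computing a curve like this.

```python
def make_vcgt_ramp(gamma=2.2, target=2.4, size=256):
    """Build a per-channel 1D lookup table like the vcgt tag in an ICC profile.

    Illustrative only: corrects a display whose native response is `gamma`
    so that it behaves like `target`. A loader does no per-colorspace work
    at all; it just installs a table like this into the GPU.
    """
    ramp = []
    for i in range(size):
        x = i / (size - 1)
        y = x ** (target / gamma)  # pre-distortion so display output follows `target`
        ramp.append(round(y * (size - 1)))
    return ramp

def apply_ramp(pixel_rgb, ramp):
    """Apply the same 1D ramp to each channel, as GPU gamma tables do.

    Note this is channel-independent: it cannot fix hue or saturation
    errors, which is why vcgt calibration is so limited compared to a CMM.
    """
    return tuple(ramp[c] for c in pixel_rgb)

identity = make_vcgt_ramp(gamma=2.2, target=2.2)  # no-op calibration
```

Because every channel goes through the same curve independently, a vcgt can only fix grayscale tracking; anything involving color mixing needs the profiling information and a CMM (or a 3DLUT).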
|
25th July 2013, 21:32 | #19715 | Link |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
Essentially, all I want is for the main desktop interface, the Windows and IrfanView image viewers, and web browsers to acknowledge the profile, as well as madVR.
If I understood correctly, it would be best to create a profile without the videoLUT via Argyll, configure it for the device in question in the Windows Color Management panel, AND use the 3DLUT generated from this profile in step 8 with madVR. What effect would "disable GPU gamma ramps" have in this scenario? Wouldn't that option be a placebo, since the profile applied in the color management module doesn't include the videoLUT?

Now, as you've said, without the videoLUT information present in the profile, the profile produces the correct result without the use of a loader during Windows startup, which is a plus. However, would this produce a less accurate result than creating a profile that includes the videoLUT? What is the point of the videoLUT in the first place if it is not accurate enough for madVR and has to be manually loaded during Windows startup? I'm not sure I understand the reason to even create profiles with videoLUTs if profiles created without them via Argyll can be used without a loader.

Also, on a side note, I'm currently using an ICM profile that I got from someone else who calibrated the same monitor using an i1 Display Pro. When I launch madVR with "disable GPU gamma ramps" unchecked, madVR definitely adheres to the profiling information loaded within the color management module; it shows the corrected colors from the profile in windowed/overlay/exclusive modes as well. Maybe I've misunderstood the functionality of "disable GPU gamma ramps"? Last edited by dansrfe; 25th July 2013 at 21:39. |
25th July 2013, 22:42 | #19716 | Link | |||||||
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Last edited by e-t172; 25th July 2013 at 22:54. |
|||||||
25th July 2013, 23:01 | #19717 | Link | |
Registered User
Join Date: Jan 2009
Posts: 1,210
|
Quote:
So, I think I've closed in on the question that really isn't making sense to me. What does the ICC profile contain without the videoLUT? What does the "profiling" information do? Is the profiling information for color/hue/saturation? Is this what an application with a CMM will further correct? How does creating an ICC profile without a videoLUT affect the color/hue/saturation readings in the profile?

Does this mean that a 3DLUT contains measurements of profiling information such as color/hue/saturation AND the videoLUT, and directly loads them into the lookup tables in the GPU? Is this why it is more accurate than including a CMM in madVR and utilizing profiling and videoLUT information? If so, shouldn't 3DLUTs be the standard, instead of fragmented ICC profiles which may or may not contain the videoLUT and whose profiling information may or may not be applied by an application unless it has a CMM? Last edited by dansrfe; 25th July 2013 at 23:09. |
|
26th July 2013, 09:51 | #19719 | Link | |||||||
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
If you see the video LUTs being changed during login, it probably means you have a loader being started upon Windows startup that you don't know about. For example the loader bundled with i1 profiling software is called "XRGamma.exe". Quote:
Bottom line: profiling information is used by CMMs to determine how to accurately reproduce a given color on your device. It's for everything: hue, saturation and luminance. Well, it's more complicated than that (or simpler, depending on your perspective). The way a CMM works is this: you have an input colorspace (typically sRGB, Adobe RGB, BT.709) and an output colorspace (described by your device profile). The CMM is simply a converter: it uses complicated algorithms (gamut mapping) to make sure colors in the input colorspace are accurately reproduced in the output colorspace. My point is, the CMM calculations depend on the input colorspace. Video LUTs, on the other hand, are applied to everything and don't care what the input colorspace is, which is why they're quite limited in what they can do. Quote:
Quote:
Basically, a 3DLUT is just a CMM: it converts from one colorspace to another. With a CMM, however, the transformation (3DLUT) is calculated on the fly depending on the input and output colorspaces, as opposed to a static 3DLUT, where the input and output colorspaces are chosen when the 3DLUT is generated. The 3DLUT doesn't contain "measurements" or even a profile: it is generated from them. Quote:
Quote:
Quote:
Also, keep in mind that to do color correction you need to know the input colorspace, and only applications know that, so you need application support anyway. Though one could make a case for assuming sRGB as the default input colorspace. Last edited by e-t172; 26th July 2013 at 09:57. |
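The "static CMM" idea above can be made concrete: once the colorspace conversion has been baked into a 3DLUT grid, applying it is just a per-pixel trilinear interpolation. The Python below is an illustrative sketch only; madVR's actual 3DLUT file format and GPU sampling are not shown here, and the grid size is an assumption.

```python
def sample_3dlut(lut, size, rgb):
    """Trilinearly interpolate an RGB triple (components in 0.0-1.0)
    through a size x size x size 3DLUT, the basic per-pixel operation a
    renderer performs once the colorspace conversion has been baked in.

    lut[r][g][b] holds the output (R, G, B) triple for each grid point.
    """
    def lerp(a, b, t):
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    # locate the surrounding grid cell and fractional position per channel
    cells = []
    for c in rgb:
        f = c * (size - 1)
        i0 = min(int(f), size - 2)  # lower grid index (clamped for c == 1.0)
        cells.append((i0, i0 + 1, f - i0))
    (r0, r1, rt), (g0, g1, gt), (b0, b1, bt) = cells

    # interpolate along the blue axis, then green, then red
    c00 = lerp(lut[r0][g0][b0], lut[r0][g0][b1], bt)
    c01 = lerp(lut[r0][g1][b0], lut[r0][g1][b1], bt)
    c10 = lerp(lut[r1][g0][b0], lut[r1][g0][b1], bt)
    c11 = lerp(lut[r1][g1][b0], lut[r1][g1][b1], bt)
    return lerp(lerp(c00, c01, gt), lerp(c10, c11, gt), rt)

# An identity LUT (each grid point maps to itself) leaves colors unchanged:
N = 4
identity_lut = [[[(r / (N - 1), g / (N - 1), b / (N - 1))
                  for b in range(N)] for g in range(N)] for r in range(N)]
```

A gamut-mapping tool fills the grid with the *converted* color at each grid point instead of the identity; the lookup itself stays exactly this cheap, which is why a static 3DLUT is so renderer-friendly compared to running a full CMM per frame.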
|||||||