Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 5th March 2014, 13:12   #24281  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 850
Quote:
Originally Posted by James Freeman View Post
If I'm not mistaken, with Nvidia cards there is a separate Video Engine (PureVideo) decoder processor on the GPU for that.
So everything madVR uses as GPU processing is done on the main processor of the GPU and not on the Video Engine decoder.
Isn't that the case with ATI also?

On the other hand CPU decoding is "less buggy" and is constantly updated.
Maybe nevcairiel can clarify how it works.
Like you said, CPU processing is stable as hell, and since I'm not using SVP anymore I have loads of CPU power left.

Aren't you also using a Sony VW1000 as display device like me? What are your preferred settings for 2K to 4K Blu-Ray content upscaling with madVR? Something like the settings I use?
Old 5th March 2014, 13:16   #24282  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by THX-UltraII View Post
Aren't you also using a Sony VW1000 as display device like me?
No, it wasn't me (although I really wish it was).
I still don't have the $25,000 to shell out on a projector...
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
Old 5th March 2014, 13:20   #24283  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 850
Quote:
Originally Posted by James Freeman View Post
No, it wasn't me (although I really wish it was).
I still don't have the $25,000 to shell out on a projector...
I must have mistaken you for someone else. My bad.

But what are your thoughts on my settings for '2K to 4K' upscaling?
Old 5th March 2014, 13:34   #24284  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by THX-UltraII View Post
But what are your thoughts on my settings for '2K to 4K' upscaling?
Debanding (MID) softens your image.
If you watch original (non-internet) Blu-Rays, disable debanding (or set it to Low if it's 10GB+ internet content).

Jinc is good, but I still prefer Lanczos because it preserves the detail at higher frequencies, whereas Jinc blurs them a little (as I see it).


One more (important) thing:
Use this (MPC LumaSharpen) shader. It's the best sharpening shader available (IMO), way better (and smarter) than Sharpen Complex 2.
It will not touch the finest detail (just add contrast) but will sharpen the detail/textures that are not sharp enough.

Copy it into a .hlsl file (if using MPC-HC) and activate the pixel shader:

Code:
/*
_____________________

LumaSharpen 1.4.1
_____________________

by Christian Cann Schuldt Jensen ~ CeeJay.dk

It blurs the original pixel with the surrounding pixels and then subtracts this blur to sharpen the image.
It does this in luma to avoid color artifacts and allows limiting the maximum sharpening to avoid or lessen halo artifacts.

This is similar to using Unsharp Mask in Photoshop.

Compiles with 3.0
*/

/*-----------------------------------------------------------.
/                      User settings                          /
'-----------------------------------------------------------*/

#define sharp_strength 0.65
#define sharp_clamp 0.035
#define pattern 8
#define offset_bias 1.0
#define show_sharpen 0

/*-----------------------------------------------------------.
/                      Developer settings                     /
'-----------------------------------------------------------*/
#define CoefLuma float3(0.2126, 0.7152, 0.0722)      // BT.709 & sRGB luma coefficient (Monitors and HD Television)
//#define CoefLuma float3(0.299, 0.587, 0.114)       // BT.601 luma coefficient (SD Television)
//#define CoefLuma float3(1.0/3.0, 1.0/3.0, 1.0/3.0) // Equal weight coefficient

/*-----------------------------------------------------------.
/                          Main code                          /
'-----------------------------------------------------------*/

float4 p0 :  register(c0);
sampler s0 : register(s0);

#define px (1.0 / p0[0])
#define py (1.0 / p0[1])

float4 main(float2 tex : TEXCOORD0) : COLOR0
{
    // -- Get the original pixel --
    float3 ori = tex2D(s0, tex).rgb;       // ori = original pixel
    float4 inputcolor = tex2D(s0, tex);

        // -- Combining the strength and luma multipliers --
        float3 sharp_strength_luma = (CoefLuma * sharp_strength); //I'll be combining even more multipliers with it later on

        /*-----------------------------------------------------------.
        /                       Sampling patterns                     /
        '-----------------------------------------------------------*/
        //   [ NW,   , NE ] Each texture lookup (except ori)
        //   [   ,ori,    ] samples 4 pixels
        //   [ SW,   , SE ]

        // -- Pattern 1 -- A (fast) 7 tap gaussian using only 2+1 texture fetches.
#if pattern == 1

        // -- Gaussian filter --
        //   [ 1/9, 2/9,    ]     [ 1 , 2 ,   ]
        //   [ 2/9, 8/9, 2/9]  =  [ 2 , 8 , 2 ]
        //   [    , 2/9, 1/9]     [   , 2 , 1 ]

        float3 blur_ori = tex2D(s0, tex + (float2(px, py) / 3.0) * offset_bias).rgb;  // North West
        blur_ori += tex2D(s0, tex + (float2(-px, -py) / 3.0) * offset_bias).rgb; // South East

    //blur_ori += tex2D(s0, tex + float2(px,py) / 3.0 * offset_bias); // North East
    //blur_ori += tex2D(s0, tex + float2(-px,-py) / 3.0 * offset_bias); // South West

    blur_ori /= 2;  //Divide by the number of texture fetches

    sharp_strength_luma *= 1.5; // Adjust strength to approximate the strength of pattern 2

#endif

    // -- Pattern 2 -- A 9 tap gaussian using 4+1 texture fetches.
#if pattern == 2

    // -- Gaussian filter --
    //   [ .25, .50, .25]     [ 1 , 2 , 1 ]
    //   [ .50,   1, .50]  =  [ 2 , 4 , 2 ]
    //   [ .25, .50, .25]     [ 1 , 2 , 1 ]


    float3 blur_ori = tex2D(s0, tex + float2(px, -py) * 0.5 * offset_bias).rgb; // South East
        blur_ori += tex2D(s0, tex + float2(-px, -py) * 0.5 * offset_bias).rgb;  // South West
    blur_ori += tex2D(s0, tex + float2(px, py) * 0.5 * offset_bias).rgb; // North East
    blur_ori += tex2D(s0, tex + float2(-px, py) * 0.5 * offset_bias).rgb; // North West

    blur_ori *= 0.25;  // ( /= 4) Divide by the number of texture fetches

#endif

    // -- Pattern 3 -- An experimental 17 tap gaussian using 4+1 texture fetches.
#if pattern == 3

    // -- Gaussian filter --
    //   [   , 4 , 6 ,   ,   ]
    //   [   ,16 ,24 ,16 , 4 ]
    //   [ 6 ,24 ,   ,24 , 6 ]
    //   [ 4 ,16 ,24 ,16 ,   ]
    //   [   ,   , 6 , 4 ,   ]

    float3 blur_ori = tex2D(s0, tex + float2(0.4*px, -1.2*py)* offset_bias).rgb;  // South South East
        blur_ori += tex2D(s0, tex + float2(-1.2*px, -0.4*py) * offset_bias).rgb; // West South West
    blur_ori += tex2D(s0, tex + float2(1.2*px, 0.4*py) * offset_bias).rgb; // East North East
    blur_ori += tex2D(s0, tex + float2(-0.4*px, 1.2*py) * offset_bias).rgb; // North North West

    blur_ori *= 0.25;  // ( /= 4) Divide by the number of texture fetches

    sharp_strength_luma *= 0.51;
#endif

    // -- Pattern 4 -- A 9 tap high pass (pyramid filter) using 4+1 texture fetches.
#if pattern == 4

    // -- Gaussian filter --
    //   [ .50, .50, .50]     [ 1 , 1 , 1 ]
    //   [ .50,    , .50]  =  [ 1 ,   , 1 ]
    //   [ .50, .50, .50]     [ 1 , 1 , 1 ]

    float3 blur_ori = tex2D(s0, tex + float2(0.5 * px, -py * offset_bias)).rgb;  // South South East
        blur_ori += tex2D(s0, tex + float2(offset_bias * -px, 0.5 * -py)).rgb; // West South West
    blur_ori += tex2D(s0, tex + float2(offset_bias * px, 0.5 * py)).rgb; // East North East
    blur_ori += tex2D(s0, tex + float2(0.5 * -px, py * offset_bias)).rgb; // North North West

    //blur_ori += (2 * ori); // Probably not needed. Only serves to lessen the effect.

    blur_ori /= 4.0;  //Divide by the number of texture fetches

    sharp_strength_luma *= 0.666; // Adjust strength to approximate the strength of pattern 2
#endif

    // -- Pattern 8 -- A (slower) 9 tap gaussian using 9 texture fetches.
#if pattern == 8

    // -- Gaussian filter --
    //   [ 1 , 2 , 1 ]
    //   [ 2 , 4 , 2 ]
    //   [ 1 , 2 , 1 ]

    half3 blur_ori = tex2D(s0, tex + float2(-px, py) * offset_bias).rgb; // North West
        blur_ori += tex2D(s0, tex + float2(px, -py) * offset_bias).rgb;     // South East
    blur_ori += tex2D(s0, tex + float2(-px, -py)  * offset_bias).rgb;  // South West
    blur_ori += tex2D(s0, tex + float2(px, py) * offset_bias).rgb;    // North East

    half3 blur_ori2 = tex2D(s0, tex + float2(0, py) * offset_bias).rgb; // North
        blur_ori2 += tex2D(s0, tex + float2(0, -py) * offset_bias).rgb;    // South
    blur_ori2 += tex2D(s0, tex + float2(-px, 0) * offset_bias).rgb;   // West
    blur_ori2 += tex2D(s0, tex + float2(px, 0) * offset_bias).rgb;   // East
    blur_ori2 *= 2.0;

    blur_ori += blur_ori2;
    blur_ori += (ori * 4); // Probably not needed. Only serves to lessen the effect.

    // dot()s with gaussian strengths here?

    blur_ori /= 16.0;  //Divide by the number of texture fetches

    //sharp_strength_luma *= 0.75; // Adjust strength to approximate the strength of pattern 2
#endif

    // -- Pattern 9 -- A (slower) 9 tap high pass using 9 texture fetches.
#if pattern == 9

    // -- Gaussian filter --
    //   [ 1 , 1 , 1 ]
    //   [ 1 , 1 , 1 ]
    //   [ 1 , 1 , 1 ]

    float3 blur_ori = tex2D(s0, tex + float2(-px, py) * offset_bias).rgb; // North West
        blur_ori += tex2D(s0, tex + float2(px, -py) * offset_bias).rgb;     // South East
    blur_ori += tex2D(s0, tex + float2(-px, -py)  * offset_bias).rgb;  // South West
    blur_ori += tex2D(s0, tex + float2(px, py) * offset_bias).rgb;    // North East

    blur_ori += ori.rgb; // Probably not needed. Only serves to lessen the effect.

    blur_ori += tex2D(s0, tex + float2(0, py) * offset_bias).rgb;    // North
    blur_ori += tex2D(s0, tex + float2(0, -py) * offset_bias).rgb;  // South
    blur_ori += tex2D(s0, tex + float2(-px, 0) * offset_bias).rgb; // West
    blur_ori += tex2D(s0, tex + float2(px, 0) * offset_bias).rgb; // East

    blur_ori /= 9;  //Divide by the number of texture fetches

    //sharp_strength_luma *= (8.0/9.0); // Adjust strength to approximate the strength of pattern 2
#endif


    /*-----------------------------------------------------------.
    /                            Sharpen                          /
    '-----------------------------------------------------------*/

    // -- Calculate the sharpening --
    float3 sharp = ori - blur_ori;  //Subtracting the blurred image from the original image

#if 0 //New experimental limiter .. not yet finished
        float sharp_luma = dot(sharp, sharp_strength_luma); //Calculate the luma
    sharp_luma = (abs(sharp_luma)*8.0) * exp(1.0 - (abs(sharp_luma)*8.0)) * sign(sharp_luma) / 16.0; //I should probably move the strength modifier here

#elif 0 //SweetFX 1.4 code
        // -- Adjust strength of the sharpening --
        float sharp_luma = dot(sharp, sharp_strength_luma); //Calculate the luma and adjust the strength

    // -- Clamping the maximum amount of sharpening to prevent halo artifacts --
    sharp_luma = clamp(sharp_luma, -sharp_clamp, sharp_clamp);  //TODO Try a curve function instead of a clamp

#else //SweetFX 1.5.1 code
        // -- Adjust strength of the sharpening and clamp it--
        float4 sharp_strength_luma_clamp = float4(sharp_strength_luma * (0.5 / sharp_clamp), 0.5); //Roll part of the clamp into the dot

        //sharp_luma = saturate((0.5 / sharp_clamp) * sharp_luma + 0.5); //scale up and clamp
        float sharp_luma = saturate(dot(float4(sharp, 1.0), sharp_strength_luma_clamp)); //Calculate the luma, adjust the strength, scale up and clamp
    sharp_luma = (sharp_clamp * 2.0) * sharp_luma - sharp_clamp; //scale down
#endif

    // -- Combining the values to get the final sharpened pixel	--
    //float4 done = ori + sharp_luma;    // Add the sharpening to the original.
    inputcolor.rgb = inputcolor.rgb + sharp_luma;    // Add the sharpening to the input color.

    /*-----------------------------------------------------------.
    /                     Returning the output                    /
    '-----------------------------------------------------------*/
#if show_sharpen == 1
    //inputcolor.rgb = abs(sharp * 4.0);
    inputcolor.rgb = saturate(0.5 + (sharp_luma * 4)).rrr;
#endif

    return saturate(inputcolor);
}
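For anyone curious what the shader actually does, here is the same unsharp-mask idea sketched in NumPy. This is purely illustrative (not the shader's exact math, and the function name is made up): blur with a small gaussian, take the luma of the difference, clamp it to limit halos, and add it back.

```python
import numpy as np

# Illustrative sketch only (not the shader's exact math): the same
# unsharp-mask idea in NumPy. Blur, take the luma of (original - blur),
# clamp it to limit halo artifacts, and add it back to the original.

# BT.709 luma coefficients, matching the shader's CoefLuma
COEF_LUMA = np.array([0.2126, 0.7152, 0.0722])

def luma_sharpen(img, strength=0.65, clamp=0.035):
    """img: float RGB array in [0, 1] with shape (H, W, 3)."""
    k = np.array([1.0, 2.0, 1.0]) / 4.0   # separable [1,2,1] gaussian, like pattern 8
    blur = img.copy()
    for axis in (0, 1):                   # blur vertically, then horizontally
        padded = np.pad(blur, [(1, 1) if a == axis else (0, 0) for a in range(3)],
                        mode='edge')
        blur = sum(w * np.take(padded, np.arange(img.shape[axis]) + i, axis=axis)
                   for i, w in enumerate(k))
    sharp = img - blur                               # high-frequency residual
    sharp_luma = sharp @ (COEF_LUMA * strength)      # luma only, scaled by strength
    sharp_luma = np.clip(sharp_luma, -clamp, clamp)  # sharp_clamp: halo limiter
    return np.clip(img + sharp_luma[..., None], 0.0, 1.0)
```

A flat image passes through untouched; only luma differences get amplified, which is why this avoids the color artifacts a per-channel sharpen can introduce.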
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 5th March 2014 at 13:51.
Old 5th March 2014, 13:44   #24285  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 850
Quote:
Debanding (MID) softens your image. If you watch original (non-internet) Blu-Rays, disable debanding.
I thought that uncompressed Blu-Ray content can have banding too?

Quote:
Jinc is good, but I still prefer Lanczos because it preserves the detail at higher frequencies, whereas Jinc blurs them a little (as I see it).
I took Jinc because it is a newer algo. What do you think about using NNEDI3 for Chroma Upscaling and Lanczos for Luma Upscaling? Or is it preferable to always use the same algorithm for both luma and chroma upscaling? (Why is NNEDI3 not available for Luma Upscaling, btw?)

Quote:
One more (important) thing:
Use this (MPC LumaSharpen) shader. It's the best sharpening shader available (IMO).
You say to use this on top of the scaling algorithms in madVR? Won't that cause any major artifacts?
Old 5th March 2014, 13:56   #24286  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by THX-UltraII View Post
I thought that uncompressed Blu-Ray content can have banding too?
Seldom.
Not deserving of MID debanding treatment.

Quote:
Originally Posted by THX-UltraII View Post
I took Jinc because it is a newer algo. What do you think about using NNEDI3 for Chroma Upscaling and Lanczos for Luma Upscaling?
Or is it preferable to always use the same algorithm for both luma and chroma upscaling? (Why is NNEDI3 not available for Luma Upscaling, btw?)
I use Lanczos 3 AR for all; NNEDI3 still does not work for me.
Maybe someone else can answer that more specifically.

Quote:
Originally Posted by THX-UltraII View Post
You say to use this on top of the scaling algorithms in madVR? Won't that cause any major artifacts?
Try it and thank me later.
I updated the code, so be sure to copy the 1.4.1 version.


@madshi
Can madVR load pixel shaders?
I think I heard of something similar somewhere...
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 5th March 2014 at 14:01.
Old 5th March 2014, 13:59   #24287  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by Shiandow View Post
There seems to be some kind of misunderstanding. The shader is meant to change the input of the dithering algorithm
Oh, ok, I didn't understand it this way. Alright, it could be interesting to test this. I might release a test build which uses your pre-processing.

Quote:
Originally Posted by toniash View Post
What to look for if my render queues don't fill up completely?
Render queues? There is only one render queue. Maybe post a screenshot of the debug OSD, then we might be able to say more.

Quote:
Originally Posted by James Freeman View Post
Debanding (MID) softens your image.
If you watch original (non-internet) Blu-Rays, disable debanding (or set it to Low if it's 10GB+ internet content).

[...]

One more (important) thing,
Use this (MPC LumaSharpen) shader Its the best sharpenning shader available (IMO), Yes way better (and smarter) than Sharpen Complex 2.
It will not touch the finest detail (add contrast) but will sharpen the not sharp enough detail/textures.
James, posting suggestions and recommendations is fine, but please make sure you mark these as your personal subjective opinion. You sound a bit as if your recommendations were what everybody agreed on as the best settings, which is definitely not the case here.

IMHO, using a deband setting of "low" can still make sense for Blu-Ray, but that's only my personal opinion and I know that some people will disagree. Sharpening is also very controversial. Some people hate it. Some people love it.

Quote:
Originally Posted by THX-UltraII View Post
I thought that uncompressed Blu-Ray content can have debanding too?
Yes, it can happen.

Quote:
Originally Posted by THX-UltraII View Post
I took JINC because it is a newer algo
FYI, from what I've seen, most people prefer Jinc. A minority of people prefer Lanczos. You may want to make up your own mind which you prefer.

Quote:
Originally Posted by James Freeman View Post
Can madVR load pixel shaders?
I think I heard of something similar somewhere...
I don't understand what you mean. madVR does support custom pixel shaders, if that is your question.
Old 5th March 2014, 14:07   #24288  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by madshi View Post
James, posting suggestions and recommendations is fine, but please make sure you mark these as your personal subjective opinion. You do sound a bit as if you your recommendations would be what everybody agreed on being the best settings. Which is definitely not the case here.
Of course.
Everything I post is IMO only.
I'm thinking about putting a big disclaimer in my signature... Maybe...

Quote:
Originally Posted by madshi View Post
I don't understand what you mean. madVR does support custom pixel shaders, if that is your question.
How?
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
Old 5th March 2014, 14:11   #24289  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,836
Quote:
Originally Posted by James Freeman View Post
How?
Just load them in MPC-HC like you would with its built-in renderer (Options -> Playback -> Shaders), madVR accepts them the same way.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 5th March 2014, 14:11   #24290  |  Link
Boltron
Registered User
 
Boltron's Avatar
 
Join Date: May 2011
Posts: 94
I would really love to use a sharpening filter, but I use JRiver and can't find a way to do it. madshi did mention once or twice that he could add a means to include a filter, and that it wouldn't be too difficult to do. I can only hope.
Old 5th March 2014, 14:20   #24291  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by nevcairiel View Post
Just load them in MPC-HC like you would with its built-in renderer (Options -> Playback -> Shaders), madVR accepts them the same way.
Thanks, that's what I was doing.
I thought there was a way to load shaders directly to madVR.

P.S.
madshi, I realize most people don't like the ringing or other artifacts sharpening may give (I hate them too)... but not this one.
I'm not just saying this is the best one there is; when I say try it, I really mean it, and I urge people to try it (even if they disliked sharpening before).

This post is IMO only.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
Old 5th March 2014, 14:32   #24292  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 752
Quote:
Originally Posted by madshi View Post
Oh, ok, I didn't understand it this way. Alright, it could be interesting to test this. I might release a test build which uses your pre-processing.
That would be nice. I think it would improve the contrast slightly for people who are using 6bit output (or even lower). For 8 bit output it will only cause minute differences for very dark colours, so in that case it probably won't be really noticeable. But, at least in theory, it should fix any problems caused by the fact that smooth motion and dithering don't use the same gamma curve.

Unfortunately it doesn't seem that it will fix the 'blinking' problem and the more I think about that the less sense it seems to make. I can't understand why it seems to work regardless of smooth motion, but only when the frame rate doesn't match the display frame rate.
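A toy example of why dithering matters at low bit depths (purely illustrative, not madVR's or Shiandow's actual algorithm): truncating a level to 6 bits introduces a constant bias, while adding uniform dither noise before quantizing keeps the average at the true value, which is the gradation the eye integrates.

```python
import numpy as np

# Purely illustrative (not madVR's actual dithering): quantize a mid-gray
# level to 6 bits with and without dithering. Plain truncation biases the
# output; uniform dither noise keeps the average at the true value.
rng = np.random.default_rng(0)

value = 0.503             # a level that falls between two 6-bit steps
levels = 2**6 - 1         # 63 quantization steps for 6-bit output

truncated = np.floor(value * levels) / levels

noise = rng.random(100_000)                     # uniform dither in [0, 1)
dithered = np.floor(value * levels + noise) / levels

print(abs(truncated - value))        # constant quantization bias
print(abs(dithered.mean() - value))  # dithered average is far closer
```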
Old 5th March 2014, 14:36   #24293  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,977
Quote:
Originally Posted by THX-UltraII View Post
I thought that uncompressed Blu-Ray content can have banding too?
I can show you a lot of BDs with banding; anime suffers a lot from this issue. BDs are just weakly encoded, high-bitrate H.264 streams (in the best case).
Quote:
I took Jinc because it is a newer algo. What do you think about using NNEDI3 for Chroma Upscaling and Lanczos for Luma Upscaling? Or is it preferable to always use the same algorithm for both luma and chroma upscaling? (Why is NNEDI3 not available for Luma Upscaling, btw?)
It is, but NNEDI3 can only double the resolution: it can upscale 480p to 960p, but it can't do 720p -> 1080p directly.
Not sure if any GPU can handle NNEDI3 at 32 neurons for 1080p -> 2160p, but give it a try. You'll find it under image doubling.
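The doubling constraint described above can be made concrete with a tiny sketch (illustrative only; the helper name is made up, not a madVR API): NNEDI3 doubles until it meets or exceeds the target, and a regular scaler then resizes the remainder.

```python
# Illustrative sketch of the doubling constraint (the helper name is
# made up, not a madVR API): NNEDI3 can only 2x per pass, so you double
# until you reach or pass the target height, then let a regular scaler
# (Lanczos, Catmull-Rom, ...) handle the leftover resize.

def nnedi3_passes(src_height, target_height):
    passes, h = 0, src_height
    while h < target_height:
        h *= 2
        passes += 1
    return passes, h

print(nnedi3_passes(480, 960))    # (1, 960): one clean doubling
print(nnedi3_passes(720, 1080))   # (1, 1440): overshoots, then downscale to 1080
print(nnedi3_passes(1080, 2160))  # (1, 2160): 1080p -> 4K height is one doubling
```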
Quote:
You say to use this on top of the scaling algorithms in madVR? Won t that cause any major artifacts?
It's sharpening; you never get that for free. Same for debanding.
Old 5th March 2014, 15:02   #24294  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 850
What do you prefer for debanding for Blu-Ray content, madshi?

LOW/LOW
LOW/MID
LOW/HIGH
MID/LOW
MID/MID
MID/HIGH
HIGH/LOW
HIGH/MID
or
HIGH/HIGH
Old 5th March 2014, 15:09   #24295  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 850
James, why do you prefer LANCZOS over JINC exactly?
Old 5th March 2014, 15:15   #24296  |  Link
kasper93
MPC-HC Developer
 
Join Date: May 2010
Location: Poland
Posts: 556
Quote:
Originally Posted by THX-UltraII View Post
What do you prefer for debanding for Blu-Ray content, madshi?

LOW/LOW
LOW/MID
LOW/HIGH
MID/LOW
MID/MID
MID/HIGH
HIGH/LOW
HIGH/MID
or
HIGH/HIGH
FIGUREOUTYOURSELF/DONTBELAZY

You haven't even bothered to test it... You can't do HIGH/LOW and some of the other combinations...

It's a matter of taste. The same goes for scaling. If there were a "best" option, madshi wouldn't have included those settings.

Tapatalk 4 @ GT-I9300
Old 5th March 2014, 15:28   #24297  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 161
Quote:
Originally Posted by THX-UltraII View Post
What do you think about using NNEDI3 for Chroma Upscaling and Lanczos for Luma Upscaling?
Can your 280X handle NNEDI3 to 4K?
Old 5th March 2014, 15:30   #24298  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by THX-UltraII View Post
James, why do you prefer LANCZOS over JINC exactly?
I have done a bunch of testing with patterns and movies.
The remaining aliasing (ever so small) with Lanczos translates to better detail at higher frequencies, whereas Jinc will blur the detail.

Jinc will look better (smoother) on perfect diagonals only; on anything else it looks just like Lanczos.
Also, Lanczos 8 uses less juice than Jinc 3.

For Upscaling I use: Lanczos 3 + AR (no LL).
For Downscaling I use: Catmull-Rom (no LL).
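For reference, a sketch (not madVR's code) of the 1-D kernel behind "Lanczos 3": the window with a = 3 taps. Its small negative lobes are what keep high-frequency detail crisp, and also what can ring, which is what the AR (anti-ringing) option limits.

```python
import math

# A sketch of the 1-D Lanczos kernel (a = 3 taps), not madVR's code.
# The negative lobes below keep high-frequency detail crisp, and are
# also the source of ringing, which the AR option limits.
def lanczos(x, a=3):
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0   # zero outside the a-tap support
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

print(lanczos(0.0))   # 1.0 at the center
print(lanczos(0.5))   # ~0.608, main lobe
print(lanczos(1.5))   # ~-0.135, negative lobe (source of ringing)
```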

This post is IMO only.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 5th March 2014 at 15:39.
Old 5th March 2014, 15:36   #24299  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 161
Quote:
Originally Posted by jkauff View Post
you can have LAV using the iGPU while madVR is using the dGPU (this is assuming you have a motherboard/BIOS that allows simultaneous use).
That's interesting. I have a 4770K, but the QuickSync option shows up as inactive; I never knew that had something to do with the motherboard.

I wonder if it's possible for QuickSync to do the deinterlacing, because if I give the job to madVR it would need twice the rendering power, since deinterlacing doubles the frames per second.
Old 5th March 2014, 15:45   #24300  |  Link
THX-UltraII
Registered User
 
Join Date: Aug 2008
Location: the Netherlands
Posts: 850
Quote:
Originally Posted by seiyafan View Post
Can your 280X handle NNEDI3 to 4K?
Yes. Keep in mind that I am talking about NNEDI3@32 for Chroma Upscaling, NOT about NNEDI3 image doubling.