Old 18th January 2013, 01:45   #17001  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by noee View Post
xabregas, what driver are you running with your HD7770? Also, which decoder, player and what OS?
MPC-HC + LAV Filters (LAV Video Decoder + LAV Audio Decoder + LAV Splitter Source) + ReClock (WASAPI)

Windows 7 x86

The driver is the official 12.10.

Going to install 13.1 now. FEAR.
xabregas is offline   Reply With Quote
Old 18th January 2013, 01:56   #17002  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
I'm running 13.1 now on an HD6570. madVR with MPC-HC and jRiver MC18 is flawless on film SD and HD (Blu-ray rip) material (Jinc3/AR for both chroma and luma). That's on a cheesy old Athlon II, with 10-bit x264-encoded sources, even while encoding video with x264 at the same time as playback.

Anyway, are any of your madVR queues starved?

Try going into ReClock's advanced settings and "Clean up Video Timings Database", then start a video, let it get its groove back, and see what happens.

Last edited by noee; 18th January 2013 at 02:00.
noee is offline   Reply With Quote
Old 18th January 2013, 02:28   #17003  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by DragonQ View Post
In "PAL" video streams (1080i/25 or 576i/25), there's basically only two types of video you can get: true interlaced content (1080i/25 or 576i/25), or progressive content (1080p/25 or 576p/25). With interlaced content you want to deinterlace using the best algorithm available (e.g. vector/motion adaptive) to get a progressive video suitable for display (1080p/50 or 576p/50). For progressive content, you want to simply merge pairs of fields together because they belong to a single frame with no motion (i.e. weave deinterlacing).

However, when you tell a GPU to perform deinterlacing, it should detect whether the video is actually progressive and thus weave needs to be used rather than vector/motion adaptive. Therefore, you can basically leave deinterlacing on all the time ("video mode") and it'd play back everything perfectly (25p content would be frame-doubled to 50p but this should have no impact on image quality).

Unfortunately, there are apparently some combinations of GPUs and progressive video content where this doesn't happen as intended. Therefore you need to force "film mode" for these to get proper weave deinterlacing so that quality isn't sacrificed by unnecessary vector/motion adaptive deinterlacing.

I don't believe I have any such content (or my GPUs aren't affected) so I can't test this.
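To picture the two cases described in the quote above, here is a rough numpy sketch (toy arrays only; this is not what any GPU driver or madVR actually runs, and the motion threshold is an arbitrary made-up value):

Code:
import numpy as np

def weave(top_field, bottom_field):
    """Progressive content: the two fields are halves of one frame,
    so just interleave their lines again (nothing is lost)."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field       # even lines from the top field
    frame[1::2] = bottom_field    # odd lines from the bottom field
    return frame

def bob(field):
    """True interlaced content, dumbest fallback: line-double one field
    (real hardware uses motion/vector adaptive filtering instead)."""
    return np.repeat(field, 2, axis=0)

def motion_adaptive(prev_top, top_field, bottom_field, threshold=10):
    """Per-pixel choice: weave where nothing moved, bob where it did."""
    woven = weave(top_field, bottom_field)
    bobbed = bob(top_field)
    moving = np.abs(top_field.astype(int) - prev_top.astype(int)) > threshold
    moving = np.repeat(moving, 2, axis=0)   # expand the mask to full height
    return np.where(moving, bobbed, woven)

# 576i example: two 288-line fields -> one 576-line progressive frame
top = np.random.randint(0, 256, (288, 720), dtype=np.uint8)
bot = np.random.randint(0, 256, (288, 720), dtype=np.uint8)
print(weave(top, bot).shape)   # (576, 720)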
Okay, that's clear, but how is that related to my problem?

If a channel switches from 1080/25p to 1080/25i when using DXVA deinterlacing, shouldn't it detect that it needs to go from no deinterlacing (or "weave", as you describe, for the progressive source) to "vector" for the interlaced source?

Because in my case it doesn't. It always thinks the video is progressive; even when I change the deinterlacing in the AMD control panel to vector or motion, there is no deinterlacing, or just "weave" deinterlacing, for the 1080/25i video.

So since this only happens with madVR, is it a communication problem between the renderer and the AMD driver?

Why doesn't it work to set the deinterlacing to vector manually when this occurs?

Why does it work when forcing deinterlacing on in madVR?

If it can detect the change, shouldn't it be sending the information to change the deinterlacing type to the driver?
Dodgexander is offline   Reply With Quote
Old 18th January 2013, 04:56   #17004  |  Link
dansrfe
Registered User
 
Join Date: Jan 2009
Posts: 1,210
Two questions:

1) Is it better to use SVP in conjunction with madVR or does it not make a difference?

2) If I were to use SVP with madVR does the need for ReClock still arise or does SVP eliminate the problem which ReClock is trying to solve as well?
dansrfe is offline   Reply With Quote
Old 18th January 2013, 05:10   #17005  |  Link
QBhd
QB the Slayer
 
QBhd's Avatar
 
Join Date: Feb 2011
Location: Toronto
Posts: 697
I tried SVP but could not stand the look of it... It made everything nice and crisp, but it looked cheap and sterile, like a soap opera. Not my cup of tea, but some people love it.

QB
__________________
QBhd is offline   Reply With Quote
Old 18th January 2013, 08:42   #17006  |  Link
Qaq
AV heretic
 
Join Date: Nov 2009
Posts: 422
Quote:
Originally Posted by dansrfe View Post
2) If I were to use SVP with madVR does the need for ReClock still arise or does SVP eliminate the problem which ReClock is trying to solve as well?
As far as I remember, the SVP author himself used to use ReClock.
Qaq is offline   Reply With Quote
Old 18th January 2013, 09:43   #17007  |  Link
AndreaMG
Registered User
 
AndreaMG's Avatar
 
Join Date: Sep 2012
Location: Turin
Posts: 104
Quote:
Originally Posted by dansrfe View Post
Two questions:

1) Is it better to use SVP in conjunction with madVR or does it not make a difference?

2) If I were to use SVP with madVR does the need for ReClock still arise or does SVP eliminate the problem which ReClock is trying to solve as well?
1) SVP interpolates newly generated frames between the existing ones and works fine with madVR; it does make a difference if, like me, you prefer watching videos at 50/59.94 fps. It works via AviSynth; it would be awesome (for stability and performance) if it could be integrated into the video renderer.

2) ReClock is still needed unless you have perfect timing.

Please visit the SVP forum. The authors are nice guys, and the results SVP is able to give in my case are FAR BETTER than the frame interpolation of my TV set.
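To make "interpolates new frames between the existing ones" concrete, here is a naive Python sketch that doubles the frame rate by blending neighbouring frames. It only shows where the extra frames go; SVP estimates motion vectors and warps along them, which is why its output looks far better than plain blending:

Code:
import numpy as np

def double_fps_by_blending(frames):
    """frames: list of HxWx3 uint8 images at 25 fps -> roughly 50 fps."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        # synthetic in-between frame: plain 50/50 blend of the neighbours
        mid = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8)
        out.append(mid)
    out.append(frames[-1])
    return out

clip = [np.full((1080, 1920, 3), i, dtype=np.uint8) for i in range(0, 250, 10)]
print(len(clip), "->", len(double_fps_by_blending(clip)))   # 25 -> 49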
__________________
Raven RVZ01 * i7-4790k * 16GB RAM * Zotac GTX 970 4G * SSD 850Evo 500GB * Blu-Ray Burner Slot-In * PSU SFX 80+ Gold 450Watt * Windows 10 64bit * MPCHC+MadVR+SVP * Panasonic 50" VT30 ^^

Last edited by AndreaMG; 18th January 2013 at 10:29.
AndreaMG is offline   Reply With Quote
Old 18th January 2013, 09:56   #17008  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by DragonQ View Post
They all do better than this.
I'm not sure about that. First of all, see my reply after the next quote. Furthermore, your screenshot of the Cheese Slice test pattern does not show what proper motion adaptive deinterlacing with diagonal filtering looks like. Proper motion adaptive deinterlacing looks *much* better than that! You're putting way too much stock in the Cheese Slice tests. Also, AMD's and NVidia's implementations of motion adaptive deinterlacing seem to be far from optimal, as far as I can see...

Quote:
Originally Posted by DragonQ View Post
That sounds about right for motion adaptive deinterlacing but modern GPUs use vector adaptive deinterlacing, which is far better at reproducing the "original" full resolution progressive video than motion adaptive deinterlacing. Both the GTS250 and GT430 do some form of vector adaptive deinterlacing.

These examples are from the Cheese Slice test
Those Cheese Slice test patterns were the reason I said that NVidia and ATI *might* do some sort of motion compensation. But I've just checked the Cheese Slice test patterns again, and I have to *totally* change my opinion. Actually, it seems everybody has been misinterpreting the Cheese Slice tests! I've just found out that they're actually encoded as *FILM* with a 2:2 cadence. Just try it out: play those Cheese Slice tests with madVR film mode. madVR will detect a 2:2 cadence for both the 50i and 60i Cheese Slice tests. When motion starts, madVR takes a moment to lock onto the correct cadence, but once it has switched, playback is perfect. After that I looked at the separate fields in the video stream, and indeed, every two fields belong together and form a progressive frame. So the Cheese Slice tests must be totally disregarded as *video* mode deinterlacing tests. Motion compensation wouldn't work at all for this test pattern, because motion compensation expects movement between every field, which this test pattern does not have.

Edit: Btw, you can now use the 1080i50 Cheese Slice test patterns to check whether your GPU IVTC algorithm can handle PAL film content well... Compare it to madVR film mode. Use frame stepping and compare pixel by pixel. Or use the 1080i60 pattern to check how well your GPU IVTC can handle 2:2 cadences in 60i content. FWIW, madVR takes too long to switch to the correct cadence here. That's something I'll work on in the future.

Edit2: Disable the option "only look at pixels in the frame center" in the madVR settings, and madVR film mode will play all 1080i Cheese Slice test patterns with identical quality to the original progressive source, which is much better than anything AMD and NVidia seem to be able to do.
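To illustrate the idea (this is NOT madVR's actual detection code, just a numpy toy): weave every pair of fields and measure the combing; if one of the two possible pairings is consistently clean, the "interlaced" stream is really film with a 2:2 cadence.

Code:
import numpy as np

def combing_score(top_field, bottom_field):
    """High when the two fields come from different moments in time (video),
    near zero when they are halves of one progressive frame (film)."""
    woven = np.empty((2 * top_field.shape[0], top_field.shape[1]))
    woven[0::2], woven[1::2] = top_field, bottom_field
    # vertical high-frequency energy = comb teeth between adjacent lines
    return float(np.abs(np.diff(woven, axis=0)).mean())

def looks_like_22_film(fields, ratio=3.0):
    """fields: alternating top/bottom fields. Compare the two possible
    pairings; a clear winner means a stable 2:2 cadence."""
    a = np.mean([combing_score(fields[i], fields[i + 1])
                 for i in range(0, len(fields) - 1, 2)])
    b = np.mean([combing_score(fields[i], fields[i + 1])
                 for i in range(1, len(fields) - 1, 2)])
    # with no motion both pairings score the same and no cadence can be
    # locked; that is exactly why detection only kicks in once the pattern
    # starts moving
    return min(a, b) * ratio < max(a, b)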

Quote:
Originally Posted by xabregas View Post
And I'm paying for that already; these GPU drivers are making me crazy. madVR is not smooth at all; I disabled everything I found in the CCC video section, and still, sometimes it's good and smooth, other times it's not...

Maybe it's PowerPlay changing the clocks for power saving or some other power-saving crap; how can I disable that waste of technology in CCC?? I think the problem is CCC: I can see the clocks go to 1000 MHz, then drop to 300 and back up, and that's making madVR exclusive mode unstable.
You see the clocks changing while video playback is running? That's pretty bad. I think there are some tweak tools which allow you to fix the clocks at a specific value, but I'm not sure which tools can do that. FWIW, I'm using a 7770 in my development PC and on win7 x64 playback is always smooth in fullscreen exclusive mode. So I'm not sure why you have problems.

Quote:
Originally Posted by xabregas View Post
I only have pulldown detection set to auto and "forcing smooth playback" enabled, and that's all...
Please disable "forcing smooth playback". Nobody even knows what that does exactly.

Quote:
Originally Posted by xabregas View Post
Ah, I have Aero disabled. With Aero enabled I only get errors, f&$/&$/ Aero.
You shouldn't get errors with Aero enabled. What kind of errors do you get?

Some suggestions:

(1) Try with software decoding (just in case you have DXVA decoding enabled).
(2) Try without Reclock.
(3) If you still have problems, make a screenshot of the OSD in the situation where the problem occurs. If we're talking about FSE mode, then maybe PrintScreen won't work? (Not sure.) In that case, write down the numbers of all the queues at the moment the problem occurs. Then we may be able to identify where the bottleneck is.
(4) Does the madVR OSD report frame drops when the problem occurs?
(5) Are you playing progressive files or interlaced files? Try progressive files (e.g. Blu-Ray), because they're easier to play. Does the problem only occur with interlaced files, or with progressive files, too?

Quote:
Originally Posted by Dodgexander View Post
Okay, that's clear, but how is that related to my problem?
Not at all. Nobody said it would... Film vs video has nothing to do with your problem.

Last edited by madshi; 18th January 2013 at 11:25.
madshi is offline   Reply With Quote
Old 18th January 2013, 11:33   #17009  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by leeperry View Post
It's only recently that I've gained interest in interlaced material, but CUVID does a hell of a job deinterlacing 29.97 fps video material to 59.94 fps. Quite frankly, it looks fantastic to me. James Cameron keeps saying that frame rate is more important than resolution (trying to push for 1080/48p instead of 4K/24p), and it sure doesn't look 288p to me.

Now, how much resolution do we lose when watching an NTSC (video mode) DVD deinterlaced to 59.94 fps? GPUs use fairly advanced algorithms, IIRC.
It's less "resolution loss" and more "deinterlacing artefacts". But they're usually difficult to spot at normal viewing distances with most material, assuming your deinterlacer is good.

Quote:
Originally Posted by Dodgexander View Post
Okay, that's clear, but how is that related to my problem?

If a channel switches from 1080/25p to 1080/25i when using DXVA deinterlacing, shouldn't it detect that it needs to go from no deinterlacing (or "weave", as you describe, for the progressive source) to "vector" for the interlaced source?

Because in my case it doesn't. It always thinks the video is progressive; even when I change the deinterlacing in the AMD control panel to vector or motion, there is no deinterlacing, or just "weave" deinterlacing, for the 1080/25i video.

So since this only happens with madVR, is it a communication problem between the renderer and the AMD driver?

Why doesn't it work to set the deinterlacing to vector manually when this occurs?

Why does it work when forcing deinterlacing on in madVR?

If it can detect the change, shouldn't it be sending the information to change the deinterlacing type to the driver?
I believe this is a bug in madVR that madshi mentioned a few pages back: basically, it only checks whether the video is interlaced or not at the start. You can get around this by setting LAV Video Decoder to "aggressive" deinterlacing.

With this enabled, my MBAFF videos recorded from DVB-T2 correctly switch between progressive and interlaced.
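A toy model of that behaviour (the names below are made up for illustration, not LAV's or madVR's actual API): if the renderer samples the interlaced flag only once, an MBAFF channel that starts out progressive never gets deinterlaced after it switches.

Code:
from dataclasses import dataclass

@dataclass
class Picture:
    interlaced: bool        # per-picture flag, as in an MBAFF stream

def render_checking_once(pictures):
    deinterlace = pictures[0].interlaced            # decided once, at start
    return ["deinterlace" if deinterlace else "weave/none" for _ in pictures]

def render_checking_every_picture(pictures):
    return ["deinterlace" if p.interlaced else "weave/none" for p in pictures]

stream = [Picture(False)] * 100 + [Picture(True)] * 100   # 25p -> 25i switch
print(render_checking_once(stream)[150])            # 'weave/none'  (wrong)
print(render_checking_every_picture(stream)[150])   # 'deinterlace' (right)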

Quote:
Originally Posted by madshi View Post
I'm not sure about that. First of all, see my reply after the next quote. Furthermore, your screenshot of the Cheese Slice test pattern does not show what proper motion adaptive deinterlacing with diagonal filtering looks like. Proper motion adaptive deinterlacing looks *much* better than that! You're putting way too much stock in the Cheese Slice tests. Also, AMD's and NVidia's implementations of motion adaptive deinterlacing seem to be far from optimal, as far as I can see...
I've mentioned them once and haven't used them in years, so I disagree with that comment.

I know what crappy deinterlacing ("288p") looks like; for example, the UK Coupling DVDs all exhibit this (576p/25 made from 576i/25, but they've clearly just taken every other field and line-doubled it). I do not see this with "progressive video in an interlaced stream" material on any of my PCs or TVs.

Quote:
Originally Posted by madshi View Post
Those Cheese Slice test patterns were the reason I said that NVidia and ATI *might* do some sort of motion compensation. But I've just checked the Cheese Slice test patterns again, and I have to *totally* change my opinion. Actually, it seems everybody has been misinterpreting the Cheese Slice tests! I've just found out that they're actually encoded as *FILM* with a 2:2 cadence. Just try it out: play those Cheese Slice tests with madVR film mode. madVR will detect a 2:2 cadence for both the 50i and 60i Cheese Slice tests. When motion starts, madVR takes a moment to lock onto the correct cadence, but once it has switched, playback is perfect. After that I looked at the separate fields in the video stream, and indeed, every two fields belong together and form a progressive frame. So the Cheese Slice tests must be totally disregarded as *video* mode deinterlacing tests. Motion compensation wouldn't work at all for this test pattern, because motion compensation expects movement between every field, which this test pattern does not have.

Edit: Btw, you can now use the 1080i50 Cheese Slice test patterns to check whether your GPU IVTC algorithm can handle PAL film content well... Compare it to madVR film mode. Use frame stepping and compare pixel by pixel. Or use the 1080i60 pattern to check how well your GPU IVTC can handle 2:2 cadences in 60i content. FWIW, madVR takes too long to switch to the correct cadence here. That's something I'll work on in the future.

Edit2: Disable the option "only look at pixels in the frame center" in the madVR settings, and madVR film mode will play all 1080i Cheese Slice test patterns with identical quality to the original progressive source, which is much better than anything AMD and NVidia seem to be able to do.
This is really interesting; I'll have a look when I get home later. It doesn't change the fact that my GPUs seem to handle 2:2 cadences fine, but maybe there is a better way of seeing the effects of different deinterlacing algorithms on native interlaced content.

Presumably the best way to do this would be to create a 1080p/50 video, then cut away half of each frame to get a 1080i/25 video, then compare the two with different algorithms.
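A rough sketch of that test (assuming the 1080p/50 source is available as a list of luma frames; names and shapes are only for illustration): keep the even lines of frame 0, the odd lines of frame 1, and so on, which gives a genuine 1080i/25 field sequence that can be run through different deinterlacers and diffed pixel by pixel against the untouched progressive frames.

Code:
import numpy as np

def progressive_to_interlaced(frames_p50):
    """1080p/50 luma frames -> 1080i/25 field sequence (50 fields/s)."""
    fields = []
    for n, frame in enumerate(frames_p50):
        parity = n % 2                           # alternate top/bottom field
        fields.append(frame[parity::2].copy())   # keep 540 of the 1080 lines
    return fields

src = [np.random.randint(0, 256, (1080, 1920), dtype=np.uint8) for _ in range(50)]
fields = progressive_to_interlaced(src)
print(len(fields), fields[0].shape)              # 50 fields of (540, 1920)

# after deinterlacing 'fields' with algorithm X, something like
# np.abs(result.astype(int) - original.astype(int)).max() per frame shows
# how much of the original detail survived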


EDIT: Hmm, for some reason forcing film or video mode doesn't do anything on my laptop (HD4000).
EDIT 2: Oh OK, forcing film mode only works for software decoding.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7

Last edited by DragonQ; 18th January 2013 at 14:57.
DragonQ is offline   Reply With Quote
Old 18th January 2013, 14:55   #17010  |  Link
mzso
Registered User
 
Join Date: Oct 2009
Posts: 930
Quote:
Originally Posted by xabregas View Post
I got the ATI; I read it too late.

And I'm paying for that already; these GPU drivers are making me crazy. madVR is not smooth at all; I disabled everything I found in the CCC video section, and still, sometimes it's good and smooth, other times it's not...
Don't get too bothered. It often sucks with NVidia too. Sometimes the video gets jerky no matter what renderer you use, and only a system or driver restart helps.

The last time it happened, I noticed on the madVR OSD that the clock deviation was like 20%, which is funny...

For a while it seemed the problem was fixed, then it came back again.
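For context, the clock deviation shown on the OSD is essentially a relative error between a measured clock and its nominal value (an assumed simplification, not madVR's exact code). Real oscillators are off by hundredths of a percent, which is why a 20% reading looks so funny:

Code:
def clock_deviation_pct(measured_hz, nominal_hz):
    """Relative clock error in percent (assumed simplification)."""
    return 100.0 * (measured_hz - nominal_hz) / nominal_hz

print(f"{clock_deviation_pct(59.952, 59.940):+.3f} %")   # +0.020 %  typical
print(f"{clock_deviation_pct(71.93, 59.94):+.3f} %")     # +20.003 % nonsense:
# a number that large means the clock is being measured wrongly, not that
# the hardware clock is actually that far off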
mzso is offline   Reply With Quote
Old 18th January 2013, 14:56   #17011  |  Link
mzso
Registered User
 
Join Date: Oct 2009
Posts: 930
Quote:
Originally Posted by QBhd View Post
I tried SVP but could not stand the look of it... It made everything nice and crisp, but it looked cheap and sterile, like a soap opera. Not my cup of tea, but some people love it.

QB
It still amazes me how people can fall in love with artifacts, noise and the like.

Quote:
Originally Posted by Qaq View Post
As far as I remember, the SVP author himself used to use ReClock.
Speaking of SVP, is it only possible to use it with the ffdshow software filter? Since it runs on the CPU, and without hardware acceleration the decoding does too, my CPU is far from sufficient and the video card's acceleration capabilities are wasted.

Last edited by mzso; 18th January 2013 at 15:10.
mzso is offline   Reply With Quote
Old 18th January 2013, 15:06   #17012  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by mzso View Post
Don't get too bothered. It often sucks with NVidia too. Sometimes the video gets jerky no matter what renderer you use, and only a system or driver restart helps.

The last time it happened, I noticed on the madVR OSD that the clock deviation was like 20%, which is funny...

For a while it seemed the problem was fixed, then it came back again.
I remember trying madVR on my HTPC when I first got it a year ago: I couldn't avoid frame drops no matter what settings I used, despite GPU and CPU usage being nowhere near 100%. I might get around to trying it again now that madVR, LAV Filters and MPC-HC have all matured a bit.

The only time I get dodgy playback in MediaPortal (EVR) is when the GPU and/or driver craps out. This happens rarely but inexplicably. Basically I just get frame drops all over the place for no apparent reason and I have to restart the PC. The other day it was only happening during certain scenes on certain channels - e.g. the adverts would all be fine, but the programme would be crappy (despite everything being interlaced) and another channel would be fine.

Oh, the joys of video playback.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
DragonQ is offline   Reply With Quote
Old 18th January 2013, 15:10   #17013  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
@DragonQ, a year ago the NVidia drivers had serious bugs when using madVR fullscreen exclusive mode. These bugs have been fixed in the meantime (at least when using Win7 or Win8 x64). For windowed mode you can now also use Overlay in madVR, which could also help avoid frame drops.
madshi is offline   Reply With Quote
Old 18th January 2013, 15:24   #17014  |  Link
truexfan81
Registered User
 
truexfan81's Avatar
 
Join Date: Nov 2012
Posts: 138
Quote:
Originally Posted by madshi View Post
@DragonQ, a year ago the NVidia drivers had serious bugs when using madVR fullscreen exclusive mode. These bugs have been fixed in the meantime (at least when using Win7 or Win8 x64). For windowed mode you can now also use Overlay in madVR, which could also help avoid frame drops.
I use overlay, and right now, 36 minutes into a DVD, I have:
1 frame repeat every 29.94 seconds
dropped frames: 11
delayed frames: 3

All queues are full; I guess it's the vsync causing the frame drops?
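For what it's worth, one repeat every ~30 seconds is about what a tiny mismatch between the display clock and the video rate produces. A back-of-the-envelope sketch (the frame rate and refresh figures below are made-up example values, not measurements from this system):

Code:
# if the display runs slightly faster than an exact multiple of the video
# frame rate, one frame has to be shown for an extra refresh every so often
video_fps = 29.97                 # e.g. an NTSC DVD (assumed here)
nominal_refresh = 2 * video_fps   # 59.94 Hz would be a perfect match
measured_refresh = 59.9734        # hypothetical, slightly fast display clock

drift_hz = abs(measured_refresh - nominal_refresh)
print(f"one repeated frame roughly every {1.0 / drift_hz:.1f} s")   # ~29.9 s
# a mismatch of this kind is what ReClock tries to absorb by adjusting the
# playback speed; it accumulates slowly, so the queues can stay full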
truexfan81 is offline   Reply With Quote
Old 18th January 2013, 15:52   #17015  |  Link
AndreaMG
Registered User
 
AndreaMG's Avatar
 
Join Date: Sep 2012
Location: Turin
Posts: 104
Quote:
Originally Posted by mzso View Post
Speaking of SVP, is it only possible to use it with the ffdshow software filter? Since it runs on the CPU, and without hardware acceleration the decoding does too, my CPU is far from sufficient and the video card's acceleration capabilities are wasted.
It runs under AviSynth, so the ffdshow raw video filter is needed. It runs on CPU+GPU (OpenCL) or CPU only.
__________________
Raven RVZ01 * i7-4790k * 16GB RAM * Zotac GTX 970 4G * SSD 850Evo 500GB * Blu-Ray Burner Slot-In * PSU SFX 80+ Gold 450Watt * Windows 10 64bit * MPCHC+MadVR+SVP * Panasonic 50" VT30 ^^

Last edited by AndreaMG; 18th January 2013 at 15:57.
AndreaMG is offline   Reply With Quote
Old 18th January 2013, 15:59   #17016  |  Link
Qaq
AV heretic
 
Join Date: Nov 2009
Posts: 422
Quote:
Originally Posted by mzso View Post
Speaking of SVP, is it only possible to use it with the ffdshow software filter? Since it runs on the CPU, and without hardware acceleration the decoding does too, my CPU is far from sufficient and the video card's acceleration capabilities are wasted.
1. SVP - http://www.svp-team.com/
Quote:
SVP allows you to watch any video file on your PC with frame interpolation (like you can watch it on high-end TVs and projectors). It increases the frame rate by generating intermediate animation frames between existing ones to produce very smooth, fluid and clear motion.
SVP provides GPU acceleration, and it's possible to watch Full HD 1080p video recalculated to 60 Hz in real time with a mid-range CPU and almost any GPU hardware.
2. Dmitri Render - http://dmitrirender.narod.ru/ Even more GPU-assisted. Not freeware; free 30-day trial.
Qaq is offline   Reply With Quote
Old 18th January 2013, 16:11   #17017  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by DragonQ View Post
I believe this is a bug in madVR that madshi mentioned a few pages back: basically, it only checks whether the video is interlaced or not at the start. You can get around this by setting LAV Video Decoder to "aggressive" deinterlacing.

With this enabled, my MBAFF videos recorded from DVB-T2 correctly switch between progressive and interlaced.
Brilliant, thanks!! And a Coupling fan? Even more brilliant; it should be more famous than it is!

So if I want to use YADIF2 deinterlacing in LAV for standard-definition DVB channels, is there any way to use LAV without it also applying YADIF to the HD channels?

At the moment the only solution I have is to uncheck H.264 in LAV, but then I have to use the CyberLink decoder for 1080i instead, and I'd rather keep using LAV.
Dodgexander is offline   Reply With Quote
Old 18th January 2013, 16:17   #17018  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Is there a particular reason you want to use YADIF? In most cases hardware deinterlacing is better.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
DragonQ is offline   Reply With Quote
Old 18th January 2013, 16:33   #17019  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by madshi View Post

You see the clocks changing while video playback is running? That's pretty bad. I think there are some tweak tools which allow you to fix the clocks at a specific value, but I'm not sure which tools can do that. FWIW, I'm using a 7770 in my development PC and on win7 x64 playback is always smooth in fullscreen exclusive mode. So I'm not sure why you have problems.
I solved the problem by creating a profile and editing it in Notepad: I set the clock to 1000 in every PowerPlay default config (there were 3) and the memory to 1125, with the voltage at the stock value, which is 175. The memory voltage is 0000 in all settings, so I guess it can't be changed...

Now the clocks are locked.

Quote:
Originally Posted by madshi View Post
Please disable "forcing smooth playback". Nobody even knows what that does exactly.
Ok, will disable it

Quote:
Originally Posted by madshi View Post
You shouldn't get errors with Aero enabled. What kind of errors do you get?

Some suggestions:

(1) Try with software decoding (just in case you have DXVA decoding enabled).
(2) Try without Reclock.
(3) If you still have problems, make a screenshot of the OSD in the situation where the problem occurs. If we're talking about FSE mode, then maybe PrintScreen won't work? (Not sure.) In that case, write down the numbers of all the queues at the moment the problem occurs. Then we may be able to identify where the bottleneck is.
(4) Does the madVR OSD report frame drops when the problem occurs?
(5) Are you playing progressive files or interlaced files? Try progressive files (e.g. Blu-Ray), because they're easier to play. Does the problem only occur with interlaced files, or with progressive files, too?
With Aero enabled I feel vsync is applied twice somehow; maybe it's only in my head. Do I have to disable desktop composition in the general settings when using exclusive mode with Aero enabled, or is it fine with only exclusive mode checked?

(1) I always use software decoding; I tried both LAV and CoreAVC, and CoreAVC is much smoother.

(2) Yes, and guess what: madVR started being smooth after disabling ReClock. The sound was bad after that, but I went to the loft and brought down my old PCI X-Fi XtremeMusic, and the sound is good again, so there's no need for ReClock.

(3) I will post my OSD here after playing a progressive video, with everything you said disabled, now with locked clocks and with ATI's brand new 13.1 drivers.

Thanks

Last edited by xabregas; 18th January 2013 at 16:37.
xabregas is offline   Reply With Quote
Old 18th January 2013, 16:35   #17020  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by mzso View Post
Don't get too bothered. It often sucks with NVidia too. Sometimes the video gets jerky no matter what renderer you use, and only a system or driver restart helps.

The last time it happened, I noticed on the madVR OSD that the clock deviation was like 20%, which is funny...

For a while it seemed the problem was fixed, then it came back again.
Damn.

The problem is, I think this GPU is very fast, and the PQ is amazing with Jinc and Lanczos with AR enabled, but there are too many things in CCC's "amazing" settings screwing with madVR.
xabregas is offline   Reply With Quote