Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 1st July 2016, 07:17   #38441  |  Link
kiwijunglist
Registered User
 
Join Date: Jun 2013
Posts: 23
Question regarding 3D, madVR, ReClock and MPC-HC.

I like to use ReClock to speed up 23.976 fps to 24 fps. However, ReClock doesn't seem to play nicely with 3D movies when using madVR and MPC-HC.

What are people's solutions for this? Is this a problem for you? Is there a good automatic solution?
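For reference, the speed-up involved here is tiny. This is an illustrative sketch of the arithmetic (the helper name is mine, not ReClock's API):

```python
# Hypothetical sketch: the speed-up applied when playing
# 23.976 fps (24000/1001) content at an even 24 fps.
from fractions import Fraction

def reclock_speedup(source_fps: Fraction, target_fps: Fraction) -> Fraction:
    """Ratio by which playback (and thus audio) is sped up."""
    return target_fps / source_fps

ntsc_film = Fraction(24000, 1001)  # "23.976" fps
ratio = reclock_speedup(ntsc_film, Fraction(24))
print(ratio)         # 1001/1000
print(float(ratio))  # 1.001, i.e. a 0.1% speed-up
```

A 0.1% change is why this trick is generally considered inaudible without pitch correction.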
Old 1st July 2016, 07:46   #38442  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,291
Quote:
Originally Posted by Ranguvar
Sure. But what else can I spend the power on, if I have it to spare and want to?
I think that's the wrong way to look at it. You should compare before and after screenshots and judge what to run based on those results.

Your call, but it seems pointless to spend ~100 W on something you don't even notice. For lower-res content those settings would make more of an impact, but upscaling 1080p with NNEDI3 and then downscaling with SSIM (likely 2D AR LL) isn't going to do much to the final result. Hell, you might be better off just using Jinc AR. If it were a 4K screen, that would be a different story.

Focus on results, not on how many NNEDI3 neurons you can set. This seems to be a recurring theme here: people take the short route, ticking boxes and setting things as high as they'll go, then run into performance issues as a result.
Old 1st July 2016, 08:07   #38443  |  Link
Blackfyre
Registered User
 
Join Date: Dec 2014
Posts: 71
Quote:
Originally Posted by Blackfyre
Upgrading my GPU tomorrow; going from an AMD HD 7970 (equivalent of a 280X) to an MSI GTX 1070 Gaming X.

Yep, that's a massive jump. Looking forward to finally toning up some of my madVR settings.

Anyone here with a GTX 1070? How are your experiences thus far?
Can someone explain to me what's going on here? I just got the GTX 1070, and I'm getting dropped frames like there's no tomorrow using the same settings I had with my five-year-old AMD card, the HD 7970. Mind you, I'm running SVP @ 75 fps too, with custom settings.

So without SVP everything works fine with madVR @ 24 fps, but @ 75 fps I get dropped frames...

Is this a driver issue? What kind of issue is it, exactly? I haven't used an NVIDIA GPU with madVR before...

Even without SVP, I can barely push madVR settings above the HD 7970 settings I had. You guys have no idea how disappointed I am right now...

Old 1st July 2016, 08:19   #38444  |  Link
kiwijunglist
Registered User
 
Join Date: Jun 2013
Posts: 23
Quote:
Originally Posted by jerryleungwh
Can someone please help me with it, with some detailed step-by-step instructions? I'd really appreciate it. Thanks!
You have to install the Intel software codec mentioned a few pages back. I had the same problem; see the replies to my previous questions.
Old 1st July 2016, 10:55   #38445  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by Blackfyre
Can someone explain to me what's going on here,
The right way would be to teach you how to troubleshoot it yourself, layer by layer.

First things first: disable SVP.

The rendering time (ms) should be less than the movie's frame time (ms), and that can only happen when all of the queues are full.
If one of the queues is not filling, there is a problem in the part of the chain responsible for filling that queue; you'll see it as a rendering time greater than the frame time.

Please report the status of your queues and your average rendering time.
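The rule of thumb above can be sketched numerically. This is a hypothetical illustration (the helper names are mine, not madVR's); it just compares average rendering time against the per-frame budget:

```python
# Illustrative helpers: madVR can only keep its queues full if the
# average rendering time stays below the display's per-frame time.
def frame_time_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

def can_keep_up(rendering_ms: float, fps: float) -> bool:
    """True if rendering is fast enough to keep the queues filled."""
    return rendering_ms < frame_time_ms(fps)

print(round(frame_time_ms(24000 / 1001), 2))  # ~41.71 ms budget at 23.976 fps
print(round(frame_time_ms(75), 2))            # ~13.33 ms budget at 75 fps (SVP)
print(can_keep_up(20.0, 24000 / 1001))        # True: queues can fill
print(can_keep_up(20.0, 75))                  # False: dropped frames
```

This also shows why the same settings that are fine at 24 fps can fall apart once SVP interpolates to 75 fps: the per-frame budget shrinks by more than a factor of three.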
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 1st July 2016 at 10:59.
Old 1st July 2016, 11:15   #38446  |  Link
jerryleungwh
Registered User
 
Join Date: Jun 2016
Posts: 39
Quote:
Originally Posted by kiwijunglist
You have to install the Intel software codec mentioned a few pages back. I had the same problem; see the replies to my previous questions.
May I ask why the Intel driver affects it? I know I have both an Intel and an NVIDIA GPU, but I've manually set it to use the NVIDIA one, so why does the Intel driver need to do anything?
Old 1st July 2016, 11:22   #38447  |  Link
Blackfyre
Registered User
 
Join Date: Dec 2014
Posts: 71
Quote:
Originally Posted by James Freeman
The right way would be to teach you how to troubleshoot it yourself, layer by layer.

First things first: disable SVP.

The rendering time (ms) should be less than the movie's frame time (ms), and that can only happen when all of the queues are full.
If one of the queues is not filling, there is a problem in the part of the chain responsible for filling that queue; you'll see it as a rendering time greater than the frame time.

Please report the status of your queues and your average rendering time.
Thanks for the quick reply. I seem to have fixed the issue: the GTX 1070's core and memory clocks were fluctuating (going from 200 MHz to 2000 MHz+), so I went into the NVIDIA Control Panel and set both MPC.exe & MPC64 to performance mode. This stopped the fluctuation of the core & memory clocks and keeps them at maximum as soon as I open MPC.

Which worked like a charm. Here's a picture with SVP running, with my SVP settings and the chroma upscaling you can see; in upscaling refinement I now also have SuperRes selected (4, linear light, anti-bloating and anti-ringing enabled, strength @ 100%).

http://imgur.com/OonTMo4

With the 280X I couldn't do SuperRes + Jinc (well, I did, but GPU usage was around 90% and the fans got really loud)...

Anyway, I still can't do image doubling; as soon as I select double luma and chroma resolution with NEDI, NNEDI3, or Super-XBR, the dropped frames begin.
Old 1st July 2016, 12:37   #38448  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,291
Quote:
Originally Posted by Blackfyre
Anyway, I still can't do image doubling; as soon as I select double luma and chroma resolution with NEDI, NNEDI3, or Super-XBR, the dropped frames begin.
And that would be because you're maxing out the GPU?
Old 1st July 2016, 13:11   #38449  |  Link
SamuelMaki
Registered User
 
Join Date: Sep 2011
Posts: 56
Quote:
Originally Posted by Blackfyre
http://imgur.com/OonTMo4

Anyway, I still can't do image doubling; as soon as I select double luma and chroma resolution with NEDI, NNEDI3, or Super-XBR, the dropped frames begin.
May I ask why you have chosen the integrated Intel one as the processing device instead of the NVIDIA GPU?
Old 1st July 2016, 14:51   #38450  |  Link
Blackfyre
Registered User
 
Join Date: Dec 2014
Posts: 71
Quote:
Originally Posted by ryrynz
And that would be because you're maxing out the GPU?
Yes, GPU usage increases from 60% before enabling double luma & chroma resolution to 99%+ after enabling it, and the dropped frames begin.

That's with SVP running @ 75 fps, of course.

EDIT: Note that 720p videos upscaled to 1080p run fine with NEDI double luma and chroma (GPU @ 93% usage), but 1080p videos drop frames.

Quote:
Originally Posted by SamuelMaki
May I ask why you have chosen the integrated Intel one as the processing device instead of the NVIDIA GPU?
Because that's for SVP, not madVR. It leaves extra headroom for madVR to work with.

My HD 4600, overclocked to 1.5 GHz, can assist SVP with GPU acceleration without too much trouble. This gave my 280X, and now my GTX 1070, more headroom to work with in madVR.

Last edited by Blackfyre; 1st July 2016 at 14:55.
Old 1st July 2016, 17:41   #38451  |  Link
Xaurus
Registered User
 
Join Date: Jun 2011
Posts: 286
Quote:
Originally Posted by Warner306
What are your maximum settings for 1080p content? Many would be curious.
I haven't finalized the settings yet, but I did a quick comparison with my 980 Ti settings.

You can see it here:
https://docs.google.com/spreadsheets...IdYcnZuIWkQMA/

I also underclock my graphics cards to be 100% silent, meaning no noise from the fans whatsoever (I am very picky about noise), so the card speeds are not what you get out of the box.

All in all, very happy with the 1080 for madVR!

edit: I am able to bump the NNEDI3 image doubling from 64 to 128 neurons now with the GTX 1080 for my main 1080p profile of typical 23.976 fps content at 4K res.
__________________
SETUP: Win 10, MPC-HC, LAV, MadVR
HARDWARE: Corsair 400Q | Intel Xeon E3-1260L v5 | Noctua NH-U9S | SuperMicro X11SSZ-TLN4F | Samsung 2x8GB DDR4 ECC | Samsung 850 EVO 1TB | MSI GTX 1060 | EVGA G2 750

Last edited by Xaurus; 1st July 2016 at 17:44.
Old 1st July 2016, 17:52   #38452  |  Link
jerryleungwh
Registered User
 
Join Date: Jun 2016
Posts: 39
I am having problems with 3D playback. I am using an Optimus computer that has both Intel and NVIDIA graphics, and for some reason I can't select the NVIDIA card for MPC-HC by normal means. I found a workaround online: make a copy of the MPC-HC exe and rename it; then I can choose the NVIDIA graphics card. However, when I play an MVC MKV with it, it immediately crashes my computer, and a blue screen pops up saying there is a problem with the Intel graphics driver. When I use the original exe, which only uses the Intel graphics card, it displays 3D content correctly but with no sound. What can I do to fix this? My setup bitstreams audio directly to my amp, but that isn't working when it's a 3D video.

Last edited by jerryleungwh; 1st July 2016 at 21:01.
Old 1st July 2016, 20:09   #38453  |  Link
AngelGraves13
Registered User
 
Join Date: Dec 2010
Posts: 239
Quote:
Originally Posted by ryrynz
Still recommend Super-XBR 100-125 for chroma upscaling; it's also a lot less resource-hungry than NNEDI3. I'd recommend comparing them and deciding for yourself.
100 is great; anything higher is a bit too sharp. It's easily noticeable in animation... it causes ringing.

Truth be told, I prefer Jinc for all my scaling needs. It has minimal artifacts and is clean without excessive sharpness. You can always SuperRes it up a bit to get it cleaner.
Old 1st July 2016, 22:51   #38454  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,475
Quote:
Originally Posted by AngelGraves13
100 is great; anything higher is a bit too sharp. It's easily noticeable in animation... it causes ringing.

Truth be told, I prefer Jinc for all my scaling needs. It has minimal artifacts and is clean without excessive sharpness. You can always SuperRes it up a bit to get it cleaner.
AB for SuperRes + SSIM AR lets you increase the super-xbr sharpness. Anyway, it's all about finding the magical-looking combo for your eyes, and I sure as hell did with SSIM 2D 100% + sxbr75 + SR@2 LL AB@75%. Not keen on messing with anything anymore, as everything looks glorious to me.
Old 2nd July 2016, 01:50   #38455  |  Link
Barnahadnagy
Registered User
 
Join Date: Apr 2014
Posts: 13
Quote:
Originally Posted by jerryleungwh
May I ask why the Intel driver affects it? I know I have both an Intel and an NVIDIA GPU, but I've manually set it to use the NVIDIA one, so why does the Intel driver need to do anything?
It's not the Intel driver; Intel just made the decoding library. The two are not related in any way.
Old 2nd July 2016, 02:36   #38456  |  Link
Sunspark
Registered User
 
Join Date: Nov 2015
Posts: 44
Hi, maybe this is just a lack of understanding on my part, in which case I apologize, but I wonder if the aspect ratio automatically selected for this 16:9 display is wrong.

For a TV file with a resolution of 1920x1088 and a draw resolution of 1920x1074, 1.79:1 is auto-selected. 1920x1088 is about 1.76:1, and 16:9 is 1.78:1. If one overrides it to 16:9, it pulls the picture a tiny bit taller.

Same with a cinema file: the draw resolution of 1920x808 is 2.38:1, and 21:9 (2.33:1) is auto-selected, while the file resolution is 1920x816, which is 2.35:1. Manually selecting 2.35:1 pulls it a tiny bit taller.

Pulling it taller makes circles rounder and less squat.

What am I missing here? Thanks.
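The ratios being compared above can be double-checked in a few lines (an illustrative helper, not madVR code):

```python
# Sanity-check the aspect ratios discussed above.
def aspect(width: int, height: int) -> float:
    """Width-to-height ratio, rounded to two decimals."""
    return round(width / height, 2)

print(aspect(1920, 1088))  # 1.76: the TV file's storage resolution
print(aspect(1920, 1074))  # 1.79: its draw resolution (auto-selected ratio)
print(aspect(16, 9))       # 1.78
print(aspect(1920, 816))   # 2.35: the cinema file's storage resolution
print(aspect(1920, 808))   # 2.38: its draw resolution
print(aspect(21, 9))       # 2.33
```

The small gaps between storage, draw, and override ratios are exactly what makes circles look slightly squat or tall depending on which one the renderer picks.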
Old 2nd July 2016, 03:05   #38457  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,232
If the file is correctly created, you don't have to change anything.

1088 and 816 lines are not standard heights, but as long as the file is correctly created the result should be fine.
Old 2nd July 2016, 11:36   #38458  |  Link
jerryleungwh
Registered User
 
Join Date: Jun 2016
Posts: 39
Quote:
Originally Posted by Barnahadnagy
It's not the Intel driver; Intel just made the decoding library. The two are not related in any way.
I've installed the driver, and if I choose to use the NVIDIA GPU it still crashes; if I use the Intel GPU, the 3D image shows correctly but the audio is gone. I usually bitstream the audio, but when I do this there's no audio at all.
Old 2nd July 2016, 12:46   #38459  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 1,590
Quote:
Originally Posted by Xaurus
I haven't finalized the settings yet, but I did a quick comparison with my 980 Ti settings.

You can see it here:
https://docs.google.com/spreadsheets...IdYcnZuIWkQMA/

I also underclock my graphics cards to be 100% silent, meaning no noise from the fans whatsoever (I am very picky about noise), so the card speeds are not what you get out of the box.

All in all, very happy with the 1080 for madVR!

edit: I am able to bump the NNEDI3 image doubling from 64 to 128 neurons now with the GTX 1080 for my main 1080p profile of typical 23.976 fps content at 4K res.
Thanks for the comparison! There's a significant difference between the two cards.
I also like the idea of underclocking (I do it with my laptop as well). What utility do you use for that? (I use NVIDIA Inspector.)
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)
Old 2nd July 2016, 13:29   #38460  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,291
Madshi, were you still interested in splitting up the neuron pass settings like in MPDN?