Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Doom9's Forum > Hardware & Software > Software players
Old 26th February 2013, 12:52   #17781  |  Link
G_M_C
Registered User
 
Join Date: Feb 2006
Posts: 1,076
Quote:
Originally Posted by chros View Post
Hm, looks promising.
What do you think about a smaller GPU: the nvidia 640m or 645m (and a smaller CPU)? In that case the heating wouldn't be such a problem ...

Thanks!
I've done just the opposite: I've installed an MSI 7850 Power Edition and underclocked & undervolted it. I've got plenty of GPU power and very low noise. GPU temps don't rise above 46 degrees C and the fan doesn't go faster than ~35%, and that is while running benchmarks like Unigine and so forth. Power draw is also minimized because of the underclocking/undervolting.

The option of 'going big' and undervolting/underclocking also leaves room for expansion of madVR's options, without having to switch GPU again because of lack of performance.

But it's just a thought.
G_M_C is offline   Reply With Quote
Old 26th February 2013, 12:52   #17782  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by Hypernova View Post
Are you sure about that? From my testing, the HD4000 can handle Jinc3 quite well. It had no problem with 1080p, but for 2560x1600 the highest overclock I can get on my chip almost makes it possible. Too bad it's just almost. I don't really know with AR turned on, but I remember trying it and it didn't make much of a difference. And that was before the Jinc3 performance improvement madshi did a while ago.
Well, I wasn't overclocking, but my HD4000 definitely can't handle Jinc3. Maybe that was just for interlaced content; I can't remember.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
DragonQ is offline   Reply With Quote
Old 26th February 2013, 13:54   #17783  |  Link
Heuer
Registered User
 
Join Date: Jan 2012
Posts: 15
Just tried cyberbeing's madVR settings and they work very well for me. I am using an i3 and an MSI GT650Ti.
Heuer is offline   Reply With Quote
Old 26th February 2013, 14:13   #17784  |  Link
namaiki
Registered User
 
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
What are people doing to get MPC-HC and madVR to use the "High-performance NVIDIA processor" in an NVIDIA Optimus computer setup? Is it still just renaming mpc-hc.exe and setting it up manually in the NVIDIA Control Panel?

Also, are there any other specific optimisations I should know of? I've been getting random stutters here and there since I enabled Optimus. I didn't have any stuttering with manual switching, but I think Optimus is supposed to be more convenient to use.

Last edited by namaiki; 26th February 2013 at 14:25.
namaiki is offline   Reply With Quote
Old 26th February 2013, 14:54   #17785  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
IIRC I was able to get madVR & MPC-HC to use my nVidia card simply by using the nVidia Control Panel (no renaming necessary). This is easily checked with GPU-Z anyway.

Regarding the random stuttering, have you checked your Windows power settings? "Optimize video quality" should be enabled at least.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
DragonQ is offline   Reply With Quote
Old 26th February 2013, 14:58   #17786  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Force flushing for everything has solved my issue with presentation glitches for complex video, especially with FRC.
aufkrawall is offline   Reply With Quote
Old 26th February 2013, 15:12   #17787  |  Link
namaiki
Registered User
 
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
Quote:
IIRC I was able to get madVR & MPC-HC to use my nVidia card simply by using the nVidia Control Panel (no renaming necessary). This is easily checked with GPU-Z anyway.
On my setup, in the Nvidia Control Panel, 'mpc-hc.exe' is stuck on 'integrated' and the option is greyed out, so I can't change it to high-performance; it seems I have to rename mpc-hc.exe to something else to create a new program profile for Optimus. With 'mpc-hc.exe', GPU-Z reports only the Intel GPU being used, even when I open mpc-hc.exe from the right-click context menu and select the 'High-performance' option there.

Quote:
Force flushing for everything has solved my issue with presentation glitches for complex video, especially with FRC.
Which level of flushing did you have enabled? Just 'flush' or did you use any of the "flush and wait" options?
namaiki is offline   Reply With Quote
Old 26th February 2013, 16:11   #17788  |  Link
wanezhiling
Registered User
 
Join Date: Apr 2011
Posts: 1,184
Quote:
Originally Posted by namaiki View Post
On my setup, in the Nvidia Control Panel, 'mpc-hc.exe' is stuck on 'integrated' and the option is greyed out, so I can't change it to high-performance; it seems I have to rename mpc-hc.exe to something else to create a new program profile for Optimus. With 'mpc-hc.exe', GPU-Z reports only the Intel GPU being used, even when I open mpc-hc.exe from the right-click context menu and select the 'High-performance' option there.
You can use MPC-BE instead.
wanezhiling is offline   Reply With Quote
Old 26th February 2013, 16:42   #17789  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by namaiki View Post
Which level of flushing did you have enabled? Just 'flush' or did you use any of the "flush and wait" options?
Flush & sleep led to dropped frames.

Are there any disadvantages when using flush for everything?
aufkrawall is offline   Reply With Quote
Old 26th February 2013, 17:39   #17790  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Quote:
Originally Posted by aufkrawall View Post
Flush & sleep led to dropped frames.
I saw this random frame-dropping issue on my HD7970 after upgrading from an nVidia GTX260+. After changing all settings to No Flush, the issue seems to have disappeared.
pie1394 is offline   Reply With Quote
Old 26th February 2013, 17:54   #17791  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Hi Madshi,

Have you investigated whether it is possible for madVR to present a 10-bit or 16-bit frame buffer to an AMD/ATI GPU's HDMI interface when it is sending a 1080p 10-bit signal to the TV?

Or is this already supported automatically?

I just noticed that the Catalyst 13.2 beta6 driver automatically sets the HD7970's HDMI output to a 1080p 10-bit signal for the Sony KDL-65HX920 TV. Although the TV's super-resolution engine still does quite a good job, I think 10-bit output should show some additional advantage on scaled chroma pixels and on 10-bit content's continuous color shades.
pie1394 is offline   Reply With Quote
Old 26th February 2013, 18:38   #17792  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by G_M_C View Post
I've done just the opposite: I've installed an MSI 7850 Power Edition and underclocked & undervolted it. I've got plenty of GPU power and very low noise. GPU temps don't rise above 46 degrees C and the fan doesn't go faster than ~35%, and that is while running benchmarks like Unigine and so forth. Power draw is also minimized because of the underclocking/undervolting.

The option of 'going big' and undervolting/underclocking also leaves room for expansion of madVR's options, without having to switch GPU again because of lack of performance.

But it's just a thought.
Thanks for the feedback (and I agree with you in your case), but I was talking about a laptop GPU: the GT640m.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 26th February 2013, 19:16   #17793  |  Link
fairchild
Registered User
 
Join Date: Sep 2010
Posts: 321
So I just changed all my flush settings in the Exclusive mode settings to "don't flush" and my rendering times dropped dramatically. I had been seeing rendering times of 20-40ms on average, depending on scaling and whether Smooth Motion was on; now I'm sub-1ms (it hovers around 0.05-0.10ms). I also have "use a separate device for presentation" enabled. Hopefully there is no downside to doing this and everything will remain as smooth as can be. I run Bicubic75 with AR enabled for chroma/image upscaling and Catmull-Rom with AR + LL enabled for image downscaling on my 1080p plasma, which only has a 60Hz refresh rate mode.
__________________
MPC-HC/MPC-BE, Lav Filters, MadVR
CPU: AMD Ryzen 5 1600, Video: AMD Radeon RX Vega 56 -> TCL S405 55", Audio: Audio-Technica M50S
fairchild is offline   Reply With Quote
Old 26th February 2013, 19:25   #17794  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by pie1394 View Post
I saw this random frame dropping issue on my HD7970 after the upgrade from nVidia GTX260+. After all settings are changed to No Flush, this issue seems to disappear.
Yup, no flush works for me too.

Quote:
Originally Posted by fairchild View Post
I also have "use a separate device for presentation" enabled. Hopefully there is no downside to doing this and everything will remain as smooth as can be.
With this enabled I'm getting lots of presentation glitches with complex video.
aufkrawall is offline   Reply With Quote
Old 26th February 2013, 20:04   #17795  |  Link
Coldblackice
Registered User
 
Join Date: Jul 2011
Posts: 10
Quote:
Originally Posted by fairchild View Post
So I just changed all my flush settings in the Exclusive mode settings to "don't flush" and my rendering times dropped dramatically. I had been seeing rendering times of 20-40ms on average, depending on scaling and whether Smooth Motion was on; now I'm sub-1ms (it hovers around 0.05-0.10ms). I also have "use a separate device for presentation" enabled. Hopefully there is no downside to doing this and everything will remain as smooth as can be. I run Bicubic75 with AR enabled for chroma/image upscaling and Catmull-Rom with AR + LL enabled for image downscaling on my 1080p plasma, which only has a 60Hz refresh rate mode.
Quote:
Originally Posted by aufkrawall View Post
Yup, no flush works for me too.


With this enabled I'm getting lots of presentation glitches with complex video.
Is there somewhere that explains what things such as "flushing", "separate device for presentation", etc. mean? I can sort of quasi-understand the mechanics, but not well enough to learn by tinkering around on my own.

Much obliged, if so.
Coldblackice is offline   Reply With Quote
Old 26th February 2013, 20:41   #17796  |  Link
konakona
Registered User
 
Join Date: Dec 2012
Posts: 26
Hello there. I've got a problem with madVR: after changing video card from a GTX260 to a GTX660 I can't play back movies using madVR (EVR works just fine). I get about 3 frames per second; the rest are dropped.

http://i52.tinypic.com/34ri2i0.jpg - there are only a few dropped frames in the screenshot, but that's because I hit Ctrl+R just before taking it; it was 600+ dropped before.
I'm using a custom resolution forced in the nvidia control panel: 72Hz with some custom reduced blanking.

Also, the card seems pretty unstable when it comes to refresh rate: when I managed to play 8-bit video without problems (I can't do it now for some reason), the reported refresh rate was around 70-72Hz, which is a huge difference. (I think the readout lies; it once showed over 77Hz, and my screen would black out above 76.6Hz or so.)

Most of the time the video remains black and I can't even bring up the OSD debug screen, but the sound is intact. Sometimes pausing and unpausing fixes it for 8-bit videos and they become watchable, but 10-bit videos drop like hell.

My filter chain is LAV -> xy-VSFilter -> madVR; I'm also using ReClock and the Haali splitter for MKV files.

Last edited by konakona; 26th February 2013 at 20:50.
konakona is offline   Reply With Quote
Old 26th February 2013, 20:47   #17797  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,336
The problem is that your display refresh rate could not be measured. I suggest doing a clean install of the driver and then carefully attempting your custom resolution again; it's very well possible that the custom resolution is not working properly, causing issues in the driver.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 26th February 2013, 21:05   #17798  |  Link
konakona
Registered User
 
Join Date: Dec 2012
Posts: 26
Quote:
Originally Posted by nevcairiel View Post
The problem is that your display refresh rate could not be measured. I suggest to do a clean install of the driver, and carefully attempt your custom resolution again, its very well possible that the custom resolution is not working properly causing issues in the driver.
It worked flawlessly on the 260 with the exact same parameters, and also on an HD4870 using PowerStrip; the screen OSD reports 1680x1050@72Hz.

Is the problem caused by me not reinstalling the nvidia drivers? It shouldn't matter, as there is one driver for all the cards, and the driver reports the card correctly as a GTX660.

Last edited by konakona; 26th February 2013 at 21:16.
konakona is offline   Reply With Quote
Old 26th February 2013, 21:36   #17799  |  Link
iSunrise
Registered User
 
Join Date: Dec 2008
Posts: 496
Quote:
Originally Posted by konakona View Post
It worked flawlessly on the 260 with the exact same parameters, and also on an HD4870 using PowerStrip; the screen OSD reports 1680x1050@72Hz.

Is the problem caused by me not reinstalling the nvidia drivers? It shouldn't matter, as there is one driver for all the cards, and the driver reports the card correctly as a GTX660.
Just do a clean install of the NV driver anyway (you can select that option after you've started the driver setup). If you have problems right after switching the card, there's not much else that can go wrong, especially not if you're still on NV.
iSunrise is offline   Reply With Quote
Old 26th February 2013, 21:57   #17800  |  Link
Blight
Software Developer
 
Blight's Avatar
 
Join Date: Oct 2001
Location: Israel
Posts: 1,005
madshi:
This is in fullscreen on an nvidia GT520, windowed overlay (Win7 64-bit), FSE disabled.
I notice it instantly the moment the GPU is overloaded (I can see it hit 100% in GPU-Z).
__________________
Yaron Gur
Zoom Player . Lead Developer
Blight is offline   Reply With Quote