Old 13th March 2013, 00:47   #17981  |  Link
Niyawa
Registered User
 
 
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
Hey guys, another question, this time it might be a little easier, but I need someone to confirm this for me: would a GTX 675M be able to run Jinc? There's this fellow who has an Alienware but always gets dropped frames with Jinc. That's kinda strange since, as far as I know, that GPU is more powerful than a GTX 260.
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack
Old 13th March 2013, 02:15   #17982  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by Niyawa View Post
Hey guys, another question, this time it might be a little easier, but I need someone to confirm this for me: would a GTX 675M be able to run Jinc? There's this fellow who has an Alienware but always gets dropped frames with Jinc. That's kinda strange since, as far as I know, that GPU is more powerful than a GTX 260.
I can run Jinc 3 chroma and image upscaling using a Wolfdale @ 3 GHz with a 9600GT.

Using CUDA helps a lot, I think. I was amazed to see that work for me.
Old 13th March 2013, 02:32   #17983  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Quote:
Originally Posted by Niyawa View Post
Hey guys, another question, this time it might be a little more easier but I need someone to confirm this to me: would a GTX 675M be able to run Jinc? There's this fellow who has an Alienware but always get dropped frames with Jinc. That's kinda strange since far as I know that same GPU is more powerful than a GTX 260.
There might be something wrong in his setup...
What's his setup and player programs?

With a recent GeForce driver and such a powerful GPU, I think all madVR FLUSH options can be safely turned off. I have turned all of them OFF on my ION system (Win7 x64, 314.07, DVI 1920x1200p60, VSync ON). No dropped / repeated / delayed frames are observed as long as the GPU is not overloaded.


The GTX675M is obviously at least twice as powerful as the GTX260 in raw performance, and that doesn't even account for the architectural improvements. An overclocked GTX260+ is already capable of handling Jinc3+AR to 1920x1080 at more than 30 fps as long as there is no GPU-based HQ deinterlacing load.

GTX675M, GF114 Fermi-core 384 SP, 620 / 1240 MHz, 256-bit GDDR5-6000 (192GB/s)

GTX260, GT200 Tesla-core 192 SP, 576 / 1242 MHz, 448-bit GDDR3-2000 (110GB/s)

GTX260+ OC, GT200b Tesla-core 216 SP, 680 / 1466 MHz, 448-bit GDDR3-2100 (115GB/s)
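
As a rough sanity check of the "at least twice as powerful" claim, here is a minimal Python sketch that computes raw ratios from the shader counts, shader clocks and memory bandwidths quoted above; architectural (Fermi vs. Tesla) gains are deliberately ignored, so treat it as a lower bound rather than a benchmark.

Code:
# Raw-throughput ratios from the spec lines quoted above.
# Shader throughput is approximated as shader count x shader clock;
# architectural differences (Fermi vs. Tesla) are ignored on purpose.

cards = {
    # name: (shader_count, shader_clock_MHz, memory_bandwidth_GB_s)
    "GTX675M":    (384, 1240, 192),
    "GTX260":     (192, 1242, 110),
    "GTX260+ OC": (216, 1466, 115),
}

def shader_throughput(shaders, clock_mhz):
    """Relative shader throughput in arbitrary units (count x clock)."""
    return shaders * clock_mhz

base_sp, base_clk, base_bw = cards["GTX260"]
base_tp = shader_throughput(base_sp, base_clk)

for name, (sp, clk, bw) in cards.items():
    tp_ratio = shader_throughput(sp, clk) / base_tp
    bw_ratio = bw / base_bw
    print(f"{name:11s} shaders ~{tp_ratio:.2f}x GTX260, bandwidth ~{bw_ratio:.2f}x")
# GTX675M comes out at ~2.0x the shader throughput and ~1.75x the bandwidth.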
Old 13th March 2013, 03:48   #17984  |  Link
wanezhiling
Registered User
 
Join Date: Apr 2011
Posts: 1,184
Because that Alienware guy's madVR runs on the Intel GPU.
http://forum.doom9.org/showpost.php?...ostcount=10533
Old 13th March 2013, 05:48   #17985  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Oh well, I don't mean to get OT, but one thing that shouldn't be overlooked IMHO is the performance/watt ratio (and even more so in an HTPC environment): http://www.hardware.fr/articles/879-...nces-watt.html

This site doesn't allow picture hotlinking so I can't provide a translated URL, but the 650 Ti delivers 67 fps/100W in Battlefield 3 and the 660 only 54 fps/100W.
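
(Just to spell out how that fps-per-100W figure is derived - it is simply the benchmark frame rate normalised by measured board power - here is a tiny Python illustration; the wattages below are made-up placeholders, only the 67 and 54 fps/100W ratios come from the linked chart.)

Code:
# How an fps-per-100W figure is derived: average benchmark frame rate
# divided by measured board power, scaled to 100 W.
# The wattages here are hypothetical placeholders, not measurements.

def fps_per_100w(avg_fps, board_power_w):
    """Performance per watt, expressed as frames per second per 100 W."""
    return avg_fps / board_power_w * 100.0

print(fps_per_100w(60.0, 90.0))    # ~66.7, in the ballpark of the 650 Ti's 67
print(fps_per_100w(86.0, 160.0))   # ~53.8, in the ballpark of the 660's 54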

I really can't wrap my head around tossing 180€ on a 660, especially when the 660SE is around the corner and I don't care about games... so I'm pulling the trigger on a 90€ lightly used 1GB 650 Ti.

Buying a used 5xx doesn't make much sense as these are far less power efficient (34 fps/100W in BF3 for the 560 Ti), second-hand warranties aren't nearly as long anymore, and their prices just aren't competitive (560 Tis regularly go for 100€+ used).

Sure, the 650 Ti only has a 128-bit memory controller, but I'm quite sure it'll still be plenty fast in mVR anyway.

Last edited by leeperry; 13th March 2013 at 07:33.
Old 13th March 2013, 07:06   #17986  |  Link
turbojet
Registered User
 
Join Date: May 2008
Posts: 1,840
I'm having a bit of a problem with the display mode changer. I currently have 60, 72 and 75 Hz modes set up, and they all look fine. In madVR the mode changer list is: 1680x1050p60, 1680x1050p72, 1680x1050p75. It works fine for 24->72 and 30->60, but with 25 fps sources it insists on setting 60 Hz even if the monitor is set to 75 Hz. Then I installed ReClock, set it to slow 25 down to 24, and set madVR to treat 25 fps as 24. I get 24 fps according to the madVR status, but it always changes to 60 Hz. When 75 Hz is set, madVR/ReClock displays 74.984. Any idea what could be going on?

Also, with a 24 fps source and a 72 Hz refresh rate (71.99139 in the madVR status), smooth motion set to "only when judder" is enabled even though there is no judder with it disabled.
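
madVR's actual mode-change logic isn't spelled out here, but the behaviour being asked for (24 -> 72, 25 -> 75, 30 -> 60) amounts to picking the listed refresh rate that is closest to a whole multiple of the source frame rate. A minimal Python sketch of that rule, using the mode list above, just to make the expected mapping explicit (an illustration, not madVR's implementation):

Code:
# Minimal sketch of the refresh-rate matching rule described above:
# pick the configured mode whose refresh rate is closest to a whole
# multiple of the source frame rate. Illustration only.

modes_hz = [60.0, 72.0, 75.0]   # the 1680x1050 modes listed above

def best_mode(fps, modes):
    """Return the refresh rate with the least residual judder for `fps`."""
    def judder(hz):
        nearest_multiple = round(hz / fps) * fps
        return abs(hz - nearest_multiple)
    return min(modes, key=judder)

for fps in (23.976, 25.0, 29.97):
    print(f"{fps:>6} fps -> {best_mode(fps, modes_hz):.0f} Hz")
# Expected output: 23.976 -> 72 Hz, 25 -> 75 Hz, 29.97 -> 60 Hz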

Last edited by turbojet; 13th March 2013 at 07:52.
Old 13th March 2013, 07:48   #17987  |  Link
artios
Registered User
 
Join Date: Jan 2005
Posts: 7
Quote:
Originally Posted by pie1394 View Post
There might be something wrong in his setup...
What's his setup and player programs?

With recent GeForce driver + such powerful GPU, I think all madVR FLUSH options can be safely turned off. I have turned all of them OFF for my ION system (Win7x64, 314.07, DVI 1920x1200p60, VSync ON). No dropped / repeated / delayed frame is observed if the GPU is not overloaded.


The GTX675M is obviously at least twice as powerful as the GTX260 in raw performance. The architecture improving part is not even included yet. The GTX260+ OC has been already capable of handling more than 30 fps Jinc3+AR to 1920x1080 while there is no GPU-based HQ deinterlacing loading.

GTX675M, GF114 Fermi-core 384 SP, 620 / 1240 MHz, 256-bit GDDR5-6000 (192GB/s)

GTX260, GT200 Tesla-core 192 SP, 576 / 1242 MHz, 448-bit GDDR3-2000 (110GB/s)

GTX260+ OC, GT200b Tesla-core 216 SP, 680 / 1466 MHz, 448-bit GDDR3-2100 (115GB/s)
Hello, I am the member that Niyawa talked about.

I understand that my card is powerful. When I use the Intel GPU to play the following file
32.2 GB
Total Bitrate: 31.06 Mbps
Video: MPEG-4 AVC Video / 23275 kbps / 1080p / 23.976 fps / 16:9 / High Profile 4.1

I am getting maybe 3-4 drops per minute. The Nvidia driver decided to lock mpc-hc.exe to the Intel GPU. The renaming trick worked to use the Nvidia GPU, but unfortunately I am getting more drops this way. I think it's some sort of incompatibility or settings issue.

Can I try madVR through another program to see if it's not MPC-HC related?

Also, the madVR settings show me that "Sharpness" needs to be dealt with (it's shown in green).
Old 13th March 2013, 08:06   #17988  |  Link
Mangix
Audiophile
 
Join Date: Oct 2006
Posts: 353
Quote:
Originally Posted by Dodgexander View Post
Using CUDA helps a lot, I think. I was amazed to see that work for me.
madVR does not use CUDA in any way.
Old 13th March 2013, 08:10   #17989  |  Link
Niyawa
Registered User
 
 
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
Quote:
Originally Posted by Dodgexander View Post
I can run Jinc 3 chroma and image upscaling using a Wolfdale @ 3 GHz with a 9600GT.
I know! I've seen people use an Intel HD 4000 for that too.

Quote:
Originally Posted by pie1394 View Post
There might be something wrong in his setup... What's his setup and player programs?
Basically my guide itself. MPC-HC + madVR + LAV Filters blah blah, you know the drill.

Quote:
Originally Posted by wanezhiling View Post
Because that Alienware guy's madVR runs on the Intel GPU.
http://forum.doom9.org/showpost.php?...ostcount=10533
As you can see from his own comment, he tried that and it didn't work at all. I still believe it's a settings issue, but apparently he tried everything and nothing worked.
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack
Old 13th March 2013, 13:52   #17990  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by Mangix View Post
madVR does not use CUDA in any way.
You're missing the point. Deinterlacing etc. is done by the LAV CUDA decoder. That frees up more room for the video card to process Jinc3.

Same would apply to software decoding.

I have a better AMD system, but it can't handle DXVA or DXVA copy-back with the same madVR settings, even though it's more powerful. I have to stick to software decoding in LAV.

Try using different decoders and different modes.
Old 13th March 2013, 22:42   #17991  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by Dodgexander View Post
You're missing the point. Deinterlacing etc. is done by the LAV CUDA decoder. That frees up more room for the video card to process Jinc3.
It doesn't make any difference. Using CUDA on my laptop so that the HD4000 only has to do scaling doesn't let me use any higher settings in madVR compared to using the HD4000 for everything.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
Old 14th March 2013, 02:44   #17992  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Quote:
Originally Posted by artios View Post
Hello, I am the member that Niyawa talked about.

I understand that my card is powerful. When I use the Intel GPU to play the following file
32.2 GB
Total Bitrate: 31.06 Mbps
Video: MPEG-4 AVC Video / 23275 kbps / 1080p / 23.976 fps / 16:9 / High Profile 4.1

I am getting maybe 3-4 drops per minute. The Nvidia driver decided to lock mpc-hc.exe to the Intel GPU. The renaming trick worked to use the Nvidia GPU, but unfortunately I am getting more drops this way. I think it's some sort of incompatibility or settings issue.

Can I try madVR through another program to see if it's not MPC-HC related?

Also, the madVR settings show me that "Sharpness" needs to be dealt with (it's shown in green).
Did you use 23.976 / 24 Hz signal output, or still 60 Hz like your notebook's LCD panel? How about the EVR renderer?

The hybrid display design on most notebooks might cause some issues. It really depends on how the board circuit is designed. For example, output ports like LVDS / HDMI might be hard-wired to Intel's GPU even when the selected GPU is Nvidia / AMD (ATi)... As a result, it creates extra latency and needs PCIe bandwidth to transmit the displayed content between the two GPUs.
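
To put a number on that PCIe copy cost, here is a back-of-the-envelope Python sketch; it assumes fully rendered 32-bit RGBA frames are shipped from the discrete GPU to the Intel GPU that owns the panel, which may not match exactly what the driver does.

Code:
# Back-of-the-envelope PCIe traffic when a hybrid-graphics notebook copies
# finished frames from the discrete GPU to the Intel GPU that drives the
# display. Assumes 32-bit RGBA frames; real drivers may use other formats.

def frame_copy_mb_per_s(width, height, refresh_hz, bytes_per_pixel=4):
    """Approximate one-way transfer rate in MB/s for full-frame copies."""
    return width * height * bytes_per_pixel * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1200, 60)]:
    print(f"{w}x{h} @ {hz} Hz -> ~{frame_copy_mb_per_s(w, h, hz):.0f} MB/s")
# ~500-550 MB/s: small next to PCIe 2.0 x16 (~8 GB/s), but it adds latency
# and competes with the uploads the renderer already has to do.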

For comparison, how about NOT using DXVA2 hardware H.264 decoding on this content? I guess your notebook's CPU should be powerful enough to decode such a BD stream.
Old 14th March 2013, 06:17   #17993  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
You can always flash an unlocked BIOS and disable Optimus as a last resort.
Old 14th March 2013, 09:36   #17994  |  Link
artios
Registered User
 
Join Date: Jan 2005
Posts: 7
Quote:
Originally Posted by artios View Post
Hello, I am the member that Niyawa talked about.

I understand that my card is powerful. When I use the Intel GPU to play the following file
32.2 GB
Total Bitrate: 31.06 Mbps
Video: MPEG-4 AVC Video / 23275 kbps / 1080p / 23.976 fps / 16:9 / High Profile 4.1

I am getting maybe 3-4 drops per minute. The Nvidia driver decided to lock mpc-hc.exe to the Intel GPU. The renaming trick worked to use the Nvidia GPU, but unfortunately I am getting more drops this way. I think it's some sort of incompatibility or settings issue.

Can I try madVR through another program to see if it's not MPC-HC related?

Also, the madVR settings show me that "Sharpness" needs to be dealt with (it's shown in green).
Hi, I figured out what was wrong.

In order for the Nvidia high-performance processor to be used with Jinc, the laptop must be charging. It's not possible for the Nvidia GPU to handle Jinc whilst the laptop is on battery; even with simple 720p TV rips I experience constant frame drops.

Sorry I didn't figure it out sooner. Kind of odd, though, as it can handle Bicubic and Lanczos with ease on battery.

Can anyone confirm that? Is there a workaround?
Old 14th March 2013, 11:49   #17995  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by artios View Post
In order for the Nvidia high-performance processor to be used with Jinc, the laptop must be charging. It's not possible for the Nvidia GPU to handle Jinc whilst the laptop is on battery; even with simple 720p TV rips I experience constant frame drops.

Sorry I didn't figure it out sooner. Kind of odd, though, as it can handle Bicubic and Lanczos with ease on battery.

Can anyone confirm that? Is there a workaround?
Change your Windows power settings for running on battery. In the advanced settings there's a part where you can set "When playing video" in "Multimedia settings" to "Optimise video quality" - this will let you use the same settings on battery as when charging.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
Old 14th March 2013, 11:55   #17996  |  Link
artios
Registered User
 
Join Date: Jan 2005
Posts: 7
Quote:
Originally Posted by DragonQ View Post
Change your Windows power settings for running on battery. In the advanced settings there's a part where you can set "When playing video" in "Multimedia settings" to "Optimise video quality" - this will let you use the same settings on battery as when charging.
"When playing video" was already set to "Optimise video quality".

I changed:

Graphic Processor > on battery > changed from balanced mode to Max performance

and I thought that did the trick, but it didn't. I am getting unusual frame drops only on battery and only when using Jinc.

It has to be plugged in. Is there anyone with an Alienware notebook who can verify this?

Last edited by artios; 14th March 2013 at 16:59.
Old 14th March 2013, 19:02   #17997  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,403
Look for other power settings; maybe you need to allow active cooling while on battery? There are a lot of power-saving features that might push Jinc over the edge.

Jinc takes a lot more GPU power and memory bandwidth than the other resizers, so it isn't at all unusual that you are only having issues with it. Your laptop must be throttling something to save power when on battery.
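
A rough way to see why Jinc is the first scaler to fall over when the GPU is throttled: it is a non-separable 2-D filter, so it reads far more source samples per output pixel than separable scalers like bicubic or Lanczos. A small Python sketch with approximate tap counts (3-lobe kernels assumed; these are not madVR's exact shader costs):

Code:
# Approximate source-sample counts per output pixel for separable vs.
# non-separable scalers. Illustration only, not madVR's exact shader costs.

def separable_taps(lobes):
    """Two 1-D passes of 2*lobes taps each (e.g. Lanczos, bicubic-style)."""
    return 2 * (2 * lobes)

def jinc_taps(lobes):
    """One 2-D pass over a (2*lobes) x (2*lobes) neighbourhood."""
    return (2 * lobes) ** 2

print("bicubic  (~2 lobes):", separable_taps(2), "taps/pixel")  # 8
print("Lanczos3  (3 lobes):", separable_taps(3), "taps/pixel")  # 12
print("Jinc3     (3 lobes):", jinc_taps(3), "taps/pixel")       # 36
# Roughly 3-4x the per-pixel work of the separable scalers (plus the
# anti-ringing pass), which is why a power-throttled GPU drops frames
# with Jinc but not with Bicubic or Lanczos.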

Last edited by Asmodian; 14th March 2013 at 19:05.
Old 14th March 2013, 23:09   #17998  |  Link
jkauff
Registered User
 
Join Date: Oct 2012
Location: Akron, OH
Posts: 491
The GPU itself may have some power-saving features built in, and you probably can't modify them.
Old 15th March 2013, 02:43   #17999  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by DragonQ View Post
It doesn't make any difference. Using CUDA on my laptop so that the HD4000 only has to do scaling doesn't let me use any higher settings in madVR compared to using the HD4000 for everything.
How does this apply to what I was saying? You're talking about two different devices.

I am talking about using software decoding on one PC and CUDA on another, allowing more headroom for the GPU with madVR.

If I use DXVA, the settings I usually use create dropped frames until I turn them down.

I'm guessing either your integrated graphics is already hitting a wall, or the behavior of a mixed system or Intel DXVA is different.

Sent from my Blade S using Tapatalk 2

Last edited by Dodgexander; 15th March 2013 at 02:45.
Old 15th March 2013, 03:18   #18000  |  Link
Mangix
Audiophile
 
Join Date: Oct 2006
Posts: 353
Quote:
Originally Posted by Dodgexander View Post
You're missing the point. Deinterlacing etc. is done by the LAV CUDA decoder. That frees up more room for the video card to process Jinc3.

Same would apply to software decoding.

I have a better AMD system, but it can't handle DXVA or DXVA copy-back with the same madVR settings, even though it's more powerful. I have to stick to software decoding in LAV.

Try using different decoders and different modes.
LAV is not madVR. madVR supports DXVA deinterlacing.

And yes, DXVA sucks on AMD platforms.

I use LAV's CUDA decoder to free up CPU for SmoothVideo.