Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
13th March 2013, 00:47 | #17981 | Link |
Registered User
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
Hey guys, another question. This time it might be a little easier, but I need someone to confirm it for me: would a GTX 675M be able to run Jinc? There's this fellow who has an Alienware but always gets dropped frames with Jinc. That's kind of strange since, as far as I know, that same GPU is more powerful than a GTX 260.
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack
13th March 2013, 02:15 | #17982 | Link | |
Registered User
Join Date: Jul 2008
Posts: 157
Quote:
Using CUDA helps a lot, I think. Was amazed to see that work for me.
13th March 2013, 02:32 | #17983 | Link | |
Registered User
Join Date: May 2009
Posts: 212
Quote:
What's his setup, and which player programs? With a recent GeForce driver and such a powerful GPU, I think all madVR FLUSH options can safely be turned off. I have turned all of them OFF for my ION system (Win7 x64, 314.07, DVI 1920x1200p60, VSync ON). No dropped / repeated / delayed frames are observed as long as the GPU is not overloaded.

The GTX675M is obviously at least twice as powerful as the GTX260 in raw performance, and that's before counting the architectural improvements. An overclocked GTX260+ was already capable of handling more than 30 fps Jinc3+AR to 1920x1080 with no GPU-based HQ deinterlacing load.

GTX675M: GF114 Fermi core, 384 SP, 620 / 1240 MHz, 256-bit GDDR5-6000 (192 GB/s)
GTX260: GT200 Tesla core, 192 SP, 576 / 1242 MHz, 448-bit GDDR3-2000 (110 GB/s)
GTX260+ OC: GT200b Tesla core, 216 SP, 680 / 1466 MHz, 448-bit GDDR3-2100 (115 GB/s)
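For what it's worth, the "at least twice as powerful" claim can be sanity-checked from those spec lines with simple arithmetic. A rough sketch only: shader GFLOPS deliberately ignore architectural efficiency (Fermi vs Tesla), which actually favors the 675M even further.

```python
# Back-of-the-envelope single-precision throughput: SP count * 2 ops/clock * shader clock.
# Figures come from the spec list in the post above; architectural differences
# between Fermi and Tesla cores are ignored in this sketch.
cards = {
    "GTX675M":    {"sp": 384, "shader_ghz": 1.240, "gbps": 192},
    "GTX260":     {"sp": 192, "shader_ghz": 1.242, "gbps": 110},
    "GTX260+ OC": {"sp": 216, "shader_ghz": 1.466, "gbps": 115},
}

def gflops(card):
    return card["sp"] * 2 * card["shader_ghz"]

for name, c in cards.items():
    print(f"{name}: {gflops(c):.0f} GFLOPS, {c['gbps']} GB/s")
# The 675M works out to roughly 2x the stock GTX260 in raw shader throughput,
# plus ~75% more memory bandwidth.
```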
13th March 2013, 03:48 | #17984 | Link |
Registered User
Join Date: Apr 2011
Posts: 1,184
Because that Alienware guy's madVR runs on the Intel GPU.
http://forum.doom9.org/showpost.php?...ostcount=10533
13th March 2013, 05:48 | #17985 | Link |
Kid for Today
Join Date: Aug 2004
Posts: 3,477
Oh well, I don't mean to go OT, but one thing that shouldn't be overlooked IMHO is the performance-per-watt ratio (and even more so in an HTPC environment): http://www.hardware.fr/articles/879-...nces-watt.html

That site doesn't allow picture hotlinking so I can't provide a translated URL, but the 650 Ti delivers 67 fps per 100 W in Battlefield 3 and the 660 only 54. I really can't wrap my head around tossing 180€ at a 660, especially when the 660 SE is around the corner and I don't care about games... so I'm pulling the trigger on a 90€ lightly used 1 GB 650 Ti.

Buying a used 5xx doesn't make much sense as those are far less power efficient (34 fps per 100 W in BF3 for the 560 Ti), second-hand warranties aren't nearly as long anymore, and their prices just aren't competitive (560 Ti's regularly go for 100€+ used). Sure, the 650 Ti only has a 128-bit memory controller, but I'm quite sure it'll still be plenty fast in mVR anyway.

Last edited by leeperry; 13th March 2013 at 07:33.
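The value argument boils down to simple ratios. A throwaway sketch; the fps-per-100W numbers are the hardware.fr figures quoted in the post, and the euro prices are the rough street prices mentioned there, not actual quotes:

```python
# fps per 100 W in Battlefield 3 (hardware.fr data cited above) plus the
# approximate prices from the post. Illustrative value comparison only.
cards = {
    "650 Ti (used)": {"fps_per_100w": 67, "eur": 90},
    "660 (new)":     {"fps_per_100w": 54, "eur": 180},
    "560 Ti (used)": {"fps_per_100w": 34, "eur": 100},
}

for name, c in cards.items():
    cost_per_perf = c["eur"] / c["fps_per_100w"]
    print(f"{name}: {c['fps_per_100w']} fps/100W, {cost_per_perf:.2f} EUR per unit of perf/watt")
```

On these numbers the used 650 Ti costs well under half as much per unit of efficiency as the 660, which is the whole point of the post.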
13th March 2013, 07:06 | #17986 | Link |
Registered User
Join Date: May 2008
Posts: 1,840
I'm having a bit of a problem with the display mode changer. I currently have 60, 72 and 75 Hz modes set up, and they all look fine. In madVR the mode changer list is: 1680x1050p60, 1680x1050p72, 1680x1050p75. It works fine for 24->72 and 30->60, but with 25 fps sources it insists on setting 60 Hz even if the monitor is set to 75 Hz. I then installed ReClock, set it to slow down 25 to 24, and in madVR set "treat 25 fps as 24". I get 24 fps according to the madVR status, but it still always changes to 60 Hz. When 75 Hz is set, madVR/ReClock displays 74.984. Any idea what could be going on?

Also, with a 24 fps source and a 72 Hz refresh rate (71.99139 in the madVR status), with smooth motion set to "only when judder", it's enabled even though there is no judder with it disabled.

Last edited by turbojet; 13th March 2013 at 07:52.
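For context, refresh-rate matching usually boils down to picking the mode closest to an integer multiple of the source frame rate. This is a sketch of that generic heuristic, not madVR's actual selection code, but it shows why 75 Hz rather than 60 Hz is the natural pick for 25 fps material:

```python
# Pick the display mode whose refresh rate is closest to a whole-number
# multiple of the source fps; a smaller remainder means a smoother cadence.
# A sketch of the common heuristic, not madVR's real mode-selection logic.
def pick_mode(fps, refresh_rates):
    def judder_error(hz):
        ratio = hz / fps
        # relative distance to the nearest integer multiple
        return abs(ratio - round(ratio)) / ratio
    return min(refresh_rates, key=judder_error)

modes = [60.0, 72.0, 75.0]
print(pick_mode(23.976, modes))  # -> 72.0 (3x cadence)
print(pick_mode(25.0, modes))    # -> 75.0 (exact 3x), so forcing 60 Hz is a mismatch
print(pick_mode(29.97, modes))   # -> 60.0 (2x cadence)
```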
13th March 2013, 07:48 | #17987 | Link | |
Registered User
Join Date: Jan 2005
Posts: 7
Quote:
I understand that my card is powerful. When I use the Intel GPU to play the following file:

32.2 GB, total bitrate 31.06 Mbps
Video: MPEG-4 AVC / 23275 kbps / 1080p / 23.976 fps / 16:9 / High Profile 4.1

I am getting maybe 3-4 drops per minute. The Nvidia driver decided to lock mpc-hc.exe to the Intel GPU. The renaming trick worked to make it use the Nvidia GPU, but unfortunately I am getting more drops that way. I think it's some sort of incompatibility or settings issue. Can I try madVR through another program to see if it's not mpc-hc related? Also, the madVR settings show me that "Sharpness" needs to be dealt with (it's in green).
13th March 2013, 08:10 | #17989 | Link |
Registered User
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack
13th March 2013, 13:52 | #17990 | Link |
Registered User
Join Date: Jul 2008
Posts: 157
You're missing the point. Deinterlacing etc. is done by the LAV CUDA decoder, which frees up more room for the video card to process Jinc3.

The same would apply to software decoding. I have a better AMD system, but it can't handle DXVA or DXVA copy-back with the same madVR settings, even though it's more powerful. I have to stick to software decoding in LAV. Try using different decoders and different modes.
13th March 2013, 22:42 | #17991 | Link |
Registered User
Join Date: Mar 2007
Posts: 934
It doesn't make any difference. Using CUDA on my laptop, so that the HD4000 only has to do scaling, doesn't let me use any higher settings in madVR compared to using the HD4000 for everything.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
14th March 2013, 02:44 | #17992 | Link | |
Registered User
Join Date: May 2009
Posts: 212
Quote:
The hybrid display design on most notebooks can cause some issues. It really depends on how the board circuit is designed. For example, output ports like LVDS / HDMI might be hard-wired to Intel's GPU even when the selected GPU is the nVidia / AMD (ATi) one... As a result, it creates extra latency and needs PCI-e bandwidth to transmit the displayed content between the two GPUs.

For comparison, how about NOT using DXVA2 HW H.264 decoding on this content? I guess the CPU of your notebook should be powerful enough to decode such a BD stream.
14th March 2013, 09:36 | #17994 | Link | |
Registered User
Join Date: Jan 2005
Posts: 7
Quote:
For the Nvidia GPU to be used for Jinc, the laptop must be charging. It's not possible for the Nvidia chip to handle Jinc while the laptop is on battery; even with simple 720p TV rips I experience constant frame drops. Sorry I didn't figure it out sooner. Kind of odd, though, as it handles Bicubic and Lanczos with ease on battery. Can anyone confirm that? Is there a workaround?
14th March 2013, 11:49 | #17995 | Link | |
Registered User
Join Date: Mar 2007
Posts: 934
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
14th March 2013, 11:55 | #17996 | Link | |
Registered User
Join Date: Jan 2005
Posts: 7
Quote:
I changed Graphics Processor > on battery from balanced mode to Max performance, and I thought that did the trick, but it didn't. I am getting unusual frame drops only on battery and only using Jinc. It has to be plugged in. Is there anyone with an Alienware notebook who can verify this?

Last edited by artios; 14th March 2013 at 16:59.
14th March 2013, 19:02 | #17997 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Look for other power settings; maybe you need to allow active cooling while on battery? There are a lot of power-saving features that might push Jinc over the edge.

Jinc takes a lot more GPU power and memory bandwidth than the other resizers, so it isn't at all unusual that you are only having issues with it. Your laptop must be throttling something to save power when on battery.

Last edited by Asmodian; 14th March 2013 at 19:05.
15th March 2013, 02:43 | #17999 | Link | |
Registered User
Join Date: Jul 2008
Posts: 157
Quote:
I am talking about using software decoding on one PC and CUDA on another, allowing more room for the GPU with madVR. If I use DXVA, the settings I usually use create dropped frames until I turn them down. I'm guessing either your integrated graphics is already hitting a wall, or the behavior of a mixed system or Intel DXVA is different.

Sent from my Blade S using Tapatalk 2

Last edited by Dodgexander; 15th March 2013 at 02:45.
15th March 2013, 03:18 | #18000 | Link | |
Audiophile
Join Date: Oct 2006
Posts: 353
Quote:
And yes, DXVA sucks on AMD platforms. I use LAV's CUDA decoder to free up the CPU for SmoothVideo.
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |