Old 4th June 2011, 11:38   #1  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
Playing AVCHD on Nvidia 8600 without stutter

I can't play my camcorder (Canon HF R100) or Blu-ray AVCHD video files smoothly.
I've tried most of the programs/codecs on Windows 7 x64 and XP SP3 with the latest driver.
The only workaround I've found is to reinstall the graphics driver without rebooting; after that it plays perfectly until I reboot or run any other program that uses the GPU.
I can play the files with my CPU (T7500) and Splash Pro without quality loss, but that maxes out the CPU.
My graphics card is an 8600M GT with 256 MB DDR2. Here is the MediaInfo output for the camcorder and Blu-ray files:

Quote:
General
ID : 0
Format : BDAV
Format/Info : Blu-ray Video
File size : 151 MiB
Duration : 1mn 20s
Overall bit rate : 15.8 Mbps
Maximum Overall bit rate : 18.0 Mbps

Video
ID : 4113 (0x1011)
Menu ID : 1 (0x1)
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4.0
Format settings, CABAC : Yes
Format settings, ReFrames : 2 frames
Duration : 1mn 19s
Bit rate : 14.9 Mbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate : 29.970 fps
Resolution : 8 bits
Colorimetry : 4:2:0
Scan type : Interlaced
Scan order : Top Field First
Bits/(Pixel*Frame) : 0.240
Stream size : 142 MiB (94%)

Audio
ID : 4352 (0x1100)
Menu ID : 1 (0x1)
Format : AC-3
Format/Info : Audio Coding 3
Duration : 1mn 20s
Bit rate mode : Constant
Bit rate : 256 Kbps
Channel(s) : 2 channels
Channel positions : L R
Sampling rate : 48.0 KHz
Video delay : -67ms
Stream size : 2.44 MiB (2%)
Quote:
General
ID : 0 (0x0)
Format : BDAV
Format/Info : Blu-ray Video
File size : 14.4 GiB
Duration : 1h 15mn
Overall bit rate : 27.3 Mbps
Maximum Overall bit rate : 48.0 Mbps

Video
ID : 4113 (0x1011)
Menu ID : 1 (0x1)
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4.1
Format settings, CABAC : Yes
Format settings, ReFrames : 4 frames
Format settings, GOP : M=1, N=24
Codec ID : 27
Duration : 1h 15mn
Bit rate mode : Variable
Bit rate : 22.4 Mbps
Maximum bit rate : 33.0 Mbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate : 29.970 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : MBAFF
Bits/(Pixel*Frame) : 0.360
Stream size : 11.8 GiB (82%)
Color primaries : BT.709-5, BT.1361, IEC 61966-2-4, SMPTE RP177
Transfer characteristics : BT.709-5, BT.1361
Matrix coefficients : BT.709-5, BT.1361, IEC 61966-2-4 709, SMPTE RP177

Audio #1
ID : 4352 (0x1100)
Menu ID : 1 (0x1)
Format : DTS
Format/Info : Digital Theater Systems
Format profile : MA / Core
Muxing mode : Stream extension
Codec ID : 134
Duration : 1h 15mn
Bit rate mode : Variable
Bit rate : 1 576 Kbps / 1 510 Kbps
Channel(s) : 6 channels
Channel positions : Front: L C R, Side: L R, LFE
Sampling rate : 48.0 KHz
Bit depth : 24 bits
Compression mode : Lossless / Lossy

Audio #2
ID : 4353 (0x1101)
Menu ID : 1 (0x1)
Format : PCM
Format settings, Endianness : Big
Format settings, Sign : Signed
Muxing mode : Blu-ray
Codec ID : 128
Duration : 1h 15mn
Bit rate mode : Constant
Bit rate : 2 304 Kbps
Channel(s) : 2 channels
Channel positions : Front: L R
Sampling rate : 48.0 KHz
Bit depth : 24 bits
Stream size : 1.21 GiB (8%)
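(For reference, here is a small sketch of how the stream parameters that matter here - profile, level, resolution, interlacing - can be dumped with ffprobe instead of opening MediaInfo each time. It assumes an ffmpeg/ffprobe build is on the PATH, and the clip name is only an example.)

Code:
import json
import subprocess

def video_params(path):
    # Ask ffprobe for the first video stream's codec, profile, level and field order.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries",
         "stream=codec_name,profile,level,width,height,field_order,avg_frame_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    return json.loads(out)["streams"][0]

s = video_params("00001.m2ts")  # example clip name
print(f"{s['codec_name']} {s['profile']}@L{s.get('level')} "
      f"{s['width']}x{s['height']} field_order={s.get('field_order')} "
      f"fps={s.get('avg_frame_rate')}")
# The problematic clips should report High, level 40/41, 1920x1080 and
# field_order 'tt' (interlaced, top field first); the 1080p24 ones report
# 'progressive'.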
After reinstalling the graphics driver (smooth):
[GPU-Z screenshot]

Any other time (not smooth):
[GPU-Z screenshot]
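(If it helps, GPU-Z can log its sensors to a file, so a smooth run and a stuttering run can be compared numerically instead of by screenshot. A rough sketch - the log file names and column names below are assumptions, check the header of your own log:)

Code:
import csv

def column_stats(logfile, column):
    """Return (min, avg, max) of one sensor column from a GPU-Z sensor log."""
    with open(logfile, newline="") as f:
        rows = list(csv.reader(f))
    header = [h.strip() for h in rows[0]]
    idx = header.index(column)          # raises ValueError if the name differs
    vals = [float(r[idx]) for r in rows[1:] if len(r) > idx and r[idx].strip()]
    return min(vals), sum(vals) / len(vals), max(vals)

for log in ("smooth_run.txt", "stutter_run.txt"):            # assumed file names
    for col in ("GPU Load [%]", "Video Engine Load [%]"):     # assumed column names
        lo, avg, hi = column_stats(log, col)
        print(f"{log:16s} {col:24s} min/avg/max = {lo:.0f} / {avg:.1f} / {hi:.0f}")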

Old 4th June 2011, 23:08   #2  |  Link
Mangix
Audiophile
 
Join Date: Oct 2006
Posts: 353
Does your GPU have any power-saving or downclocking features? If so, you might want to turn them off. On one of the GPU-Z screenshots, even though the clocks are the same, it looks like there's a bigger range.

Also, you might want to try overclocking. It might help :\
Old 4th June 2011, 23:33   #3  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Hmm, the GPU barely copes with the deinterlacing, but that it suddenly stops working entirely on later plays until the driver is reinstalled is very strange indeed (this is the first time I've heard anyone mention such NVIDIA behaviour).

Overclocking won't help, since it works on the first run if you read what he wrote carefully. What is your current driver revision?
Did you also try other decoders, for example LAV CUVID and/or CyberLink (PowerDVD)? On Win7 there is also WMP itself with Microsoft's decoder and DXVA, which should always work if the others fail (you just have to get the input working; see LAV Splitter and the registry hacks to get the DirectShow filter hooked up).
The Mirillis player/decoder is a very new one and surely full of bugs (the last time I tested it there were a large number of issues).

Not sure, but 256 MB is also hard on the edge for 1080p, though in your case, scaled down, it seems to work.
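(A back-of-the-envelope look at that memory budget - the surface counts below are assumptions for illustration, not the driver's actual allocation:)

Code:
# One NV12 (4:2:0, 8-bit) decode surface at coded 1920x1088:
W, H = 1920, 1088
nv12 = W * H * 3 // 2                 # bytes per surface (~3 MiB)

dpb       = 4                         # ReFrames of the BDAV sample above
in_flight = 4                         # extra decode/transfer surfaces (assumed)
deint     = 4                         # field history for the deinterlacer (assumed)
present   = 8                         # renderer/EVR queue (assumed)

surfaces = dpb + in_flight + deint + present
mib = lambda b: b / (1024 * 1024)
print(f"{surfaces} NV12 surfaces ~= {mib(surfaces * nv12):.0f} MiB")
print(f"plus desktop/back buffers at 1080p32: ~{mib(1920 * 1080 * 4 * 3):.0f} MiB")
# Roughly 60 MiB of video surfaces plus desktop buffers on a 256 MB card:
# workable, but with little headroom once anything else grabs video memory.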

You could indeed also try setting the full-power profile and see whether that influences it; clock fluctuation in such a hard edge case is never a good thing.

The video engine is always properly under load, but the GPU side (deinterlacing) seems to have problems on the later runs - very strange.

1. Try LAV CUVID and CyberLink DXVA + MPC-HC as the player, on EVR (Win7) and VMR7 (XP). (I would advise CyberLink DXVA, for several reasons.)
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 5th June 2011 at 00:03.
Old 5th June 2011, 08:31   #4  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
First, thanks for the replies.

Quote:
Originally Posted by Mangix View Post
on one of the gpu-z screenshots, even though the clocks are the same, it looks like there's a bigger range.
Yes, good point.

Quote:
Originally Posted by CruNcher View Post
1. Try LAV CUVID
Edit: That didn't help either.

Last edited by P.J; 5th June 2011 at 08:51.
Old 5th June 2011, 09:58   #5  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
See above, there are still possibilities; I wouldn't give up that fast, especially knowing it worked once. Updating the laptop's BIOS might also be an option,
but first install the latest WHQL Verde drivers: http://www.nvidia.com/object/noteboo...ql-driver.html or, for XP, http://www.nvidia.com/object/noteboo...ql-driver.html.
The biggest chances, though, are via DXVA and a performant renderer (Overlay Mixer, VMR7 Overlay); try DXVA Checker and see how that works out: http://bluesky23.yu-nagi.com/en/#DXVAChecker
And as I said, I would try CyberLink's DXVA decoder (just install the PowerDVD 11 trial); it is very nicely optimized and, along with Microsoft's own decoder in Win7 via WMP, has the highest chance of working.

And if really everything fails, there is still the possibility of trying Linux and getting it working via VDPAU.

Last edited by CruNcher; 5th June 2011 at 10:11.
Old 5th June 2011, 12:56   #6  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
I followed your suggestions but nothing helped. It works fine on my cousin's desktop 8600 GT (GDDR3, 256 MB).
The only options left for me are Splash + CPU or that crazy driver-reinstall trick.
Old 5th June 2011, 13:10   #7  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Puhh, then I'm out of ideas. It sounds crazy, though nothing is impossible; different systems mean different complexities and possible factors for issues. Not sure, but one last possibility would be looking in the BIOS itself for graphics-related options, though the BIOS is tuned by the vendor and mostly not changeable at all on mobile platforms; I don't know about your system.
So in the end you might need to contact the laptop vendor, describe your issue and ask for help (though they will most probably reply with something like "it can work, yes, but it's unstable and was never an advertised feature; we only advertised 720p playback, so we see no reason to support you in your efforts to make it work").
Do lower-resolution bitstreams play fine? Do you only have problems with Full HD 1080p, and only interlaced, or also with progressive 1080p streams?
And can you watch that whole Blu-ray with full hardware acceleration and without issues, but only once after driver installation, or do the acceleration failures already appear during the first playback?

Another option, if you still want hardware acceleration, would be an external decoder DSP like a Broadcom Crystal HD (Splash Player supports it too), if you have a Mini PCI-E slot free for it.

Last edited by CruNcher; 5th June 2011 at 13:39.
Old 5th June 2011, 16:45   #8  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
Quote:
Originally Posted by CruNcher View Post
Do lower-resolution bitstreams play fine? Do you only have problems with Full HD 1080p, and only interlaced, or also with progressive 1080p streams?
And can you watch that whole Blu-ray with full hardware acceleration and without issues, but only once after driver installation, or do the acceleration failures already appear during the first playback?

Another option, if you still want hardware acceleration, would be an external decoder DSP like a Broadcom Crystal HD (Splash Player supports it too), if you have a Mini PCI-E slot free for it.
I have problems only with AVCHD 1920x1080i High profile.
AVCHD 1920x1080p24 High profile plays well.

Yes, I can watch the whole video without problems, even if I play it again.
But I get the problem again as soon as another program uses the GPU.
Old 6th June 2011, 11:44   #9  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
So, just to be absolutely sure I understood you correctly: you lose the ability to hardware-accelerate playback of those 1080i streams once you have used another application that uses the GPU (for example a game) or rebooted after the driver installation. And then only reinstalling the driver lets you play that 1080i file flawlessly with hardware acceleration again, but only as long as you don't reboot or start another application that uses the GPU. Is that how to understand it?

Or do you mean some specific application (one that doesn't use the GPU but, for example, the video engine), or an application that keeps running the whole time and uses the GPU?
A very good example on Win7 would be the Aero interface, i.e. the DWM, which constantly uses the GPU; though you said these problems are the same on XP and Win7?

But if it really is the same on XP and Win7, then I wonder how, after an immediate reinstallation of the driver, you get hardware playback working on XP at all; that shouldn't work until after a reboot (actually I'm not 100% sure of that, as I never tested it with the video part, but at least the 3D part shouldn't be able to function on XP after installing the NVIDIA driver until a reboot).

Last edited by CruNcher; 6th June 2011 at 12:09.
Old 6th June 2011, 21:08   #10  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
First, thanks for your replies.

Quote:
Originally Posted by CruNcher View Post
So, just to be absolutely sure I understood you correctly: you lose the ability to hardware-accelerate playback of those 1080i streams once you have used another application that uses the GPU (for example a game) or rebooted after the driver installation. And then only reinstalling the driver lets you play that 1080i file flawlessly with hardware acceleration again, but only as long as you don't reboot or start another application that uses the GPU. Is that how to understand it?

But if it really is the same on XP and Win7, then I wonder how, after an immediate reinstallation of the driver, you get hardware playback working on XP at all; that shouldn't work until after a reboot (actually I'm not 100% sure of that, as I never tested it with the video part, but at least the 3D part shouldn't be able to function on XP after installing the NVIDIA driver until a reboot).
Exactly.

I haven't tried that method in XP yet, but I'll let you know soon.

I found that method when my GPU driver crashed while playing some HD videos, because playback was fine afterwards.
That's how I figured out that I need a way to reset the video driver.

Edit: AVCHD 1920x1080i Main profile and AVCHD 1440x1080i High profile play fine.

Last edited by P.J; 6th June 2011 at 21:10.
Old 6th June 2011, 21:58   #11  |  Link
Mikey2
Registered User
 
Join Date: Nov 2010
Posts: 80
I have never owned an AVCHD Camcorder; however, I do run an 8600 GT graphics card and use it all the time to watch HD movies.

With only 256MB of GPU memory on the 8600 I found that doing hardware decoding/rendering of HD material is pushing it at best (your screenshots with the GPU-Z values attest to that.)

HOWEVER, I just found out that if I use software decoding instead of hardware decoding, suddenly all of my HD files now play flawlessly! Take a look at your GPU-Z screenshots (both of them) - Notice that the "Video Engine Load" is running at 91%. Well, once I switched to software decoding, that line on my graph is *0%* (!!!)

Now, as far as the details of how to do this go, it gets somewhat tricky depending on the flexibility of your playback chain and how much you know about this stuff. If you can play these files using DirectShow filters/codecs similar to what we use for H.264 and/or Blu-ray movies, then I can show you how. For instance, I use Media Player Classic-HC with LAV Splitter, ffdshow and madVR; the trick is that I stopped using my CoreAVC/Cuda hardware-decoding filter and instead just let ffdshow do the decoding in software with ffmpeg-mt. In addition, I do NOT use DXVA. There is absolutely no difference in quality - video decoding is simply a mathematical algorithm that gives identical output regardless of whether it runs on the CPU or the graphics card. (Moreover, given the limitations of DXVA, one can actually make the picture look better with the extra flexibility you get when not using it.)

Overall, software decoding just balances out the work and lets the CPU do the decoding while leaving the GPU to simply do the rendering.
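(A quick way to sanity-check whether the CPU alone can keep up before rebuilding the filter chain - a sketch assuming an ffmpeg build is on the PATH; the clip name is an example, and this measures decoding only, not deinterlacing or rendering:)

Code:
import subprocess
import time

clip = "00001.m2ts"                   # example clip (the 80 s camcorder sample)
start = time.time()
# Decode the whole clip as fast as possible and throw the frames away.
subprocess.run(["ffmpeg", "-v", "error", "-threads", "0",
                "-i", clip, "-f", "null", "-"], check=True)
elapsed = time.time() - start
verdict = "real-time capable" if elapsed < 80 else "too slow for CPU-only decoding"
print(f"decoded in {elapsed:.1f} s for an 80 s clip -> {verdict}")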

I am not sure how far off I am nor how much you know about this stuff....I hope this makes sense. Personally, after having so many problems similar to yours, I was ecstatic to learn that via this method I do not need to buy a new Video-Card.

I am happy to continue to try to help if you would like...

Good luck!
MikeY
Old 7th June 2011, 00:04   #12  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Quote:
Originally Posted by Mikey2 View Post
HOWEVER, I just found out that if I use software decoding instead of hardware decoding, suddenly all of my HD files now play flawlessly! [...]

Overall, software decoding just balances out the work and lets the CPU do the decoding while leaving the GPU to simply do the rendering. [...]
Everything you say is nice and correct, but do you know why you use hardware acceleration in the first place?
Especially for mobile devices it's about power consumption (saving battery life). On the desktop that isn't as important (though many users consider it equally important there), but it can help with certain workloads, where you torture the CPU less with things like decoding. Hardware decoding is always limited, though, and no ISV or OSS solution yet uses both efficiently depending on the scenario.

They all have design flaws: either you use hardware playback or software decoding, but none lets you efficiently combine both where it would make sense.

Last edited by CruNcher; 7th June 2011 at 00:09.
Old 7th June 2011, 01:51   #13  |  Link
Mikey2
Registered User
 
Join Date: Nov 2010
Posts: 80
Quote:
Originally Posted by CruNcher View Post
Everything you say is nice and correct, but do you know why you use hardware acceleration in the first place?
Especially for mobile devices it's about power consumption (saving battery life). On the desktop that isn't as important (though many users consider it equally important there), but it can help with certain workloads, where you torture the CPU less with things like decoding. Hardware decoding is always limited, though, and no ISV or OSS solution yet uses both efficiently depending on the scenario.

They all have design flaws: either you use hardware playback or software decoding, but none lets you efficiently combine both where it would make sense.
Oh yes, I am beginning to have a deeper understanding of the pros/cons of graphics Hardware Acceleration and the inevitable trade-offs common to all engineering solutions.

My point is that I think technology is back to a point where many people should consider at least trying to go back to a configuration that uses "software decoding" to rely less heavily on the GPU, especially in a situation (such as ours) where our CPU's/Bus speeds are superior to our graphics cards. (FYI I have 2 8600GT's in SLI but have a much superior overclocked quad-core Q6600 CPU.)

Most of us come from gaming backgrounds and/or from a time when using HW acceleration on new-gen GPUs not only freed up computational power for the CPU, but also optimized the calculations thanks to the customized PLA of the GPU, rather than the generalized architecture of the CPU/mobo.

Thus the rule of thumb was always to use HW acceleration whenever possible. The trends have been apparent in the marketplace: mobos and CPUs have been dropping in price and increasing in bandwidth following Moore's law. However, with this craze to try to do everything on the GPU, those cards are often the most powerful and expensive parts in most high-end gamers' machines.

This makes sense for gamers, since not only are their graphics purposes utilizing many more features specific to gaming, but it also frees-up the CPU for multitasking and computational algorithms, communications, etc used in the game-engine logic.

However, for those of us who are primarily using our systems for an HTPC, I see two fundamental faults with this trend:
  1. Even when processing h.264/AVC/etc, much of the GPU's functionality is not utilized (specifically many 3D features....even in MadVR!) And since they are such specialized PLA's, it cannot be used for anything else.
  2. Watching (or even editing) a video does not require as much "generic" work that a gamer or engineer needs, such as database queries or complex algorithms...

Thus many of us have enough spare processing power on our main CPU/bus to let the CPU take on the intensive task of decoding, rather than cramming it into the GPU, which has to do the rendering step no matter what.

Furthermore, since Moore's law is now crippled by other limiting factors such as bus speed and HDD access, most decent processors still have excess processing power available while they sit in the pipeline waiting on other commands. (For instance, I still see latency when my SATA 6Gb/s SSD is communicating with the CPU via my RAM and northbridge... thus I hardly ever peg my CPU even in the most demanding tasks.)

That is where I disagree: at least in my experience, I have had excellent luck simultaneously running software [CPU] decoding and hardware [GPU] rendering. The proof is that I not only see no dropped frames, but I can also multi-task fairly smoothly. When I used DXVA, CUDA, or CPUID for HW decoding, naturally the CPU usage was nil; however, GPU-Z was showing my graphics cards [remember I have SLI] maxed out. Furthermore, I ran a DPC latency checker and it was... spotty, to say the least.

Now I see all of my performance monitors hovering around 40-60% across all cores and processors without any considerable latency. In other words, I am now fully utilizing my entire system.

I should note that this is a very recent successful test and change of thinking for me. I believe much of it has to do with the very latest NVIDIA graphics drivers, madVR, FFmpeg, LAV Splitter, and particularly the new ffmpeg-mt multi-threaded software decoder for H.264/AVC (used in lieu of the stock libavcodec in ffdshow).

I admit, I was taking a little bit of a shot in the dark in regards to this particular issue; however, VERY long story short (lol sorry) I have had much better luck balancing the tasks across both the CPU and the GPU rather than trying to dump-off as much as I can to the GPU while letting the CPU do next to nothing.

Phew, I apologize for the length of that! ...In short, I think it is worthwhile to at least give it a shot. I am glad I did - I was one day away from buying a new (overpriced) GPU before I figured out this magic!

MikeY
Old 7th June 2011, 07:17   #14  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
The problem is that tweaking a system for GPU performance (throughput/latency between GPU and system memory, CPU clocks, async behaviour) and tweaking it for CPU performance are different things, especially with discrete GPUs; on integrated designs like Sandy Bridge or Brazos you won't have these problems. You also have to differentiate between Video Engine load and GPU load; Video Engine load isn't problematic at all. And especially on lower-bandwidth systems like laptops/netbooks you should always try DXVA first instead of NVCUVID (NVIDIA's video API, falsely called "CUDA" by many).
And yes, getting CPU playback flawless on a high-performance dual- or quad-core is rather easy if multithreading is supported. Granted, there are still a lot of issues with GPUs, especially OS-wise: we have no real preemptive multitasking on the driver side yet, not with WDDM 1.1, and it is still up in the air what the plans from Microsoft and the GPU vendors are; with Windows 8 we might see the first changes.

I hope we soon get more information on what happened in that regard; http://download.microsoft.com/downlo...ri103_wh06.ppt is a little outdated these days.

We are surely not there yet, so I will never be surprised by users who report frame drops, especially those who don't take other problematic factors into account, like low timer resolution and background applications such as AIDA, Afterburner or other hardware pollers that can interfere; unfortunately Windows is no RTOS :P

I hope at PDC this year we finally get an update on the state of things beyond WDDM 1.1; I'm a little surprised to hear nothing about it in the Windows 8 milestones yet. It seems no one has looked that far down the stack and everyone is only interested in the layers above, which is IMHO rather uninteresting consumer stuff.

Last edited by CruNcher; 7th June 2011 at 07:56.
Old 7th June 2011, 07:18   #15  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,336
Quote:
Originally Posted by Mikey2 View Post
That is where I disagree: at least in my experience, I have had excellent luck simultaneously running software [CPU] decoding and hardware [GPU] rendering. The proof is that I not only see no dropped frames, but I can also multi-task fairly smoothly. When I used DXVA, CUDA, or CPUID for HW decoding, naturally the CPU usage was nil; however, GPU-Z was showing my graphics cards [remember I have SLI] maxed out. Furthermore, I ran a DPC latency checker and it was... spotty, to say the least.
DXVA decoding (or any kind of hardware decoding) does not benefit from SLI, and I'm unsure if rendering a video does.
The 8600 is a pretty old and slow card (and severely memory limited); running the same content on a fast CPU is not comparable.

If you have a recent GPU, and assuming there are no bugs in the decoder and the GPU driver, GPU decoding is just as smooth as CPU decoding, and it frees up resources for other tasks, for example the frame interpolation some people like to do (and every % of CPU is valuable there, I've been told).

Personally, I use GPU decoding mostly to keep my CPU cool; running CPU decoding on my HTPC makes the poor thing heat up so much that the fan gets noisy. The GPU doesn't have that problem.
In addition, with LAV CUVID I also get excellent deinterlacing with madVR; otherwise I would have to do that in software too (at probably lower quality).
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 7th June 2011, 08:13   #16  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Quote:
Originally Posted by nevcairiel View Post
DXVA decoding (or any kind of hardware decoding) does not benefit from SLI, and I'm unsure if rendering a video does. [...]

If you have a recent GPU, and assuming there are no bugs in the decoder and the GPU driver, GPU decoding is just as smooth as CPU decoding, and it frees up resources for other tasks, for example the frame interpolation some people like to do (and every % of CPU is valuable there, I've been told). [...]
Yep, a very daunting task. It took me some time to optimize my own system to play 1080p flawlessly in real time at double frame rate with the lowest CPU utilization possible (60% is still heavy, though) on a quad-core. The guys from SVP (http://svp-team.com/) made a major step here in taking load off the CPU (and there is still room for improvement); it's really recommended and blows every ISV solution away, at least in quality (Corel, ArcSoft, CyberLink, Mirillis). Hehe, the Russians again, though without manao we most probably wouldn't be there yet, so hail to France too.

And doing stuff like this http://www.greenparrotpictures.com/slomo.php in real time, or like www.thefoundry.co.uk, where even these days big workstations are used, on a consumer system (with a four-year-old GPU) is amazing.

The big problem is that you need to be a master to use it efficiently in an NLE environment, hehe: http://www.youtube.com/watch?v=M9TJuxo6lNI. Not very convenient for average Joe users.

PS: I also let myself get carried away off-topic; the main task was still to help the thread creator, but I have to say I'm pretty much at the end of possible answers to his issue. It's so crazily strange that it works once and then never again until a driver reinstall; it almost sounds like some kind of restriction the driver enforces after a reboot or after any use of the GPU at all (not the video engine) :P (unlikely - not impossible, but unlikely). A better explanation would be that the video memory is never fully cleaned up and then 1080i fails (in the context of only 256 MB), but that's also unlikely, since it doesn't work after a reboot at all either.
Wait, did you try turning off Aero on Win7? You unfortunately can't turn off the DWM entirely, that only works on Vista if I remember right; but then again it also shows the same issue on XP (where I still can't believe a driver reinstall actually makes the video engine usable at all without a reboot), as you said. So, nah, I'm too confused by this issue; if anyone else has a theory or knows about this "works once" behaviour, I guess many would like to know what's going on with this user's NVIDIA Windows system.

Last edited by CruNcher; 7th June 2011 at 09:39.
Old 7th June 2011, 20:35   #17  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
Quote:
Originally Posted by CruNcher View Post
[...] A better explanation would be that the video memory is never fully cleaned up and then 1080i fails (in the context of only 256 MB), but that's also unlikely, since it doesn't work after a reboot at all either.
Wait, did you try turning off Aero on Win7? [...] but then again it also shows the same issue on XP (where I still can't believe a driver reinstall actually makes the video engine usable at all without a reboot), as you said [...]
Back to the topic.

I'm sure it's not a low-memory problem, because the problem doesn't only appear after running heavy graphics programs; any program that uses the GPU triggers it.
I even did a clean install but got the same result.
It seems I need an easier way to reset the graphics driver.

Edit: You were right. It didn't work in XP, because the installer doesn't reset the video driver there (no blank screen during installation, and it asked to restart).
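One idea I may try (untested, just a sketch): restarting the display device with devcon.exe from the Windows Driver Kit instead of reinstalling the whole driver. It needs an elevated prompt, and I don't know yet whether it resets the video engine the same way a reinstall does:

Code:
import subprocess

# "=display" targets devices in the Display class; run "devcon find =display"
# first to confirm the NVIDIA adapter shows up. devcon.exe ships with the WDK,
# not with Windows itself.
subprocess.run(["devcon.exe", "restart", "=display"], check=True)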

Last edited by P.J; 7th June 2011 at 21:42.
Old 9th June 2011, 04:48   #18  |  Link
Mikey2
Registered User
 
Join Date: Nov 2010
Posts: 80
Quote:
Originally Posted by P.J View Post
Back to the topic.

I'm sure it's not a low-memory problem, because the problem doesn't only appear after running heavy graphics programs; any program that uses the GPU triggers it.
I even did a clean install but got the same result.
It seems I need an easier way to reset the graphics driver.

Edit: You were right. It didn't work in XP, because the installer doesn't reset the video driver there (no blank screen during installation, and it asked to restart).
I apologize, P.J, for forcing this a bit off-topic (although, if you really wanted to get into it, much of this may apply to you... mainly that you might also see a performance boost from software decoding).

And thanks to the rest of you for bearing with me and actually responding with real insight to my overly long post. It involved some topics that are quite pressing to me. (For instance, I also just discovered LAV deinterlacing... I love it!)

While this stuff fascinates me, I do not have the time for another long post, but I am learning a lot - when I have time I may start another thread. But to sum it up, I think it comes down to the fact that P.J. and I are just really pushing it running the 8600 GT - partly because of the video memory usage, which is higher in GPU-Z than I am comfortable with, although I agree with P.J. that that doesn't seem to be the main issue.

P.J. - So is your issue fixed now? I overlooked the fact that you are on XP - that changes several things. (E.g. Aero does much more than making windows look pretty and transparent...)

Thanks again for all of the info!
MikeY

PS - I was shocked to see that SLI did help me out when using Cuda (or whatever it is called ) ...Shockingly, GPU-Z showed ALL of the "Video Engine Load" on the secondary GPU. With it turned off, it was obviously on the main GPU. Consequently, when running higher-powered upscaling like 4-tap Spline, I received far fewer frame-drops. This is of course only when using HW decoding, since that is what maps to that process-view.
Old 9th June 2011, 07:45   #19  |  Link
P.J
Δ
 
Join Date: Jun 2008
Posts: 535
No problem, and thanks for your reply.
Overclocking doesn't help. Now I have only two options: that crazy driver-reinstall method or Splash + CPU.
Splash was the only program that could match the GPU's playback quality, and it doesn't need any third-party components either.
Old 9th June 2011, 09:46   #20  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Quote:
Originally Posted by Mikey2 View Post
I apologize, P.J, for forcing this a bit off-topic [...] But to sum it up, I think it comes down to the fact that P.J. and I are just really pushing it running the 8600 GT [...]

P.J. - So is your issue fixed now? I overlooked the fact that you are on XP - that changes several things. (E.g. Aero does much more than making windows look pretty and transparent...)
Yeah, his problem is a real edge case, and he said he tried on both XP and 7. Vista/7 with DWM and Aero can indeed be beneficial, but might not be in such edge cases; I'm not sure about that yet (my own experience with NT 6 is limited). On one side it's beneficial to have the GUI running on the GPU (performance, main-memory reduction, preventing application/system stalls, etc.); on the other side it also brings problems that have to be compensated for cleverly (latency, rendering modes, multitasking, etc.), and we are currently only at WDDM 1.1 on Win7, with a lot of room for improvement that we will hopefully see in the next step with Windows 8.
Anyway, his problem is crazy because it actually works once (and logic says that if the graphics memory really were too low for 1080i, it wouldn't work at all, not even once; or it would work once and then, after a reboot, once again), but then stops forever and can only be recovered, and again only temporarily, by reinstalling the driver (Win7). That is something very unusual that I have never come across before when troubleshooting NVIDIA-based systems. (Actually, I know that NVIDIA's driver has an adaptive decision system built in for deinterlacing that makes quality decisions depending on hardware capabilities, and it might be failing here in an edge case somehow.) But the possible causes on his system can go down to the hardware level, especially BIOS problems, where most of us have no access anyway (don't forget this is a laptop), and only the manufacturer could handle that in the end. Though, as I already explained, they most probably won't, since they don't have to: 1080i playback was not an advertised feature of the system when he bought it, and the support or warranty may have run out as well. So he's left on his own.

As I said, there would still be possibilities, at least for getting a better overview of whether this is OS-dependent on his machine, by installing Linux and trying it via VDPAU instead of DXVA or NVCUVID (the same decision system is active in the Linux driver too, but there it can at least be overridden manually).

Last edited by CruNcher; 9th June 2011 at 10:28.
Tags
avchd, bluray, nvidia, stutter, vp2
