Old 6th September 2008, 03:49   #1401  |  Link
Ranguvar
Registered User
 
Join Date: Feb 2007
Location: ::1
Posts: 1,236
This I am definitely looking forward to. However, is there a way you can gauge GPU usage? I use FFT3DGPU a lot, and while I'm pretty sure the GPU should handle both, I'd like an estimate of how much H.264 decoding taxes the GPU. Though I bet the load taken off the CPU when encoding will make up for it.

Thanks for all your work!
Ranguvar is offline  
Old 6th September 2008, 03:51   #1402  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
I will be using the VP2 core, which is separate from the rest of the GPU. We'll have to wait and see.
Guest is offline  
Old 6th September 2008, 07:31   #1403  |  Link
lucassp
Registered User
 
Join Date: Jan 2007
Location: Romania, Timisoara
Posts: 223
Quote:
Originally Posted by neuron2
What video card do you own?
I can also test on an 8800GT and Vista x64.
lucassp is offline  
Old 6th September 2008, 07:52   #1404  |  Link
opieant
Registered User
 
Join Date: May 2004
Posts: 27
Found another minor file name bug. When saving a project, the default output file name is the substring of the file name before the first period character. The name should probably be the substring before the last period so that things like "Dr." and "Mr." are allowed to be part of the file name.
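For illustration, the fix amounts to searching from the right instead of the left; a minimal C sketch (a hypothetical helper, not DGAVCDec's actual code) might look like this:

Code:
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch: build the default output name by cutting at the
   LAST period (strrchr) instead of the first (strchr), so names like
   "Dr. Strangelove.mkv" keep their embedded periods. */
static void default_output_name(const char *input, char *out, size_t outsize)
{
    snprintf(out, outsize, "%s", input);
    char *dot = strrchr(out, '.');   /* strchr() would reproduce the bug */
    if (dot != NULL)
        *dot = '\0';                 /* strip only the real extension */
}

int main(void)
{
    char name[260];
    default_output_name("Dr. Strangelove.mkv", name, sizeof(name));
    printf("%s\n", name);            /* prints: Dr. Strangelove */
    return 0;
}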
opieant is offline  
Old 6th September 2008, 08:09   #1405  |  Link
Ranguvar
Registered User
 
Join Date: Feb 2007
Location: ::1
Posts: 1,236
I have XP Pro x64 and a GeForce 9600GT 512MiB if you need testers.
Ranguvar is offline  
Old 6th September 2008, 08:57   #1406  |  Link
bob0r
Pain and suffering
 
Join Date: Jul 2002
Posts: 1,337
Quote:
Originally Posted by neuron2
I think so.

Waiting for the naysayers to chime in...
Just keep your GPU burning for 24 hours... then let's see what you think.
Many have fried... We need the best of both worlds!
bob0r is offline  
Old 6th September 2008, 09:33   #1407  |  Link
lucassp
Registered User
 
Join Date: Jan 2007
Location: Romania, Timisoara
Posts: 223
Quote:
Originally Posted by bob0r
Just keep your GPU burning for 24 hours... then let's see what you think.
Many have fried... We need the best of both worlds!
The GPU will be as hot as when playing a movie in PowerDVD with DXVA on.
lucassp is offline  
Old 6th September 2008, 09:53   #1408  |  Link
crypto
@DVBPortal
 
Join Date: Feb 2004
Posts: 434
Quote:
Originally Posted by neuron2
What video card do you own?
GeForce 8600 GTS
BIOS Information: Version 60.84.50.0.8
Video RAM 256 MB
Shared RAM 766 MB
Total 1022 MB

Driver Version 7.15.11.7783

I have run a CUDA bandwidth check:

Host to Device 2472 MB/s
Device to Host 2471 MB/s
Device to Device 9727 MB/s
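For reference, a check along these lines can be written against the CUDA runtime's C API (a minimal sketch, not the SDK's bandwidthTest sample, with error checking omitted):

Code:
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

/* Rough host-to-device bandwidth measurement using CUDA events.
   The SDK's bandwidthTest sample does this more carefully
   (pinned memory, many iterations, all three directions). */
int main(void)
{
    const size_t bytes = 32 << 20;               /* 32 MiB test buffer */
    void *host = malloc(bytes);
    void *dev  = NULL;
    cudaMalloc(&dev, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("Host to Device: %.0f MB/s\n",
           (bytes / (1024.0 * 1024.0)) / (ms / 1000.0));

    cudaFree(dev);
    free(host);
    return 0;
}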
crypto is offline  
Old 6th September 2008, 10:07   #1409  |  Link
Renzz
Registered User
 
Join Date: Jan 2004
Posts: 55
Quote:
Originally Posted by neuron2
Just for fun, I benchmarked an Nvidia 8500GT versus CoreAVC on an E8500 @ 3.8GHz using a 1080p25 AVC video. I forced the frame rate to 120 fps using an Avisynth script.

Nvidia GPU: 66 fps (CPU 4%)
CoreAVC: 60 fps (CPU 60%)

Which would you prefer?
How about both? What about using the GPU and CPU together: 126 fps?
Renzz is offline  
Old 6th September 2008, 11:40   #1410  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
Quote:
Originally Posted by opieant
Found another minor file name bug. When saving a project, the default output file name is the substring of the file name before the first period character. The name should probably be the substring before the last period so that things like "Dr." and "Mr." are allowed to be part of the file name.
Yes, and for the name of the demuxed video too. I'll fix it. Thank you for pointing it out.
Guest is offline  
Old 6th September 2008, 11:48   #1411  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
Hmm, I just looked at the code, and it looks as if it DOES use the rightmost '.' as you want. I tested it and it works as it should. How do you make it fail?
Guest is offline  
Old 6th September 2008, 12:05   #1412  |  Link
squid_80
Registered User
 
Join Date: Dec 2004
Location: Melbourne, AU
Posts: 1,963
Quote:
Originally Posted by lucassp
The GPU will be as hot as when playing a movie in PowerDVD with DXVA on.
No; playing back a movie means frames are processed at the original frame rate, while DGAVCDecode processes frames as fast as the host application calls for them.
squid_80 is offline  
Old 6th September 2008, 13:47   #1413  |  Link
lucassp
Registered User
 
Join Date: Jan 2007
Location: Romania, Timisoara
Posts: 223
Quote:
Originally Posted by squid_80
No; playing back a movie means frames are processed at the original frame rate, while DGAVCDecode processes frames as fast as the host application calls for them.
Yes, true. But it uses only the VP2; no shaders/ROPs are involved.
lucassp is offline  
Old 6th September 2008, 14:09   #1414  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
And the VP2 is the slowest clock domain at 450 MHz. Decoding as fast as I can is not going to overheat the GPU.

Currently the VP2 is clocked the same on all the GPUs, so the low-end 8500GT will decode as well as the 280! Nvidia tells me the VP2 clock rate will be increased in forthcoming GPUs.

Progress report: I have just finished abstracting the libavcodec support into a general decoder interface in DGAVCIndex. I will now implement the CUDA decoder against this interface. To support any new decoder, you just supply a file that implements these functions. This is what the interface looks like:

Code:
extern int decoder_open(void);
extern void decoder_close(void);
extern int decoder_reset(void);
extern int decoder_decode_nalu(int *frameFinished, unsigned char *buf, int len);
extern void decoder_copy_frame(unsigned char *y, unsigned char *u, unsigned char *v);
extern unsigned int decoder_get_width(void);
extern unsigned int decoder_get_height(void);
extern int decoder_get_poc(void);
Later, these will be function pointers to allow for changing the decoder without recompiling.
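A sketch of what that function-pointer version might look like (names here are illustrative, not the actual DGAVCDec code): the same entry points gathered into a table that can be filled in at runtime, e.g. via LoadLibrary/GetProcAddress, so a different decoder can be dropped in without recompiling the host.

Code:
/* Hypothetical function-pointer form of the interface above. */
typedef struct decoder_api {
    int          (*open)(void);
    void         (*close)(void);
    int          (*reset)(void);
    int          (*decode_nalu)(int *frameFinished, unsigned char *buf, int len);
    void         (*copy_frame)(unsigned char *y, unsigned char *u, unsigned char *v);
    unsigned int (*get_width)(void);
    unsigned int (*get_height)(void);
    int          (*get_poc)(void);
} decoder_api;

/* The host then calls through the table instead of fixed symbols: */
static int decode_one(decoder_api *dec, unsigned char *nalu, int len)
{
    int frameFinished = 0;
    return dec->decode_nalu(&frameFinished, nalu, len);
}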

Last edited by Guest; 6th September 2008 at 15:58.
Guest is offline  
Old 7th September 2008, 14:40   #1415  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Great, Donald. Also, now that you have access to the deinterlacer and IVTC, we can compare them for real against the other (software) ones available.
If you still need someone to test it out, I'm here (8800GT G92, 512 MB).
Can you get access to them without being dependent on the video input at all (MPEG-2, VC-1, H.264), so that something like nvidiadeint() or nvidiaivtc() is possible?
__________________
All my compares are riddles, so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is too late!

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 7th September 2008 at 14:48.
CruNcher is offline  
Old 8th September 2008, 06:11   #1416  |  Link
opieant
Registered User
 
Join Date: May 2004
Posts: 27
Quote:
Originally Posted by neuron2
Hmm, I just looked at the code, and it looks as if it DOES use the rightmost '.' as you want. I tested it and it works as it should. How do you make it fail?
Whoops. It happens when there is a period in the name of a file that has no extension; the code correctly cuts at the last period, but that period is part of the name rather than an extension separator.

Looks like Windows considers a file name with periods to have no extension if any spaces follow the final period. Of course, I could just put extensions back on my files to work around this, but what fun would that be?
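If that really is the rule, it amounts to something like the following (a sketch of the behavior as observed above, not of any documented Windows API):

Code:
#include <string.h>

/* The part after the last '.' only counts as an extension if no
   space follows that period. This encodes the behavior observed
   above, not documented Windows semantics. */
static const char *get_extension(const char *name)
{
    const char *dot = strrchr(name, '.');
    if (dot == NULL || strchr(dot, ' ') != NULL)
        return NULL;   /* "The Movie Pt. 1" -> no extension */
    return dot + 1;    /* "movie.mkv"       -> "mkv" */
}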
opieant is offline  
Old 8th September 2008, 12:12   #1417  |  Link
CiNcH
Registered User
 
Join Date: Jan 2004
Posts: 567
Quote:
Currently the VP2 is clocked the same on all the GPUs, so the low-end 8500GT will decode as well as the 280! Nvidia tells me the VP2 clock rate will be increased in forthcoming GPUs.
I'd think later GPUs should be clocked higher, as they officially support decoding two H.264 streams in parallel!? Or do you think that older GPUs are just artificially restricted?
__________________
Bye
CiNcH is offline  
Old 8th September 2008, 14:12   #1418  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
If they are decoded in parallel, why would the clock rate have to be higher? Nvidia told me they all use the same clock rate. But I just found out that the 8500GT has a narrower memory bus and later chips have a wider one, so I can gain about 4 fps by upgrading. I've asked them which chip is the lowest-end one that has the wider bus.

BTW, GPU decoding is basically working now in DGAVCDec. I used a non-optimized approach first to expose possible issues. It revealed that once you open the GPU decoder in a thread, that thread must stay alive. But DGAVCDec was architected to kill the decode thread after a play/preview ends and then recreate it for the next play/preview, so I had to change the decode thread to a server thread that stays alive and waits for a wakeup from the GUI. It also revealed that the correct way to reset the decoder for seeking is NOT what they first told me: we had to destroy and recreate the video parser. That's OK because it's effectively just a delete and new, so it's fast.
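In Win32 terms, the reworked decode thread presumably looks something like this (a minimal sketch with invented names, not the actual DGAVCDec source): the thread is created once, opens the GPU decoder, and then blocks on an event until the GUI signals the next play/preview.

Code:
#include <windows.h>

/* Decoder interface functions from post #1414 above. */
extern int decoder_open(void);
extern void decoder_close(void);
extern int decoder_reset(void);

static HANDLE wake_event;      /* auto-reset event: SetEvent() wakes one wait */
static volatile LONG quitting;

static DWORD WINAPI decode_server(LPVOID arg)
{
    decoder_open();            /* the decoder must live in THIS thread */
    while (!quitting) {
        WaitForSingleObject(wake_event, INFINITE);
        if (quitting)
            break;
        /* ... service one play/preview via decoder_decode_nalu() ... */
        decoder_reset();       /* ready for the next request */
    }
    decoder_close();
    return 0;
}

/* GUI side, once at startup:
     wake_event = CreateEvent(NULL, FALSE, FALSE, NULL);
     CreateThread(NULL, 0, decode_server, NULL, 0, NULL);
   then SetEvent(wake_event) for each play/preview. */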

Next I plan to optimize the solution, fix some GUI issues, and roll out something you can play with.

Last edited by Guest; 8th September 2008 at 14:19.
Guest is offline  
Old 8th September 2008, 14:55   #1419  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
neuron2,
I use both DGIndex and DGAVCIndex regularly, so this is very exciting news. However, I'm strictly an ATI guy right now.

Sorry if I missed it before, but do you envision this capability on the ATI GPUs with DGAVCIndex in the future?
noee is offline  
Old 8th September 2008, 15:19   #1420  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
I haven't ruled it out, but to be honest, I generally spend my time only on things that I need myself. Now, if you showed that ATI GPU decoders perform better than Nvidia ones, then I would need it.
Guest is offline  