Old 10th October 2008, 10:31   #1  |  Link
shades
Registered User
 
Join Date: Apr 2005
Location: Australia
Posts: 34
DXVA Encode

Well, it's obvious that the new ATi and Nvidia chips can accelerate decoding for certain video compression formats, so is it possible to do an encode on these GPUs too?

I know the Stanford protein-folding program used the GPU to do its folding work. Is there any interest or development looking into accelerated encoding options for H.264 and the other video compression types being experimented on?
Old 10th October 2008, 10:44   #2  |  Link
nm
Registered User
 
Join Date: Mar 2005
Location: Finland
Posts: 2,643
search -> badaboom:
H.264 baseline GPU accelerated encoder
GPU transcoding
80% faster encoding? [hardware]
DXVA encoding era has started
CUDA encoders for H.264, hows the quality?
...
nm is offline   Reply With Quote
Old 10th October 2008, 15:28   #3  |  Link
Sagekilla
x264aholic
 
Join Date: Jul 2007
Location: New York
Posts: 1,752
Can we make a sticky thread about all this GPU accelerated encoding nonsense? It'd save a lot of time and questions -- plus provide one central thread for all the information.
__________________
You can't call your encoding speed slow until you start measuring in seconds per frame.
Sagekilla is offline   Reply With Quote
Old 10th October 2008, 15:35   #4  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,949
Yep, I suggest keeping the baseline thread and merging everything else into it, because baseline is what Badaboom will be released as. There will also be no PRO version anymore, only Badaboom. Speed for the last beta4 increased a little.
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is too Late!

http://forum.doom9.org/showthread.php?t=168004
Old 13th October 2008, 00:50   #5  |  Link
shades
Registered User
 
Join Date: Apr 2005
Location: Australia
Posts: 34
Quote:
Originally Posted by CruNcher
Yep, I suggest keeping the baseline thread and merging everything else into it, because baseline is what Badaboom will be released as. There will also be no PRO version anymore, only Badaboom. Speed for the last beta4 increased a little.
Isn't Badaboom only for Nvidia? What about ATi?
Old 13th October 2008, 02:13   #6  |  Link
Ranguvar
Registered User
 
Join Date: Feb 2007
Location: ::1
Posts: 1,236
Bottom line: GPUs rock at floating-point computations, but not so much at everything else. There's little to no floating-point in encoding. Therefore, unless either of those two conditions changes, just get a good CPU.
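To make that concrete, here is a minimal sketch (my own illustration, not from any real encoder) of the sum-of-absolute-differences loop at the heart of motion estimation, which is where encoders spend most of their time. It's all 8-bit integer arithmetic:

Code:
#include <stdint.h>
#include <stdlib.h>

/* SAD between a 16x16 source block and a candidate block in the
   reference frame. Pure integer work: loads, subtracts, abs, adds. */
static unsigned sad_16x16(const uint8_t *src, const uint8_t *ref,
                          int stride)
{
    unsigned sad = 0;
    for (int y = 0; y < 16; y++) {
        for (int x = 0; x < 16; x++)
            sad += abs(src[x] - ref[x]);
        src += stride;
        ref += stride;
    }
    return sad;
}
An encoder runs loops like this over thousands of candidate blocks per frame, which is why integer throughput (and integer SIMD) matters far more than floating-point here.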
Old 13th October 2008, 06:28   #7  |  Link
shades
Registered User
 
Join Date: Apr 2005
Location: Australia
Posts: 34
Quote:
Originally Posted by Ranguvar
Bottom line: GPUs rock at floating-point computations, but not so much at everything else. There's little to no floating-point in encoding. Therefore, unless either of those two conditions changes, just get a good CPU.
If this were true, then why offload decoding to the GPU for acceleration?
Old 13th October 2008, 06:37   #8  |  Link
Shinigami-Sama
Solaris: burnt by the Sun
 
Join Date: Oct 2004
Location: /etc/default/moo
Posts: 1,923
Quote:
Originally Posted by shades
If this were true, then why offload decoding to the GPU for acceleration?
Because until recently software decoders sucked like a [insert euphemism here], so DXVA was needed. Now that decoders have sped up and hardware has gotten much cheaper, it's not really needed, but when it works right (which can be a pain) it's much smoother than software.
__________________
Quote:
Originally Posted by benjust
Interlacing and telecining should have been but a memory long ago... unfortunately still just another bizarre weapon in the industry's war on image quality.
Old 13th October 2008, 06:58   #9  |  Link
lucassp
Registered User
 
Join Date: Jan 2007
Location: Romania, Timisoara
Posts: 223
Quote:
Originally Posted by shades
If this were true, then why offload decoding to the GPU for acceleration?
Because the GPUs have an independent small piece of silicon (VP1/2/3 and UVD/HD) that only does decoding. For encoding you need to write an app which uses the shader hardware.
Old 13th October 2008, 07:35   #10  |  Link
G_M_C
Registered User
 
Join Date: Feb 2006
Posts: 1,076
Quote:
Originally Posted by shades
If this were true, then why offload decoding to the GPU for acceleration?
I've got the feeling it is mostly because it is a fashionable thing to say...

(in marketing terms, that is)
Old 6th September 2009, 23:03   #11  |  Link
Pensive
Registered User
 
Join Date: Mar 2006
Posts: 11
I've noticed that DXVA-encoded movies don't look anywhere near as crisp as CPU-encoded ones. Obviously this has a lot to do with the particular implementation of the shader hardware, but there may also be inherent video "colourations" from using the graphics chip to do this job. This is a guess really, but old-school Canopus DV Rex RT hardware MPEG encoders tended to apply a "smoothing" algorithm beforehand, which reduced data-processing requirements and gave a perceived "better quality" encode. It wasn't better, though. It looked "smoother" on CRT tellies at the time, but detail was lost at the end of the day.
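To illustrate the kind of pre-smoothing I mean, a hypothetical 3x3 box blur (a sketch only, not Canopus's actual algorithm) would look like this; it averages away high-frequency detail so the encoder has less to do:

Code:
#include <stdint.h>

/* Hypothetical 3x3 box-blur pre-filter on a w-by-h greyscale plane.
   Averaging each pixel with its eight neighbours discards fine
   detail, making the frame cheaper to encode but visibly softer.
   Borders are skipped for brevity. */
void smooth_prefilter(const uint8_t *in, uint8_t *out, int w, int h)
{
    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            int sum = 0;
            for (int dy = -1; dy <= 1; dy++)
                for (int dx = -1; dx <= 1; dx++)
                    sum += in[(y + dy) * w + (x + dx)];
            out[y * w + x] = (uint8_t)(sum / 9);
        }
    }
}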

Shaders are just like little processors though, right? Is there a way to utilise them without the image being "processed" by the unavoidable stages of the graphics chip's rendering pipeline?
__________________
_______________________________________
www.blurayjedi.com
Transcode, Store, and Enjoy your BluRay Discs
Old 7th September 2009, 08:36   #12  |  Link
nm
Registered User
 
Join Date: Mar 2005
Location: Finland
Posts: 2,643
Quote:
Originally Posted by Pensive
I've noticed that DXVA-encoded movies don't look anywhere near as crisp as CPU-encoded ones. Obviously this has a lot to do with the particular implementation of the shader hardware, but there may also be inherent video "colourations" from using the graphics chip to do this job. This is a guess really...
Nope, graphics hardware doesn't cause any artifacts; it's just a computing device. Quality issues are due to the (software) encoder implementations using the GPU. Currently they are simply poor compared to good CPU-based encoders.

Last edited by nm; 7th September 2009 at 08:40.
Old 7th September 2009, 08:51   #13  |  Link
deets
Registered User
 
Join Date: Jan 2005
Location: london, england
Posts: 509
Not directly related, but if you offload the resizing and deinterlacing onto the graphics card via neuron2's Nvidia CUDA-specific programs, you will see a nice speed-up, as the CPU no longer needs to worry about those things.

Edit: plus the deinterlacing quality is the best I've seen if done via the hardware.

Last edited by deets; 7th September 2009 at 08:51. Reason: added some more
Old 7th September 2009, 13:31   #14  |  Link
Pensive
Registered User
 
Join Date: Mar 2006
Posts: 11
So the shaders are limited in their functionality, and this results in more inaccurate processing than a CPU running x264 would produce.

The end result I have seen is that all the detail from a 1080p movie is lost, so much so that I can't tell all that much difference between it and a 720p CPU-encoded version.

Great for realtime video editing, for example (as long as the final render to disc takes place on the CPU), but I would say that those shaders will always be poor quality compared to the flexibility and mathematical accuracy of a CPU. With quad cores running at 4 and 5 GHz, OK, you're not at realtime, but what really is the benefit? You encode a video just once and watch it many times. Surely it's worth the extra hour or two for the huge increase in sharpness?
__________________
_______________________________________
www.blurayjedi.com
Transcode, Store, and Enjoy your BluRay Discs
Old 7th September 2009, 13:50   #15  |  Link
ajp_anton
Registered User
 
Join Date: Aug 2006
Location: Stockholm/Helsinki
Posts: 775
Quote:
Originally Posted by Pensive
With quad cores running at 4 and 5 GHz, OK, you're not at realtime
No, you're probably at multiple times realtime if you encode at the same quality as the GPU-based encoders.
Old 7th September 2009, 14:26   #16  |  Link
nm
Registered User
 
Join Date: Mar 2005
Location: Finland
Posts: 2,643
Quote:
Originally Posted by Pensive
So the shaders are limited in their functionality, and this results in more inaccurate processing than a CPU running x264 would produce.
Shaders or stream processors are limited when compared to a CPU core, but they don't cause inaccurate results. As I said, quality is entirely up to the encoder implementation (software algorithms run on both the CPU and the GPU). One could certainly write an encoder that uses the GPU and gives as high quality as x264. It's just that nobody has done that yet.
Old 7th September 2009, 18:22   #17  |  Link
Zelos
Registered User
 
Join Date: May 2007
Location: Marseille
Posts: 73
http://forum.doom9.org/showthread.php?t=141104

neuron2 did it.
Old 7th September 2009, 18:31   #18  |  Link
Dark Shikari
x264 developer
 
Join Date: Sep 2005
Posts: 8,689
Nice thread necromancy.
Old 7th September 2009, 18:33   #19  |  Link
nm
Registered User
 
Join Date: Mar 2005
Location: Finland
Posts: 2,643
Quote:
Originally Posted by Zelos
http://forum.doom9.org/showthread.php?t=141104

neuron2 did it.
That's a decoder, using dedicated hardware.

Last edited by nm; 7th September 2009 at 18:36.
Old 7th September 2009, 22:43   #20  |  Link
Pensive
Registered User
 
Join Date: Mar 2006
Posts: 11
Quote:
Originally Posted by nm
Shaders or stream processors are limited when compared to a CPU core, but they don't cause inaccurate results. As I said, quality is entirely up to the encoder implementation (software algorithms run on both the CPU and the GPU).
Not to be contrary (in fact, I'm finding your responses most enlightening), but I've been coding since around 1990, and if I've learnt one thing, it's that a LongDIV instruction can have a wildly differing level of accuracy from one CPU to another.

Now, these shaders: they have an instruction set, probably a smaller instruction set than a CPU's. I'm going to assume so (rather naughtily), without having googled it, as it's late.

Therefore, an "implementation" of, for example, inverse discrete cosine transforms (such as those required for DVD playback) can be coded into an ATI K6-2-500mhz using 3dNOW! and also into a P3-733 MMX and get differing quality.
The reason is that the speed which can be achieved from these cores stems from the "estimated" but very fast high speed instructions from the mmx chipset (or 3dNOW! set).
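As a concrete sketch of what I mean by "estimated" instructions (my own example, assuming an SSE-capable x86 compiler, not taken from any codec): SSE's RCPSS returns a reciprocal estimate guaranteed only to about 12 bits of precision, and the exact bits it returns can differ between CPU models, whereas a real divide is exact.

Code:
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics */

int main(void)
{
    float x = 7.0f;

    /* Exact IEEE single-precision division. */
    float exact = 1.0f / x;

    /* RCPSS: hardware reciprocal estimate. Fast, but only accurate
       to roughly 12 bits, and implementation-defined per CPU. */
    float approx = _mm_cvtss_f32(_mm_rcp_ss(_mm_set_ss(x)));

    printf("exact  = %.9f\napprox = %.9f\n", exact, approx);
    return 0;
}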

Quote:
Originally Posted by nm
One could certainly write an encoder that uses the GPU and gives as high quality as x264. It's just that nobody has done that yet.
If you could write the algorithm to work at identical quality to the CPU implementation, would you actually get a speed increase?

(The answer is of course dependent on the GPU, SLI setup, etc.; assume one Radeon 4800.)
__________________
_______________________________________
www.blurayjedi.com
Transcode, Store, and Enjoy your BluRay Discs