Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 17th January 2013, 16:07   #1  |  Link
KornX
primus inter pares
 
 
Join Date: Feb 2004
Posts: 70
OpenCL decompression

hi,
has anybody ever thought about, or tried, OpenCL decompression of H.264 (or maybe VC-1)?
The reason I'm asking is that many hardware video decoders can only handle content up to ~1080p...

KornX
__________________
Field Application Engineer
Advanced Micro Devices
Old 17th January 2013, 16:21   #2  |  Link
LoRd_MuldeR
Software Developer
 
 
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,248
"Hardware" decoding doesn't happen on the actual GPU, but in a special "video engine". NVidia calls that "PureVideo HD" (VPx), AMD calls theirs "Unified Video Decoder" (UVD).

Regardless of which API you use to access the "hardware" decoder (DXVA, CUVID, VDPAU or maybe OpenCL), the limitations on what works (or doesn't) are defined by the hardwired decoding engine...
__________________
Go to https://standforukraine.com/ to find legitimate Ukrainian Charities 🇺🇦✊
Old 17th January 2013, 16:37   #3  |  Link
KornX
primus inter pares
 
 
Join Date: Feb 2004
Posts: 70
I know where it happens. That's exactly why my question is about offloading some of the software decoding from the CPU to the GPU via OpenCL...
So no API would talk to the UVD or anything like it, but directly to the compute shaders...

KornX
__________________
Field Application Engineer
Advanced Micro Devices
Old 17th January 2013, 16:49   #4  |  Link
LoRd_MuldeR
Software Developer
 
 
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,248
This will most likely run into the same problems as GPU-accelerated encoding:

Offloading only some functions of the decoding process to the GPU incurs a severe delay for uploading the inputs from CPU memory to GPU memory and then downloading the results again. This delay can easily destroy any speed benefit, even if the calculation itself runs a lot faster on the GPU. Now you may argue that the upload/download delay can be hidden with "smart" pipelining. That may be true, but it doesn't make things easier.
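To put rough numbers on that trade-off, here is a toy timing model in Python. The per-frame costs are invented assumptions for illustration, not measurements of any real GPU; the point is only that pipelining hides latency per frame but still pays the full round-trip once, and steady-state throughput is capped by the slowest stage:

```python
# Toy per-frame costs in milliseconds (made-up numbers, not benchmarks).
UPLOAD_MS = 4.0    # CPU -> GPU transfer
COMPUTE_MS = 3.0   # GPU kernel time
DOWNLOAD_MS = 4.0  # GPU -> CPU transfer

def sequential_time(frames):
    """Each frame waits for its own upload, compute and download in turn."""
    return frames * (UPLOAD_MS + COMPUTE_MS + DOWNLOAD_MS)

def pipelined_time(frames):
    """3-stage pipeline: transfers overlap with compute on other frames.
    The first frame still pays the full fill latency; after that,
    throughput is limited by the slowest stage."""
    fill = UPLOAD_MS + COMPUTE_MS + DOWNLOAD_MS
    slowest = max(UPLOAD_MS, COMPUTE_MS, DOWNLOAD_MS)
    return fill + (frames - 1) * slowest

if __name__ == "__main__":
    n = 1000
    print(f"sequential: {sequential_time(n):.0f} ms")  # 11000 ms
    print(f"pipelined:  {pipelined_time(n):.0f} ms")   # 4007 ms
```

Even in this idealized model the pipelined version is bound by the 4 ms transfer stage, not the 3 ms compute stage: the PCIe round-trip, not the kernel, sets the ceiling.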

Also, with a fully-fledged H.264 "hardware" decoder available as a hardwired piece of silicon on pretty much any graphics card sold in the last ~7 years, the motivation to write an OpenCL-based "GPU" decoder from scratch is low...

(Once "4K" content becomes more popular, you can be pretty sure that NVidia and AMD will "upgrade" their hardware decoders accordingly - with the next GPU generation)
__________________
Go to https://standforukraine.com/ to find legitimate Ukrainian Charities 🇺🇦✊

Last edited by LoRd_MuldeR; 17th January 2013 at 16:52.
Old 17th January 2013, 19:33   #5  |  Link
Dark Shikari
x264 developer
 
 
Join Date: Sep 2005
Posts: 8,666
GPUs are an order of magnitude too slow to do video decoding, which is largely a linear process and can't be parallelized the way a GPU needs. That's why hardware decoders exist in the first place (besides the fact that they're nice and fast and take almost no power or silicon).
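To illustrate the serial dependency: in a CABAC-style entropy decoder, every decoded decision updates the coder state (interval bounds, context models) that the next decision needs, so the loop cannot be split across thousands of GPU threads. A stripped-down sketch of that structure (not real CABAC, just a toy range-subdivision decoder showing the chain of dependencies):

```python
def decode_bins(bitstream_value, n_bins):
    """Toy range decoder: each decision reads and updates (low, rng),
    so bin k cannot be decoded before bin k-1 has finished.
    bitstream_value is assumed to be < 2**16."""
    low, rng = 0, 1 << 16
    bins = []
    for _ in range(n_bins):
        half = rng >> 1
        if bitstream_value - low >= half:  # depends on current low/rng
            bins.append(1)
            low += half                    # state update feeds the next step
            rng -= half
        else:
            bins.append(0)
            rng = half
    return bins
```

This particular toy just recovers the binary expansion of the input value, but the shape of the loop is the point: there is no independent work to hand out in parallel, which is why a hardwired serial engine beats thousands of shader cores here.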