Doom9's Forum > Video Encoding > MPEG-4 AVC / H.264

Old 9th March 2007, 21:45   #1  |  Link
grumpy
Registered User
 
grumpy's Avatar
 
Join Date: Mar 2002
Posts: 30
Re: In the news: Apple $50 hardware chip

If you read the news section here today, there is a rumor that Apple will add hardware H.264 encoding and decoding, and I'm just curious:

1. Is this enough of a reason for you to buy a Mac?

2. Does anyone know what kind of quality you could expect from the chip they may use?

3. If the quality is good, is there any other company that you've heard of offering, or working on offering, this in an add-on board aimed at consumer-level pricing?

Last edited by grumpy; 9th March 2007 at 21:56. Reason: me grammer good
Old 9th March 2007, 22:12   #2  |  Link
dungweaver
Registered User
 
dungweaver's Avatar
 
Join Date: Feb 2005
Location: Australia
Posts: 14
I'm interested in this too. I'm wondering if there's a HAL-type layer, or even just a driver, that invokes this hardware encoder.

Hopefully there are people out there who use current hardware (MPEG-2, even MPEG-4) encoders who can enlighten us.
__________________
Across your open mind I trace erratic lines - in motion and in time
Old 10th March 2007, 00:24   #3  |  Link
JohnnyMalaria
Registered User
 
Join Date: Sep 2006
Posts: 602
Quote:
Originally Posted by grumpy View Post
1. Is this enough of a reason for you to buy a Mac?
Buy a new computer to get a $50 encoder chip?

Not much point, especially if you already have an XP system. For $79, you can get a USB H.264 encoder that can encode at 5x real-time speed:

http://www.adstech.com/products/RDX-...sp?pid=RDX-160

Quote:
Quick Video Conversion for your Video iPod, PSP and Mobile Phone

Import ANY Video File saved on your PC Into Instant Video To-Go's Simple Interface*
Hardware Accelerator Encodes Your Video Files Into H.264 (.mp4)
Synchronize Your Movies to Your iPod with Video, Mobile Phone or Portable Media Player

ADS Tech's Instant Video To-Go, Video Transfer Accelerator Uses Hardware Compression.

Converts up to 5 Times Faster Than Real Time
Superior Quality Encoding
PC Compatible
__________________
John Miller
Enosoft DV Processor - Free for personal use
Old 10th March 2007, 00:24   #4  |  Link
giandrea
Registered User
 
Join Date: Sep 2004
Location: Italy
Posts: 154
I really doubt that Apple will introduce specific hardware H.264 decoding/encoding chips in the next Macs. What they will do, in my opinion, is use the video card's capabilities for these tasks.

MPEG-2 decoding is already available on both Windows and Mac OS X (via Apple DVD Player only). Hardware H.264 decoding with Nvidia and ATI cards is already available on Windows. They will probably add it in the next release of Mac OS X (Leopard).

They have probably also come up with a method to encode to H.264 using your graphics card. It is possible and has been discussed on the x264 mailing list, just not implemented in open source.

So my guess is that they will have these new features in Mac OS X Leopard, and in QuickTime 8 too.
Old 10th March 2007, 01:36   #5  |  Link
akupenguin
x264 developer
 
akupenguin's Avatar
 
Join Date: Sep 2004
Posts: 2,393
Actually, what we decided on the x264 mailing list was that a current high-end CPU is faster than a high-end GPU for video encoding. None of the chips discussed was anywhere near $50, though.
Old 10th March 2007, 07:06   #6  |  Link
foxyshadis
ангел смерти (angel of death)
 
foxyshadis's Avatar
 
Join Date: Nov 2004
Location: Lost
Posts: 9,174
Curious, but what about doing reference frame decoding on the GPU? Well, now that I think about it, there's no CABAC and you already have the mocomped reference, so all you'd do is dequantize and add. Hmm. Or what about shoveling data into it and doing CABAC there? Basically, I'm just trying to figure out which parts are the least integrated and thus the most offloadable, but you'd know far better than I would.
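For a sense of how little work that "dequantize and add" step is, here is a toy 4x4 reconstruction sketch in C. It is a simplification, not decoder code: the single scale factor stands in for the real per-position H.264 dequant tables, and the inverse 4x4 transform that sits between the two steps is omitted.

```c
#include <stdint.h>

/* Clip a reconstructed sample to the 8-bit range. */
static uint8_t clip8(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* Toy 4x4 block reconstruction: scale the quantized coefficients and
 * add them to the motion-compensated prediction. A real H.264 decoder
 * uses per-position dequant tables and runs the inverse 4x4 transform
 * between the two steps; both are omitted here to show how little
 * arithmetic this stage involves. */
void dequant_add_4x4(uint8_t dst[16], const uint8_t pred[16],
                     const int16_t coef[16], int scale)
{
    for (int i = 0; i < 16; i++)
        dst[i] = clip8(pred[i] + coef[i] * scale);
}
```

A handful of multiply-adds per block is cheap on any CPU, which is the crux of whether offloading it could ever pay for the transfer to the GPU.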
__________________
There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order. ~ Ed Howdershelt
Old 10th March 2007, 10:57   #7  |  Link
simonhowson
Registered User
 
Join Date: Jan 2004
Posts: 122
Quote:
Originally Posted by JohnnyMalaria View Post
Buy a new computer to get a $50 encoder chip?

Not much point, especially if you already have an XP system. For $79, you can get a USB H.264 encoder that can encode at 5x real-time speed:

http://www.adstech.com/products/RDX-...sp?pid=RDX-160
Do you know what the quality is like when using this hardware encoder?
Old 11th March 2007, 09:18   #8  |  Link
akupenguin
x264 developer
 
akupenguin's Avatar
 
Join Date: Sep 2004
Posts: 2,393
A GPU is really bad at doing CABAC, since CABAC is inherently a serial process while a GPU gets most of its speed by having lots of parallel processors.
There's no point in offloading just dequant+iDCT to the GPU, since that takes about 1% of ffh264's cpu-time.
So the only reasonable division of labor in a decoder is: CPU does CABAC, GPU does everything involving pixels and DCT.
Old 11th March 2007, 13:58   #9  |  Link
giandrea
Registered User
 
Join Date: Sep 2004
Location: Italy
Posts: 154
Quote:
Originally Posted by akupenguin View Post
A GPU is really bad at doing CABAC, since CABAC is inherently a serial process while a GPU gets most of its speed by having lots of parallel processors.
There's no point in offloading just dequant+iDCT to the GPU, since that takes about 1% of ffh264's cpu-time.
So the only reasonable division of labor in a decoder is: CPU does CABAC, GPU does everything involving pixels and DCT.
So perhaps they won't use CABAC. If they don't, perhaps the compression will suffer, but they will get a big speed gain.
Old 12th March 2007, 16:25   #10  |  Link
akupenguin
x264 developer
 
akupenguin's Avatar
 
Join Date: Sep 2004
Posts: 2,393
A decoder can't choose not to use CABAC; that's a property of the input stream. Apple's software encoder already doesn't use CABAC, so it would be no surprise if their hardware encoder doesn't either.