Old 22nd March 2012, 12:12   #1  |  Link
zn
Registered User
 
Join Date: Jan 2009
Posts: 88
nvenc - official nvidia kepler hardware h264 encoder

NVIDIA releases the 301.10 WHQL driver for the GeForce GTX 680, with support for new technology:

NVIDIA NVENC Support - adds support for the new hardware-based H.264 video encoder in GeForce GTX 680, providing up to 4x faster encoding performance while consuming less power.
zn is offline   Reply With Quote
Old 22nd March 2012, 15:44   #2  |  Link
Selur
Registered User
 
Join Date: Oct 2001
Location: Germany
Posts: 7,255
Encoding on GPUs doesn't have to get faster, it has to get better quality...
__________________
Hybrid here in the forum, homepage
Selur is offline   Reply With Quote
Old 22nd March 2012, 18:37   #3  |  Link
cengizhan
Registered User
 
Join Date: May 2003
Location: Turkey
Posts: 98
Quote:
Originally Posted by Selur View Post
Encoding on GPUs doesn't have to get faster, it has to get better quality...
Which we haven't seen from any of the supported GPUs, whether ATI or NVIDIA. ATI has very poor encoding quality. There is no benefit if the quality can't compete with x264.
cengizhan is offline   Reply With Quote
Old 25th March 2012, 14:08   #4  |  Link
iwod
Registered User
 
Join Date: Apr 2002
Posts: 756
Quote:
Originally Posted by Selur View Post
Encoding on GPUs doesn't have to get faster, it has to get better quality...
One thing is that we could transcode 1080p HD video for mobile devices in real time. In this usage scenario (which should become more popular later), the faster the better.

It looks like it is 30% faster than Intel Quick Sync, but Intel has also promised much faster encoding with Quick Sync 2.0 in Ivy Bridge.

Let's see which one is better for hardware encoding.
iwod is offline   Reply With Quote
Old 22nd March 2012, 19:01   #5  |  Link
mandarinka
Registered User
 
Join Date: Jan 2007
Posts: 729
It also needs to consume much less power than the old shader-based stuff did, while keeping CPU utilisation the same.
And that is what these fixed-function blocks should be good at, hence all three players implementing them. BTW, I've read somewhere that the VGA guys also need fast and efficient encoders for wireless display connection technology, since it might need compression, at least for higher resolutions and refresh rates. (Dunno if that is just BS or actual fact.)
mandarinka is offline   Reply With Quote
Old 22nd March 2012, 20:20   #6  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 1,120
info from nvidia's website:

NVENC
All Kepler GPUs also incorporate a new hardware-based H.264 video encoder, NVENC. Prior to the introduction of Kepler, video encoding on previous GeForce products was handled by encode software running on the GPU's array of CUDA Cores. While the CUDA Cores were able to deliver tremendous performance speedups compared to CPU-based encoding, one downside of using these high-speed processor cores to process video encoding was increased power consumption. By using specialized circuitry for H.264 encoding, the NVENC hardware encoder in Kepler is almost four times faster than our previous CUDA-based encoder while consuming much less power.

It is important to note that an application can choose to encode using both NVENC hardware and NVIDIA's legacy CUDA encoder in parallel, without negatively affecting each other. However, some video pre-processing algorithms may require CUDA, and this will result in reduced performance from the CUDA encoder since the available CUDA Cores will be shared by the encoder and pre-processor.

NVENC provides the following:
  • Can encode full HD resolution (1080p) videos up to 8x faster than real-time. For example, in high performance mode, encoding of a 16 minute long 1080p, 30 fps video will take approximately 2 minutes.
  • Support for H.264 Base, Main, and High Profile Level 4.1 (same as the Blu-ray standard)
  • Support for MVC (Multiview Video Coding) for stereoscopic video, an extension of H.264 which is used for Blu-ray 3D
  • Up to 4096x4096 encode

We currently expose NVENC through proprietary APIs, and provide an SDK for development using NVENC. Later this year, CUDA developers will also be able to use the high performance NVENC video encoder. For example, you could use the compute engines for video pre-processing and then do the actual H.264 encoding in NVENC. Alternatively, you can choose to improve overall video encoding performance by running simultaneous parallel encoders in CUDA and NVENC, without affecting each other's performance.

NVENC enables a wide range of new use cases for consumers:
  • HD videoconferencing on mainstream notebooks
  • Sending the contents of the desktop to the big screen TV (gaming, video) through a wireless connection
  • Authoring high quality Blu-ray discs from your HD camcorder

A beta version of Cyberlink MediaEspresso with NVENC support is now available on the GeForce GTX 680 press FTP. Support will be coming soon for Cyberlink PowerDirector and Arcsoft MediaConverter.

source: page 26 of: http://www.geforce.com/Active/en_US/...aper-FINAL.pdf
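
For anyone curious what driving NVENC from code looks like: below is a minimal sketch in C of opening an encode session through the nvEncodeAPI.h entry points from NVIDIA's NVENC SDK. This is only an illustration of the call flow (create the function table, then open a session on a CUDA device); the structure and function names come from the public SDK headers, and error handling, encoder configuration and the actual frame submission are omitted, so treat it as a sketch rather than a working encoder.

    #include <stdio.h>
    #include <cuda.h>
    #include "nvEncodeAPI.h"   /* header from the NVENC SDK */

    int main(void)
    {
        /* Load the NVENC entry points into a function table. */
        NV_ENCODE_API_FUNCTION_LIST nvenc = { NV_ENCODE_API_FUNCTION_LIST_VER };
        if (NvEncodeAPICreateInstance(&nvenc) != NV_ENC_SUCCESS) {
            fprintf(stderr, "NVENC not available (driver or GPU too old?)\n");
            return 1;
        }

        /* A CUDA context is one of the device types NVENC can attach to. */
        CUdevice  dev;
        CUcontext ctx;
        cuInit(0);
        cuDeviceGet(&dev, 0);
        cuCtxCreate(&ctx, 0, dev);

        /* Open an encode session on that context. */
        NV_ENC_OPEN_ENCODE_SESSION_EX_PARAMS params =
            { NV_ENC_OPEN_ENCODE_SESSION_EX_PARAMS_VER };
        params.device     = ctx;
        params.deviceType = NV_ENC_DEVICE_TYPE_CUDA;
        params.apiVersion = NVENCAPI_VERSION;

        void *encoder = NULL;
        if (nvenc.nvEncOpenEncodeSessionEx(&params, &encoder) != NV_ENC_SUCCESS) {
            fprintf(stderr, "could not open NVENC session\n");
            return 1;
        }

        /* ...configure with nvEncInitializeEncoder, feed frames, then... */
        nvenc.nvEncDestroyEncoder(encoder);
        cuCtxDestroy(ctx);
        return 0;
    }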

Last edited by hajj_3; 22nd March 2012 at 22:45.
hajj_3 is offline   Reply With Quote
Old 23rd March 2012, 18:45   #7  |  Link
Ranguvar
Registered User
 
Join Date: Feb 2007
Location: ::1
Posts: 1,236
This is more interesting than most GPU encoders, because it's an actual hardware chip on the GTX 680 dedicated to H.264 encoding.
Ranguvar is offline   Reply With Quote
Old 24th March 2012, 10:41   #8  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Quote:
Originally Posted by Ranguvar View Post
This is more interesting than most GPU encoders, because it's an actual hardware chip on the GTX 680 dedicated to H.264 encoding.
Intel also said this; still, some stuff (ME) is being done on the EUs. I see those chips more as hybrids, though we'll see if NVIDIA's is also a hybrid or a real independent ASIC.

http://www.tomshardware.com/reviews/...k,3161-16.html

Also, this is most interesting: in this MPEG-2 test the old GTX 580's CUDA cores could beat the GTX 680's encoder. Very strange. This makes me believe the MPEG-2 CUDA fix was kicking in here, which was introduced to work around a design bug in the VP4 decoder so that MPEG-2 decoding gets relayed to the CUDA cores instead of the DSP. Tom's Hardware doesn't know that, so NVIDIA must have forgotten to disable it for the VP5 decoder in the GTX 680.

Though most impressive for me personally is this: http://www.guru3d.com/article/geforce-gtx-680-review/4 and I'll wait for a version with one power connector to replace my two-connector GTX 460.
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 24th March 2012 at 11:41.
CruNcher is offline   Reply With Quote
Old 4th April 2012, 01:49   #9  |  Link
Ranguvar
Registered User
 
Join Date: Feb 2007
Location: ::1
Posts: 1,236
Quote:
Originally Posted by CruNcher View Post
Also, this is most interesting: in this MPEG-2 test the old GTX 580's CUDA cores could beat the GTX 680's encoder. Very strange. This makes me believe the MPEG-2 CUDA fix was kicking in here, which was introduced to work around a design bug in the VP4 decoder so that MPEG-2 decoding gets relayed to the CUDA cores instead of the DSP. Tom's Hardware doesn't know that, so NVIDIA must have forgotten to disable it for the VP5 decoder in the GTX 680.
It's possible, but remember that in general compute, the GTX 680 is worse than the GTX 580.

Quote:
Originally Posted by CruNcher View Post
Though most impressive for me personally is this: http://www.guru3d.com/article/geforce-gtx-680-review/4 and I'll wait for a version with one power connector to replace my two-connector GTX 460.
Just ordered a GTX 680 myself, also upgrading from a 460. This will be fun.
Ranguvar is offline   Reply With Quote
Old 23rd March 2012, 18:52   #10  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Speed and quality can each be the fixed variable in comparisons. So one interesting question is "how does speed compare when x264 is run at settings that produce equivalent quality?"

HW can get interesting when it's faster than even the lowest-quality software settings. It can also be interesting where it doesn't use resources that could be spent on other bottlenecks, like source decode or preprocessing.

Also, fewer joules per minute or lower TCO per minute can matter for high-volume facilities, as the sketch below illustrates.
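
To make the joules-per-minute point concrete, here is a tiny sketch; the wattage and speed figures in it are made-up placeholders for illustration, not measurements of any real encoder:

    #include <stdio.h>

    /* Energy spent per minute of source video for an encoder that draws
       `watts` of power while running at `speed` times real time. */
    static double joules_per_source_minute(double watts, double speed)
    {
        return watts * 60.0 / speed;   /* J = W * s */
    }

    int main(void)
    {
        /* Hypothetical numbers, purely for illustration. */
        printf("SW encoder, 95 W at 1x: %.0f J/min\n", joules_per_source_minute(95.0, 1.0));
        printf("HW encoder, 15 W at 8x: %.0f J/min\n", joules_per_source_minute(15.0, 8.0));
        return 0;
    }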

I'm joining the Amazon.com video team as an encoding quality and workflow guru starting Monday, so these questions have rather been on my mind ...
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 24th March 2012, 15:48   #11  |  Link
kieranrk
Registered User
 
Join Date: Jun 2009
Location: London, United Kingdom
Posts: 707
Quote:
Originally Posted by benwaggoner View Post
I'm joining the Amazon.com video team as an encoding quality and workflow guru starting Monday, so these questions have rather been on my mind ...
Good luck with your new job. It sounds very interesting.
kieranrk is offline   Reply With Quote
Old 23rd March 2012, 19:44   #12  |  Link
zn
Registered User
 
Join Date: Jan 2009
Posts: 88
Guru3D: Geforce GTX 680 review: NVENC

Not interesting, but there will be more articles about NVENC later.
zn is offline   Reply With Quote
Old 23rd March 2012, 19:48   #13  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 1,120
Yeah, having dedicated hardware in the GPU is a nice addition; I hope the x264 devs add support for hardware encoding at some point in the near future. It would be nice if Google hired some coders to add this support; I'm sure they would save money on encoding their YouTube videos if they had low-end GPU hardware encoding some of it.
hajj_3 is offline   Reply With Quote
Old 23rd March 2012, 21:22   #14  |  Link
Didée
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 5,389
We'll see in time how it fares. When the hero(ine) is scrambling through some dark dungeon and you can't recognize anything because everything is just floating mush, then speed & power savings don't count for much.
__________________
- We´re at the beginning of the end of mankind´s childhood -

My little flickr gallery. (Yes indeed, I do have hobbies other than digital video!)
Didée is offline   Reply With Quote
Old 23rd March 2012, 23:56   #15  |  Link
zn
Registered User
 
Join Date: Jan 2009
Posts: 88
If someone who bought a GTX 680 wants to do an NVENC test and doesn't know where to get the software, send me a PM.
zn is offline   Reply With Quote
Old 24th March 2012, 11:25   #16  |  Link
zn
Registered User
 
Join Date: Jan 2009
Posts: 88
Laptop models also have this feature.

Additional H.264 silicon on laptops looks strange.
zn is offline   Reply With Quote
Old 24th March 2012, 11:38   #17  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 1,120
Many of the 600 series mobile GPUs are rebranded 500 series Fermi GPUs, BTW; only a few will be Kepler. Newer models with Kepler will come later: http://fudzilla.com/home/item/26501-...re-still-fermi

Not sure about the desktop GPUs; wouldn't mind knowing, though.
hajj_3 is offline   Reply With Quote
Old 24th March 2012, 11:48   #18  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
zn, you can have four H.264 decoders on a desktop/laptop internally these days if you want, and the same for encoders:

1 via internal/external solutions like a Broadcom decoder (or a Spurs encoding/decoding card)
1 via the CPU's integrated GPU (Intel/AMD)
1 via the discrete GFX (AMD/NVIDIA) :P
1 solely on the CPU (though you have to understand when and why the other stuff needs CPU time, to utilize it all together efficiently)

Plenty of crazy stuff is doable with this power, and on NVIDIA, as they claim, you can even use the CUDA cores and the DSP simultaneously, or, if you are clever enough, pair the encoders (multithreaded).

It's crazy to think about which workflows this makes possible even on consumer systems (in real time); some of those I have already been evaluating for some time now on SB.

You can mix them together in very different ways, though it's not easy to keep track of the impacts, balance everything out efficiently, and integrate it into a real-time workflow (you also have to take the OS into account); a rough sketch of one way to pair two encoders follows below.
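
Purely as an illustration of what "pairing the encoders" could mean in practice (a sketch under assumptions: encode_on_nvenc and encode_on_cuda are hypothetical stand-ins, not real API calls), one simple scheme is to split the clip into two segments and let each encoder work on its own half in parallel:

    #include <pthread.h>
    #include <stdio.h>

    #define NUM_FRAMES 600   /* e.g. 20 s of 30 fps video */

    /* Hypothetical placeholders: in a real tool these would submit work to
       the NVENC unit and to a CUDA-core based encoder respectively. */
    static void encode_on_nvenc(int frame) { (void)frame; }
    static void encode_on_cuda(int frame)  { (void)frame; }

    /* Each encoder gets its own contiguous segment (starting at an IDR frame),
       so the two outputs can simply be concatenated afterwards. */
    struct segment { int first, last; void (*encode)(int); };

    static void *worker(void *arg)
    {
        struct segment *s = arg;
        for (int f = s->first; f < s->last; f++)
            s->encode(f);
        return NULL;
    }

    int main(void)
    {
        struct segment segs[2] = {
            { 0,              NUM_FRAMES / 2, encode_on_nvenc },
            { NUM_FRAMES / 2, NUM_FRAMES,     encode_on_cuda  },
        };
        pthread_t t[2];
        for (int i = 0; i < 2; i++)
            pthread_create(&t[i], NULL, worker, &segs[i]);
        for (int i = 0; i < 2; i++)
            pthread_join(t[i], NULL);
        printf("both segments encoded in parallel\n");
        return 0;
    }

Whether this actually wins anything depends on the balance point the post describes: how much CPU, PCIe bandwidth and GPU time each path steals from the other.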
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 24th March 2012 at 16:04.
CruNcher is offline   Reply With Quote
Old 24th March 2012, 16:01   #19  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
@benwaggoner
Oh, you left / are leaving Microsoft?
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004
CruNcher is offline   Reply With Quote
Old 24th March 2012, 21:17   #20  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 1,120
Maybe it is because Silverlight is no more; I can't see them making a Silverlight 6, given that HTML5 is adding so many of the features that are in Silverlight/Flash.
hajj_3 is offline   Reply With Quote