11th January 2019, 15:20   #16558
Balthazar2k4
Quote:
Originally Posted by Atak_Snajpera
I'm not interested in hardware encoding at all because hardware encoders are poor at retaining fine detail at low bitrates. Another problem: hardware encoding will also be less useful in distributed encoding mode, where most laptops/PCs use non-NVIDIA GPUs.

Summary
  1. I do not have an NVIDIA GPU, and my next GPU will 100% be AMD Navi.
  2. Hardware encoders produce a noticeably blurrier image at lower bitrates.
  3. Many machines in DE mode could not be used for encoding due to incompatible GPUs.
To back Atak's point, I have tried GPU encoding for several years. The quality does improve each year, but the overall image is still nowhere near what traditional CPU encoding can do: the picture is often softer, with more compression artifacts and a larger file size. The only win is speed. I can encode 1080p at over 200 fps on my 2080 Ti, which in my opinion makes distributed encoding unnecessary, but the result is so underwhelming that I can't justify it. Also, as soon as you throw any additional filters into the mix, such as denoising or tonemapping, you lose the speed advantage anyway. I'll stick with distributed encoding for now.
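For anyone who wants to check the quality gap themselves, here is a minimal sketch (not from this thread; the file names, bitrate, and use of ffmpeg are my own assumptions) that encodes the same clip with x264 and NVENC at an identical target bitrate and scores both against the source with VMAF. It assumes an ffmpeg build with libx264, h264_nvenc, and libvmaf enabled.

Code:
# Sketch only: compare a CPU (libx264) and an NVENC encode of the same source
# at the same low bitrate, then score each against the original with VMAF.
# Requires ffmpeg built with libx264, h264_nvenc, and libvmaf.
import subprocess

SOURCE = "source_1080p.mkv"   # hypothetical input clip
BITRATE = "2500k"             # deliberately low to expose detail loss

def encode(codec: str, preset: str, out_path: str) -> None:
    """Single-pass, target-bitrate encode, video only (audio stripped with -an)."""
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE, "-an",
        "-c:v", codec, "-preset", preset, "-b:v", BITRATE,
        out_path,
    ], check=True)

def vmaf(distorted: str, reference: str) -> None:
    """Print a VMAF score; the distorted clip is the first input to libvmaf."""
    subprocess.run([
        "ffmpeg", "-i", distorted, "-i", reference,
        "-lavfi", "libvmaf", "-f", "null", "-",
    ], check=True)

encode("libx264", "slow", "cpu_x264.mkv")      # software/CPU encode
encode("h264_nvenc", "slow", "gpu_nvenc.mkv")  # NVENC hardware encode
vmaf("cpu_x264.mkv", SOURCE)
vmaf("gpu_nvenc.mkv", SOURCE)

At a bitrate this low for 1080p, the softer image from the hardware encoder is usually visible even without looking at the metric.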