30th December 2016, 03:40   #3
KahnDigifer
Digital Devil
Join Date: Oct 2016
Posts: 4
Thanks for the response. I'm away from the PC I've been using, so I won't be able to test for a few days, but I'll try out your suggestion.

I was thinking of doing the HEVC encoding with my GTX 1070 GPU. Is the quality difference between a software and a hardware encoder a set rule, or can the GPU-encoded file reach similar quality if the bitrate is set higher, say 30 Mbps as opposed to 25 Mbps for the CPU-encoded file?
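
To give an idea of the comparison I have in mind, something like the rough sketch below is what I'd run (just a Python wrapper around ffmpeg; the filenames, presets and bitrates are placeholders, and it assumes an ffmpeg build with both hevc_nvenc and libx265 available):

Code:
import subprocess

SOURCE = "clip_4k.mp4"  # placeholder input file

# GPU encode on the GTX 1070 (NVENC HEVC) at the higher bitrate
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "hevc_nvenc", "-preset", "slow", "-b:v", "30M",
    "-c:a", "copy", "clip_nvenc_30M.mp4",
], check=True)

# CPU encode (x265) at the lower bitrate for comparison
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx265", "-preset", "medium", "-b:v", "25M",
    "-c:a", "copy", "clip_x265_25M.mp4",
], check=True)

Then I'd just compare the two outputs side by side on the same clip and see whether the extra 5 Mbps closes the gap.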

These are videos I've shot and will make downloadable online. The reason I'm considering the GPU rather than the CPU for the HEVC encoding is that there's a large number of them and they're all in 4K. Although my desktop has premium thermal paste and a 240 mm Corsair H100i cooler, I'm worried that that much CPU encoding will fry my processor.
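
For the batch itself, the plan would be something along these lines (again just a sketch, not a finished script; the folder paths and settings are made up):

Code:
import subprocess
from pathlib import Path

SOURCE_DIR = Path("D:/footage_4k")    # placeholder folder with the source clips
OUT_DIR = Path("D:/footage_hevc")     # placeholder output folder
OUT_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mp4")):
    out_file = OUT_DIR / (clip.stem + "_hevc.mp4")
    # NVENC does the heavy lifting on the GPU, so the CPU stays mostly idle
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-c:v", "hevc_nvenc", "-preset", "slow", "-b:v", "30M",
        "-c:a", "copy", str(out_file),
    ], check=True)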