29th January 2021, 01:23
benwaggoner
Moderator
Quote:
Originally Posted by YaBoyShredderson
I'm generally annoyed by the variability, as it can vary wildly and often unpredictably, and some files end up larger than the source. Rather than wasting time encoding something several times to try and get the file size down, it would be easier to just set the average bitrate.
The simplest thing to do would be to set --crf, --bitrate, and --vbv-maxrate together. In a single pass that would keep the average lower, although 1-pass VBR won't give optimal bit distribution. You could also do the first pass as --crf but save a .stats file and potentially an analysis-save. For files that aren't too large, call it a day. For files that are too large, do a second pass targeting an ABR. Reusing the stats and particularly the analysis data should make a second rate-controlled pass quite a bit faster than the first.
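As a rough, untested sketch of what those command lines might look like (file names, the CRF value, the bitrate, and the VBV numbers are placeholders, and analysis-reuse options are spelled differently in some older x265 builds):

Capped CRF in a single pass:
    x265 input.y4m --crf 19 --vbv-maxrate 8000 --vbv-bufsize 16000 -o capped.hevc

CRF first pass that also writes stats and analysis data:
    x265 input.y4m --crf 19 --pass 1 --stats x265.stats --analysis-save analysis.dat -o first.hevc

Rate-controlled second pass, reusing both:
    x265 input.y4m --bitrate 5000 --pass 2 --stats x265.stats --analysis-load analysis.dat -o final.hevc

Only the files whose CRF pass comes out over budget need the second command; the rest can keep the first-pass output.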

I tried all of this some years ago and it worked quite nicely, but there are probably specifics I am forgetting.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book