27th October 2015, 19:35   #5
vivan
/人 ◕ ‿‿ ◕ 人\
 
Join Date: May 2011
Location: Russia
Posts: 643
Quote:
Originally Posted by rosh
which gave me about 74% compression.
This is all lossy compression. Using the HEVC encoder (x265, default parameters).
The problem is that this tells you nothing.
Imagine compressing a picture with JPEG: you compressed it to 10 kB. Does that tell you anything?

Quote:
Originally Posted by rosh
This article says that standard H.264 can achieve good quality video at 100:1 compression for normal video sequences. I've tried many test sequences mainly of fast motion video, and they too were around 98%-99%.
1) Everyone has a different definition of "good" quality. For me it's x264's CRF 16-17, but there are people who are fine with x265's default CRF (28), which I find pretty bad.
(And even though CRF is a much better metric than bitrate, or even PSNR/SSIM, it's still far from "constant quality".)
2) With modern codecs bitrate doesn't mean much - quality also depends on the video itself and the encoder used (x264 is the best H.264 encoder). The difference between high- and low-complexity videos exceeds 10x, and the difference between encoders and their settings can add another 2-3x on top of that...
100:1 compression works out to 0.12 bpp (bits per pixel), since raw 8-bit 4:2:0 video is 12 bpp - see the sketch below the list. For normal, real-life content that's usually on the low side (at good x264 settings). For complex video (like crowd_run) it's very low; for simple video (like old_town_cross) it could be quite good.
3) It also depends on framerate and resolution - 4K@60fps requires less bpp than SD@24fps to look good.
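
To put the bpp numbers above in context, here's a quick back-of-the-envelope sketch (Python, my own example - the 5 Mbps bitrate and the resolutions are just illustrative, assuming an 8-bit 4:2:0 source, which is 12 bits per pixel uncompressed):
Code:
# bits per pixel actually spent by an encode
def bpp(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

# bpp left after compressing raw 8-bit 4:2:0 video (12 bpp) by ratio:1
def bpp_at_ratio(ratio, raw_bpp=12):
    return raw_bpp / ratio

print(bpp_at_ratio(100))               # 0.12 bpp at 100:1
print(bpp(5_000_000, 1920, 1080, 24))  # ~0.10 bpp for a 5 Mbps 1080p24 encode
print(bpp(5_000_000, 3840, 2160, 60))  # ~0.01 bpp for the same 5 Mbps at 4K@60fps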

Quote:
Originally Posted by rosh
I guess it also depends on the level of entropy - maybe /dev/urandom isn't purely random to give low compression.
Well, yes, it depends - there are ways to make noise more compressible (colorless, static, etc.), though I don't believe that noise generated with a bad RNG will be any more compressible than noise from a good one.
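
Here's a rough illustration of that last point (my own sketch, using a general-purpose compressor rather than a video encoder, just to show the principle): fresh output from any decent RNG is basically incompressible, while "static" noise - the same noise repeated frame after frame - compresses very well. What matters is the structure of the noise, not the quality of the RNG behind it.
Code:
import os
import zlib

frame = os.urandom(10_000)      # one 10 kB "frame" of noise
fresh = os.urandom(1_000_000)   # 100 frames of independent noise
static = frame * 100            # the same noise frame repeated 100 times

# ratio ~1.0: random data doesn't compress at all
print(len(zlib.compress(fresh, 9)) / len(fresh))
# tiny fraction of 1.0: repeated ("static") noise compresses a lot
print(len(zlib.compress(static, 9)) / len(static))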