28th December 2017, 09:38   #1
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
x265 - Always choose 10bit?

A month ago I was told to enable 10-bit mode, without even being asked whether my input footage was 8-bit.

Is there a reason to encode in 10bit even if the input footage is 8bit?
28th December 2017, 10:54   #2
microchip8
ffx264/ffhevc author
 
 
Join Date: May 2007
Location: /dev/video0
Posts: 1,843
You'd get much less banding in your encode in 10-bit; x265 still suffers too much from banding when using 8-bit. It doesn't matter much what bit depth your input has (8 or 10).

Do note that a lot of gear still doesn't support 10-bit yet, computers excluded. If you have a HW decoder that can deal with 10-bit, displaying it on an 8-bit display (like a TV) is not an issue. If your TV is 10-bit, even better.
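
To make that concrete: a minimal sketch of forcing a 10-bit encode from an 8-bit source, assuming an ffmpeg built with libx265 is on PATH (filenames are placeholders):

Code:
import subprocess

# Sketch: 8-bit source in, 10-bit HEVC out, via ffmpeg/libx265.
# Assumes ffmpeg was built with libx265; "input.mkv" is a placeholder.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx265",
    "-preset", "medium",
    "-crf", "21",
    "-pix_fmt", "yuv420p10le",  # request 10-bit encoding regardless of source depth
    "-c:a", "copy",             # leave the audio untouched
    "output_10bit.mkv",
], check=True)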
29th December 2017, 19:17   #3
Selur
Registered User
 
 
Join Date: Oct 2001
Location: Germany
Posts: 7,259
Quote:
Is there a reason to encode in 10bit even if the input footage is 8bit?
Pro:
1. better compression
2. no banding introduced by the compression, thanks to the higher internal precision
Con:
1. higher encoding complexity -> potentially more CPU usage
2. higher decoding complexity -> potentially more CPU usage
3. not all decoders support it

Personally, I only encode in 10-bit.
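
For reference, the standalone x265 CLI does that with the --output-depth switch; a minimal sketch, assuming a 10-bit-capable x265 build on PATH and a placeholder Y4M input:

Code:
import subprocess

# Sketch: 10-bit output from the standalone x265 CLI, whatever the source depth.
subprocess.run([
    "x265", "--input", "input.y4m",
    "--preset", "medium",
    "--crf", "21",
    "--output-depth", "10",        # encode at 10 bits even from an 8-bit source
    "--output", "out_10bit.hevc",
], check=True)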

Cu Selur
30th December 2017, 22:59   #4
GZZ
Registered User
 
Join Date: Jan 2002
Posts: 581
I did a test on an 8-bit TV episode encoded with x265 10-bit (preset medium, CRF 19) and compared it to an 8-bit encode with the same preset and CRF. I couldn't see any difference, but the file came out 200 MB bigger for the 10-bit encode. Is there a video (YouTube?) that shows the difference between a 10-bit and an 8-bit encode of the same 8-bit source?
31st December 2017, 03:04   #5
RanmaCanada
Registered User
 
Join Date: May 2009
Posts: 328
There are actually some video blogs that will point you in the direction you're looking; a quick Google for something like "benefits of 10 bit x265 encoding" will turn them up. What you're looking for only really shows in frame-by-frame comparisons, in areas that would normally have lots of dithering, like clouds, leaves and backgrounds: detail that would easily be lost. The 10-bit profile makes the result less objectionable when dealing with gradients.
31st December 2017, 10:03   #6
Sparktank
47.952fps@71.928Hz
 
 
Join Date: Mar 2011
Posts: 940
Quote:
Originally Posted by GZZ View Post
I did a test on an 8-bit TV episode encoded with x265 10-bit (preset medium, CRF 19) and compared it to an 8-bit encode with the same preset and CRF. I couldn't see any difference, but the file came out 200 MB bigger for the 10-bit encode. Is there a video (YouTube?) that shows the difference between a 10-bit and an 8-bit encode of the same 8-bit source?
You're better off using 2-pass with equal bitrate/settings for comparing.

CRF values don't map 1:1 between bit depths: for comparable quality, where you'd use CRF 19 in 8-bit you can go CRF 21-23 in 10-bit.
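
A minimal sketch of such a comparison, assuming the x265 CLI on PATH, a placeholder Y4M input and a hypothetical 3000 kbps target (on Windows, substitute NUL for /dev/null):

Code:
import subprocess

# Sketch: two 2-pass encodes at the same bitrate, differing only in output depth.
def two_pass(depth: int, out: str) -> None:
    base = ["x265", "--input", "input.y4m",
            "--preset", "medium",
            "--bitrate", "3000",                 # identical target for both encodes
            "--output-depth", str(depth),
            "--stats", f"stats_{depth}bit.log"]  # keep the pass-1 stats separate
    subprocess.run(base + ["--pass", "1", "--output", "/dev/null"], check=True)
    subprocess.run(base + ["--pass", "2", "--output", out], check=True)

two_pass(8, "out_8bit.hevc")
two_pass(10, "out_10bit.hevc")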
31st December 2017, 16:12   #7
GZZ
Registered User
 
Join Date: Jan 2002
Posts: 581
Quote:
Originally Posted by Sparktank View Post
You're better off using 2-pass with equal bitrate/settings for comparing.

CRF values don't map 1:1 between bit depths: for comparable quality, where you'd use CRF 19 in 8-bit you can go CRF 21-23 in 10-bit.
That's the part I can't get into my head. So if I encode an 8-bit movie in 10-bit, I can use a higher CRF and still keep the same quality as if I had encoded it in 8-bit at CRF 19?

How can that be?
31st December 2017, 20:06   #8
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Why does 10-bit save bandwidth (even when content is 8-bit)? In short: the encoder's internal predictions run at higher precision, so less rounding error accumulates and fewer bits are spent coding that error.
31st December 2017, 22:33   #9
Motenai Yoda
Registered User
 
 
Join Date: Jan 2010
Posts: 709
Quote:
Originally Posted by GZZ View Post
That's the part I can't get into my head. So if I encode an 8-bit movie in 10-bit, I can use a higher CRF and still keep the same quality as if I had encoded it in 8-bit at CRF 19?

How can that be?
10-bit improves compression efficiency by roughly 5-7%.
Usually in x264, 10-bit can avoid banding; in x265 it's a bit different: better quality, but not banding-free.
Always compare 2-pass vs 2-pass at the same bitrate.
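
Once you have two same-bitrate encodes, you can score each against the source objectively; a sketch using ffmpeg's ssim filter (psnr works the same way), with ffmpeg on PATH and placeholder filenames:

Code:
import subprocess

# Sketch: score each encode against the source with ffmpeg's ssim filter.
# The SSIM summary is printed to the log when the filter finishes.
for encode in ("out_8bit.hevc", "out_10bit.hevc"):
    subprocess.run([
        "ffmpeg",
        "-i", encode,       # distorted encode (main input)
        "-i", "input.y4m",  # original source (reference input)
        "-lavfi", "ssim",
        "-f", "null", "-",  # decode and compare only; write no output file
    ], check=True)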
1st January 2018, 01:13   #10
WhatZit
Registered User
 
Join Date: Aug 2016
Posts: 60
Quote:
Originally Posted by Neillithan View Post
Is there a reason to encode in 10bit even if the input footage is 8bit?
Ultimately, your choice of encode methodology depends entirely on how you intend to display those encodes. If you're stuck supporting 8-bit only hardware, then you're stuck with 8-bit encoding. Can't avoid it.

However, if you only ever intend to have your encodes played on new 10-bit capable hardware, then you should unequivocally encode in 10-bit at all times, for all of the reasons everyone has mentioned.

The tangible benefits of 10-bit encoding will exist forever, but the ephemeral problems of compatibility will vanish once hardware that supports 10-bit formats becomes commonplace. Just look at the spec sheets for existing 2017 models, and upcoming 2018 models.

So, simple choice: are you held hostage by ageing hardware, or are you free to explore the best possible options?
1st January 2018, 07:44   #11
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Sadly 10-bit H.264 will probably never be supported by decoding hardware.

Happily most mobile devices will probably be able to do software decoding of 10-bit H.264, albeit while using a lot more power.
1st January 2018, 17:43   #12
GZZ
Registered User
 
Join Date: Jan 2002
Posts: 581
Quote:
Originally Posted by Asmodian View Post
Sadly 10-bit H.264 will probably never be supported by decoding hardware.

Happily most mobile devices will probably be able to do software decoding of 10-bit H.264, albeit while using a lot more power.
Why use H.264 when H.265 is out? H.265 10-bit is supported by many devices.
1st January 2018, 18:05   #13
Sparktank
47.952fps@71.928Hz
 
 
Join Date: Mar 2011
Posts: 940
Quote:
Originally Posted by GZZ View Post
Why use H.264 when H.265 is out? H.265 10-bit is supported by many devices.
It is, but support still needs more work.

x264 is still getting updates, but for archival purposes you can just use a build a few revisions older than the latest and not look back.

Dollars to donuts, if the end result isn't going to have HDR, I'd rather go with x264.

It'll be some time before I upgrade any piece of hardware to natively decode HEVC without framedrops.

I'm not much for standalone hardware or media players, so I use my PC more often.
And I have yet to upgrade my graphics card to support 4K HDR movies. Then I'd still have to upgrade my CPU to handle all that, but if I'm going to upgrade my CPU to one of the newest generations out there, I'd have to upgrade my motherboard; and if I'm upgrading the motherboard, I might as well get one that supports better RAM, plus a new case for it. And maybe look into water cooling.

For my personal case? No HDR? Then go x264 Hi10P.
Bluray to Hi10P is sufficient until I up my game and start working on the 4K BDs, by which time, hopefully, x265 will be on par.

And, true, it's improved a lot compared to a few years ago.
But, alas, I have miles and miles to go before I sleep.
2nd January 2018, 20:14   #14
birdie
Artem S. Tashkinov
 
 
Join Date: Dec 2006
Posts: 337
I've had quite a bad experience re-encoding H.264 sources to HEVC with 10-bit encoding enabled: zero quality improvement, while file sizes come out significantly larger (at least 5%).
2nd January 2018, 22:33   #15
jd17
Registered User
 
Join Date: Jun 2017
Posts: 89
Sounds like unrealistic expectations to me...

An encode cannot be better than its source.
It might look more pleasing in some regard (e.g. gradations smoothed by dithering), but it will never be better.

If you are getting large file sizes, you are...
a) encoding a source that is already (highly) compressed, or
b) using the wrong settings / targets, or
c) expecting too much.
2nd January 2018, 22:40   #16
microchip8
ffx264/ffhevc author
 
 
Join Date: May 2007
Location: /dev/video0
Posts: 1,843
Quote:
Originally Posted by birdie View Post
I've had quite a bad experience re-encoding H.264 sources to HEVC with 10-bit encoding enabled: zero quality improvement, while file sizes come out significantly larger (at least 5%).
You're doing something wrong. I get almost half the size when using x265 10-bit @ CRF 21 compared to 8-bit x264 @ CRF 18, and I'm hard-pressed to notice any difference in quality between the two.

Post your settings, if you will.
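
For reference, a sketch of that kind of comparison via ffmpeg (assumed built with libx264 and libx265; filenames are placeholders):

Code:
import subprocess

# Sketch: the two encodes being compared, made from the same source.
subprocess.run(["ffmpeg", "-i", "source.mkv", "-c:v", "libx264",
                "-preset", "medium", "-crf", "18",
                "-an", "x264_8bit_crf18.mkv"], check=True)
subprocess.run(["ffmpeg", "-i", "source.mkv", "-c:v", "libx265",
                "-preset", "medium", "-crf", "21",
                "-pix_fmt", "yuv420p10le",  # 10-bit HEVC output
                "-an", "x265_10bit_crf21.mkv"], check=True)
# Compare the resulting file sizes, then inspect both frame by frame.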
2nd January 2018, 23:29   #17
Nico8583
Registered User
 
Join Date: Jan 2010
Location: France
Posts: 851
Quote:
Originally Posted by microchip8 View Post
You're doing something wrong. I get almost half the size when using x265 10-bit @ CRF 21 compared to 8-bit x264 @ CRF 18, and I'm hard-pressed to notice any difference in quality between the two.

Post your settings, if you will.
Are you comparing "x264 8-bit @ CRF 18 re-encoded to x265 10-bit @ CRF 21", or "H.264 source encoded to x264 8-bit @ CRF 18 vs. the same H.264 source encoded to x265 10-bit @ CRF 21"?
2nd January 2018, 23:34   #18
microchip8
ffx264/ffhevc author
 
 
Join Date: May 2007
Location: /dev/video0
Posts: 1,843
Quote:
Originally Posted by Nico8583 View Post
Are you comparing "x264 8-bit @ CRF 18 re-encoded to x265 10-bit @ CRF 21", or "H.264 source encoded to x264 8-bit @ CRF 18 vs. the same H.264 source encoded to x265 10-bit @ CRF 21"?
the latter
3rd January 2018, 02:57   #19
Motenai Yoda
Registered User
 
 
Join Date: Jan 2010
Posts: 709
Quote:
Originally Posted by Sparktank View Post
It'll be some time before I upgrade any piece of hardware to natively decode HEVC without framedrops.
Your CPU should be enough to software-decode HEVC without framedrops.
Moreover, H.264 10-bit hardware support doesn't exist, and luckily it never will.
3rd January 2018, 03:16   #20
Sparktank
47.952fps@71.928Hz
 
 
Join Date: Mar 2011
Posts: 940
Quote:
Originally Posted by Motenai Yoda View Post
Your CPU should be enough to software-decode HEVC without framedrops.
Moreover, H.264 10-bit hardware support doesn't exist, and luckily it never will.
Worth checking out before diving into the rabbit hole.

I should start with the UHD drive first.