Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 25th February 2016, 00:42   #36461  |  Link
XMonarchY
Guest
 
Posts: n/a
What does
Quote:
* added support for native 10bit 4:2:0 DXVA decoding (+ scaling)
mean? I recall I could play 10bit 4:2:0 videos with "DXVA2 Copy-Back" enabled in LAV Filters and it was just fine...

"Processing done by GPU Video Logic" - is this what madshi meant?

BTW, LAV Filters does show hardware support for HEVC on my GTX 980 through DXVA2 Copy-Back acceleration. I guess it's only for 8-bit content then. If 8-bit is enough, why would HEVC rips start using 10-bit anyway? Using 10-bit would increase file size, wouldn't it?

Is NVidia CUVID still an inferior option to DXVA2 Copy-Back? Does it still produce artifacts after all the driver updates, LAV updates, and madVR updates?

Last edited by XMonarchY; 25th February 2016 at 00:45.
  Reply With Quote
Old 25th February 2016, 00:53   #36462  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by XMonarchY View Post
What does

mean? I recall I could play 10bit 4:2:0 videos with "DXVA2 Copy-Back" enabled in LAV Filters and it was just fine...

"Processing done by GPU Video Logic" - is this what madshi meant?

BTW, LAV Filters does show hardware support for HEVC on my GTX 980 through DXVA2 Copy-Back acceleration. I guess its only for 8bit content then. If 8bit is enough, why would HEVC rips start using 10bit anyway? Using 10bit would increase file size, wouldn't it?

Is NVidia CUVID still an inferior option to DXVA2 Copy-Back? Does it still produce artifacts after all the driver updates, LAV updates, and madVR updates?
You need at least 10bits for the wider gamut and HDR. In fact, 12bits is the minimum according to Dolby (Vision).
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 25th February 2016, 00:58   #36463  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Manni View Post
You need at least 10bits for the wider gamut and HDR. In fact, 12bits is the minimum according to Dolby (Vision).
No, I'm pretty sure all of the sources are 10-bit. I have seen the official UHD Blu-ray specs. 12-bit sources are listed as a (possible) future upgrade but are not part of the current standard.
Warner306 is offline   Reply With Quote
Old 25th February 2016, 01:00   #36464  |  Link
clsid
*****
 
Join Date: Feb 2005
Posts: 5,643
That changelog entry only concerns the "native" variant of DXVA2. That requires special handling from the video renderer. Copy-back acts as a normal decoder as far as renderers (and other downstream filters) are concerned.

Your GTX 980 provides partial acceleration. That means it does not have a dedicated decoder unit for HEVC, but uses generic GPU and CPU resources to do the decoding. That is much less efficient and slower compared to full acceleration.

10bit does not necessarily make things bigger. In fact, it can sometimes even provide similar quality at lower size.

CUVID is inferior due to broken NVIDIA drivers. LAV and madVR are NOT at fault.
__________________
MPC-HC 2.1.7.2
clsid is offline   Reply With Quote
Old 25th February 2016, 01:00   #36465  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by Manni View Post
You need at least 10bits for the wider gamut and HDR. In fact, 12bits is the minimum according to Dolby (Vision).
I have neither HDR nor 3D and only normal sRGB/Rec.709 colorspace, but my TV has 12bit output. I guess it makes no difference for anyone without HDR and wide colorspace? I thought no 4K TV could reproduce UHD colorspace yet...
  Reply With Quote
Old 25th February 2016, 01:02   #36466  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by XMonarchY View Post
What does

mean? I recall I could play 10bit 4:2:0 videos with "DXVA2 Copy-Back" enabled in LAV Filters and it was just fine...

"Processing done by GPU Video Logic" - is this what madshi meant?

BTW, LAV Filters does show hardware support for HEVC on my GTX 980 through DXVA2 Copy-Back acceleration. I guess its only for 8bit content then. If 8bit is enough, why would HEVC rips start using 10bit anyway? Using 10bit would increase file size, wouldn't it?

Is NVidia CUVID still an inferior option to DXVA2 Copy-Back? Does it still produce artifacts after all the driver updates, LAV updates, and madVR updates?
I don't think you are getting actual fixed-function hardware decoding but hybrid decoding instead, where part of the GPU is used to decode the HEVC content. But this impacts the performance of the GPU and will not do much of anything to improve the speed of HEVC decoding.

The only fixed-function HEVC decoders are found in the new Intel iGPUs (which are 8-bit) and the Nvidia GTX 950/960, which are 10-bit.
Warner306 is offline   Reply With Quote
Old 25th February 2016, 01:11   #36467  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Quote:
Originally Posted by XMonarchY View Post
I have neither HDR nor 3D and only normal sRGB/Rec.709 colorspace, but my TV has 12bit output. I guess it makes no difference for anyone without HDR and wide colorspace? I thought no 4K TV could reproduce UHD colorspace yet...
Even if your display does not have HDR, your PC still has to be fast enough to decode the 10-bit HEVC video. 4K can bring fast CPUs to their limits.
sneaker_ger is offline   Reply With Quote
Old 25th February 2016, 01:47   #36468  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by XMonarchY View Post
What does

Quote:
* added support for native 10bit 4:2:0 DXVA decoding (+ scaling)

mean?
it means that madVR can accept P010 from DXVA native decoding, and that it can scale these surfaces using DXVA.
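For reference, P010 uses an NV12-style plane layout but stores each 10-bit sample in the upper bits of a 16-bit word, which is what lets a renderer treat the planes like 16-bit textures. A minimal Python sketch of the packing (for illustration only, not madVR code):

```python
def pack_p010(sample10: int) -> int:
    """Place a 10-bit sample in the high 10 bits of a 16-bit P010 word."""
    assert 0 <= sample10 < 1024
    return sample10 << 6  # low 6 bits are zero padding

def unpack_p010(word16: int) -> int:
    """Recover the 10-bit sample from a 16-bit P010 word."""
    return word16 >> 6

# 940 is nominal white for 10-bit limited-range video
assert unpack_p010(pack_p010(940)) == 940
```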
Quote:
BTW, LAV Filters does show hardware support for HEVC on my GTX 980 through DXVA2 Copy-Back acceleration. I guess its only for 8bit content then. If 8bit is enough, why would HEVC rips start using 10bit anyway? Using 10bit would increase file size, wouldn't it?
this is hybrid decoding, which is pretty much a waste of time. the performance gain over CPU-only decoding is very small. 10-bit encoding lowers the needed bitrate.
Quote:
Is NVidia CUVID still an inferior option to DXVA2 Copy-Back? Does it still produce artifacts after all the driver updates, LAV updates, and madVR updates?
it's just buggy and 10 bit isn't supported.
Quote:
I have neither HDR nor 3D and only normal sRGB/Rec.709 colorspace, but my TV has 12bit output. I guess it makes no difference for anyone without HDR and wide colorspace? I thought no 4K TV could reproduce UHD colorspace yet...
it doesn't matter if they can't reproduce the color space 100%. and i wouldn't be shocked if it takes 5+ years before we reach 100% BT.2020 on an affordable display.
huhn is offline   Reply With Quote
Old 25th February 2016, 01:52   #36469  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by Warner306 View Post
The only fixed-function HEVC decoders are found in the new Intel iGPU (which are 8-bit) and the Nvidia GTX 950/960, which are 10-bit.
AMD's Fury cards and Carrizo also have HEVC fixed function 8 bit decoder afaik.
aufkrawall is offline   Reply With Quote
Old 25th February 2016, 01:54   #36470  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by aufkrawall View Post
AMD's Fury cards and Carrizo also have HEVC fixed function 8 bit decoder afaik.
that's correct. the 3xx series has this decoder.
huhn is offline   Reply With Quote
Old 25th February 2016, 02:13   #36471  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
It's a bit more complex (or confusing):
AMD 3xx series also consists of rebranded GCN GPUs (even GCN 1.0).
Afaik GCN 1.2 Tonga GPUs (r9 380 and r9 380X) have a hybrid 8 bit HEVC decoder.
Only very recent Fiji cards (Fury X, Fury, Fury Nano) and Carrizo SoC have fixed function 8 bit HEVC decoder (but no 10 bit capability either).
All other GCN GPUs don't have HEVC decoding capabilities.

Last edited by aufkrawall; 25th February 2016 at 02:19.
aufkrawall is offline   Reply With Quote
Old 25th February 2016, 02:21   #36472  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
But I don't see 8-bit HEVC being used much, if at all.
So the word, again, is wait.

Sent from my 306SH
baii is offline   Reply With Quote
Old 25th February 2016, 02:37   #36473  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Agreed, HEVC 8 bit decoding capability seems quite useless to me as well.
aufkrawall is offline   Reply With Quote
Old 25th February 2016, 08:07   #36474  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Could someone clarify what "crop black bars" is supposed to do? Should it zoom/scale the bars away or keep them visible (providing player is not set to zoom and madVR is set to lose no image content)?
Uoppi is offline   Reply With Quote
Old 25th February 2016, 08:15   #36475  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
it is there to remove hard coded black bars from videos.
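The idea can be sketched in a few lines: scan rows from the top and bottom of the luma plane and count those that stay near black. This is only a rough illustration, not madVR's actual detection logic, and the threshold of 20 is an arbitrary assumption:

```python
def detect_bars(luma, threshold=20.0):
    """luma: list of rows (lists of 0-255 values). Returns (top, bottom) bar heights."""
    def is_black(row):
        return sum(row) / len(row) < threshold

    top = 0
    while top < len(luma) and is_black(luma[top]):
        top += 1
    bottom = 0
    while bottom < len(luma) - top and is_black(luma[-1 - bottom]):
        bottom += 1
    return top, bottom

# A 2.40:1 frame letterboxed inside 16:9: 1080 rows with 138-row bars
frame = [[0] * 192 for _ in range(138)] \
      + [[128] * 192 for _ in range(804)] \
      + [[0] * 192 for _ in range(138)]
print(detect_bars(frame))  # (138, 138)
```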
huhn is offline   Reply With Quote
Old 25th February 2016, 08:16   #36476  |  Link
Sunset1982
Registered User
 
Join Date: Sep 2014
Posts: 280
Quote:
Could someone clarify what "crop black bars" is supposed to do? Should it zoom/scale the bars away or keep them visible (providing player is not set to zoom and madVR is set to lose no image content)?
and does it provide a performance boost with files that have black bars when they are cropped by madvr?
__________________
Intel i5 6600, 16 GB DDR4, AMD Vega RX56 8 GB, Windows 10 x64, Kodi DS Player 17.6, MadVR (x64), LAV Filters (x64), XySubfilter .746 (x64)
LG 4K OLED (65C8D), Denon X-4200 AVR, Dali Zensor 5.1 Set
Sunset1982 is offline   Reply With Quote
Old 25th February 2016, 08:19   #36477  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
depends.

but in most cases you have to scale the image more which could be slower.

and removing black bars without resizing the image is unreliable, so you can't really take advantage of the performance boost.
huhn is offline   Reply With Quote
Old 25th February 2016, 09:29   #36478  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by Warner306 View Post
No, I'm pretty sure all of the sources are 10-bit. I have seen the official UHD Blu-ray specs. 12-bit sources are listed as a (possible) future upgrade but are not part of the current standard.
Sorry, you are pretty wrong

UHD Blu-rays encoded with Dolby Vision (all Sony titles, Warner as well I think; Fox is the only one skipping both Dolby Vision and immersive audio on their initial releases, which is why I'm boycotting them) provide a 10-bit HDR10 layer for compatibility with non-Dolby-Vision displays, plus an optional Dolby Vision layer that adds 12-bit depth using a 1080p enhancement layer. Dolby Vision is part of the UHD Blu-ray specs; it's just optional. Only HDR10 is mandatory.

Dolby (who created PQ gamma) states that 12 bits is the minimum bit depth needed to resolve 4K/UHD in PQ gamma/WCG/HDR without any banding. With HDR10, there is still banding in the picture, especially in the low end.
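Dolby's banding claim can be sanity-checked against the SMPTE ST 2084 (PQ) EOTF: at the same signal level, adjacent 10-bit codes are roughly four times further apart in luminance than adjacent 12-bit codes. A quick Python sketch, assuming full-range quantization for simplicity:

```python
# SMPTE ST 2084 (PQ) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """Map a normalized PQ signal in [0, 1] to luminance in cd/m^2."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def step(code: int, bits: int) -> float:
    """Luminance difference between adjacent code values."""
    n = 2 ** bits - 1
    return pq_eotf((code + 1) / n) - pq_eotf(code / n)

# Same signal level, two bit depths: the 10-bit luminance step is ~4x larger,
# so shadow gradients are ~4x more prone to visible banding.
assert step(100, 10) > 3.5 * step(400, 12)
```

Real content is encoded limited-range, but the ratio between the step sizes comes out the same, so the comparison still holds.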

Of course Dolby could be saying this to justify Dolby Vision, which is why I said "according to Dolby".

Bottom line: no meaningful content will be available in 8 bits. UHDTV requires 10-bit Rec.709 as a minimum, and UHD Blu-ray requires 10 bits as well, WCG or not, and supports 12 bits with Dolby Vision.

Although the new JVCs don't support Dolby Vision, they offer a 12-bit path from input to panels, hopefully in preparation for Dolby Vision support. The main value of Dolby Vision at this stage is that, unlike HDR10, it makes it possible to calibrate a display accurately. Unfortunately it requires hardware support in both the source and the display, plus a paid license.

Of course someone with an 8bits, rec-709 display doesn't care about this, but in that case upscaling bluray is probably a better idea anyway.

Probably time to get back on topic, I'm sure Madshi has better things to do than reading all this in his thread...
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 25th February 2016 at 10:13.
Manni is offline   Reply With Quote
Old 25th February 2016, 09:33   #36479  |  Link
Arm3nian
Registered User
 
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Video specs are only half the story in determining whether you can decode in real time. The other half is the bitrate. Specs alone don't determine the bitrate; the compression settings also play a role. I can screen-capture uncompressed 1080p 24fps 8-bit RGB and it will have a higher bitrate than 4K Blu-ray releases. You also have to consider the software decoder implementation: how efficient is it at decoding AVC vs. HEVC? There are too many variables.
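The screen-capture comparison is easy to verify with back-of-the-envelope arithmetic (the ~100 Mbps UHD Blu-ray video ceiling below is an approximate figure, not from this thread):

```python
# Uncompressed 1080p24 8-bit RGB: 3 bytes per pixel, 24 frames per second
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 24
uncompressed_bps = width * height * bytes_per_pixel * 8 * fps
print(uncompressed_bps / 1e6)  # ~1194 Mbps, an order of magnitude above ~100 Mbps
```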

My 4.2GHz 5820k can decode the 4k 10bit 60fps hevc clip from Samsung that was posted a few pages back with no problems. It hovers at around 50% usage.

Most UHD blurays will have a bitrate similar to each other, and the standard has a maximum, so if you can decode that, then you're good to go for bluray. But that doesn't mean you can decode anything. A high bitrate file encoded at the fastest setting would most likely encourage your CPU to commit suicide.

For UHD, you don't need gpu decoding if you have a fairly modern cpu with a decent overclock. But it still would help since it's more efficient at doing it and doesn't turn your room into an oven and cause fans to ramp up.
Arm3nian is offline   Reply With Quote
Old 25th February 2016, 09:40   #36480  |  Link
Arm3nian
Registered User
 
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Quote:
Originally Posted by Manni View Post
Probably time to get back on topic, I'm sure Madshi has better things to do than reading all this in his thread...
Why is this even being discussed here. Let's go flood the lav thread instead. Onward brothers!
Arm3nian is offline   Reply With Quote