25th February 2016, 00:42 | #36461
XMonarchY
Guest
Posts: n/a
What does
Quote:
"Processing done by GPU Video Logic" - is this what madshi meant? BTW, LAV Filters does show hardware support for HEVC on my GTX 980 through DXVA2 Copy-Back acceleration. I guess it's only for 8-bit content then. If 8-bit is enough, why would HEVC rips start using 10-bit anyway? Using 10-bit would increase file size, wouldn't it?

Is Nvidia CUVID still an inferior option to DXVA2 Copy-Back? Does it still produce artifacts after all the driver updates, LAV updates, and madVR updates?
25th February 2016, 00:58 | #36463
Registered User
Join Date: Dec 2014
Posts: 1,127
No, I'm pretty sure all of the sources are 10-bit. I have seen the official UHD Blu-ray specs. 12-bit sources are listed as a (possible) future upgrade but are not part of the current standard.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players
25th February 2016, 01:00 | #36464
*****
Join Date: Feb 2005
Posts: 5,643
That changelog entry only concerns the "native" variant of DXVA2. That requires special handling from the video renderer. Copy-back acts as a normal decoder as far as renderers (and other downstream filters) are concerned.

Your GTX 980 provides partial acceleration. That means it does not have a dedicated decoder unit for HEVC, but uses generic GPU and CPU resources to do the decoding. That is much less efficient and slower compared to full acceleration.

10-bit does not necessarily make things bigger. In fact, it can sometimes even provide similar quality at lower size.

CUVID is inferior due to broken NVIDIA drivers. LAV and madVR are NOT at fault.
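To make the copy-back idea concrete, here is a minimal sketch using FFmpeg's public hwaccel API (av_hwdevice_ctx_create / av_hwframe_transfer_data). This is only an illustration of the concept, not LAV's actual implementation: the GPU decodes into a DXVA2 surface, and the frame is then transferred back into an ordinary system-memory frame, which is why downstream filters simply see a normal decoder.

Code:
// Minimal sketch of DXVA2 "copy-back" with FFmpeg's libavcodec (illustration
// only -- LAV Filters has its own decoder code; this just shows the concept).
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>
#include <libavutil/pixfmt.h>
}

// 1) Create a DXVA2 hardware device; attach it to the decoder before
//    avcodec_open2(): codec_ctx->hw_device_ctx = av_buffer_ref(hw_dev);
static AVBufferRef* create_dxva2_device()
{
    AVBufferRef* hw_dev = nullptr;
    if (av_hwdevice_ctx_create(&hw_dev, AV_HWDEVICE_TYPE_DXVA2,
                               nullptr, nullptr, 0) < 0)
        return nullptr;
    return hw_dev;
}

// 2) After avcodec_receive_frame(): if the frame lives in GPU memory,
//    copy it back so everything downstream sees a plain system-memory frame.
static int copy_back_if_needed(const AVFrame* hw_frame, AVFrame* sw_frame)
{
    if (hw_frame->format != AV_PIX_FMT_DXVA2_VLD)
        return 0; // already a system-memory frame (software decode path)
    return av_hwframe_transfer_data(sw_frame, hw_frame, 0); // the "copy-back"
}

With "native" DXVA2, by contrast, that transfer never happens and the renderer has to consume the GPU surface directly, which is the special handling mentioned above.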
__________________
MPC-HC 2.1.7.2
25th February 2016, 01:02 | #36466
Registered User
Join Date: Dec 2014
Posts: 1,127
Quote:
The only fixed-function HEVC decoders are found in the new Intel iGPUs (which are 8-bit) and the Nvidia GTX 950/960, which are 10-bit.
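For anyone curious what their own driver advertises, a rough sketch using the public D3D11 video API is below. Note the caveat: an exposed profile only means the driver claims support; it does not tell you whether the decode is fixed-function or "hybrid" (partly done on shaders/CPU). The HEVC profile GUID names are the ones from recent Windows SDK headers (Windows 8.1+), so this is an assumption about your build environment.

Code:
// Rough sketch: list the DXVA decoder profiles the driver exposes via D3D11
// and look for HEVC Main / Main10. An exposed profile does NOT distinguish
// fixed-function from "hybrid" decoding -- it only shows what is advertised.
#include <initguid.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx)))
        return 1;

    ID3D11VideoDevice* vdev = nullptr;
    if (SUCCEEDED(dev->QueryInterface(__uuidof(ID3D11VideoDevice), (void**)&vdev)))
    {
        const UINT count = vdev->GetVideoDecoderProfileCount();
        for (UINT i = 0; i < count; ++i)
        {
            GUID profile;
            if (FAILED(vdev->GetVideoDecoderProfile(i, &profile)))
                continue;
            if (profile == D3D11_DECODER_PROFILE_HEVC_VLD_MAIN)
                std::printf("Driver exposes HEVC Main (8-bit) decoding\n");
            else if (profile == D3D11_DECODER_PROFILE_HEVC_VLD_MAIN10)
                std::printf("Driver exposes HEVC Main10 (10-bit) decoding\n");
        }
        vdev->Release();
    }
    ctx->Release();
    dev->Release();
    return 0;
}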
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players
25th February 2016, 02:13 | #36471
Registered User
Join Date: Dec 2011
Posts: 1,812
It's a bit more complex (or confusing):
AMD's 3xx series also consists of rebranded GCN GPUs (even GCN 1.0). Afaik the GCN 1.2 Tonga GPUs (R9 380 and R9 380X) have a hybrid 8-bit HEVC decoder. Only the very recent Fiji cards (Fury X, Fury, Fury Nano) and the Carrizo SoC have a fixed-function 8-bit HEVC decoder (but no 10-bit capability either). All other GCN GPUs don't have HEVC decoding capabilities.
25th February 2016, 09:29 | #36478
Registered User
Join Date: Jul 2014
Posts: 942
Quote:
UHD Blu-rays encoded with Dolby Vision (all Sony titles, Warner as well I think; Fox is the only one skipping both Dolby Vision and immersive audio on their initial releases, which is why I'm boycotting them) provide a 10-bit HDR10 layer for compatibility with non-Dolby Vision displays, and an optional Dolby Vision layer which adds 12-bit depth using a 1080p enhancement layer. Dolby Vision is part of the UHD Blu-ray specs, it's just an optional layer. Only HDR10 is mandatory.

Dolby (who created PQ gamma) states that 12 bits is the minimal bit depth to resolve 4K/UHD in PQ gamma/WCG/HDR without any banding. With HDR10, there is still banding in the picture, especially in the low end. Of course Dolby could be saying this to justify Dolby Vision, which is why I said "according to Dolby".

Bottom line: no meaningful content will be available in 8 bits. UHDTV requires 10-bit Rec.709 minimum, and UHD Blu-ray requires 10 bits as well, WCG or not, and supports 12 bits with Dolby Vision. Although the new JVCs don't support Dolby Vision, they offer a 12-bit path from the input to the panels, hopefully in preparation for Dolby Vision support.

The main value of Dolby Vision at this stage is that unlike HDR10, it's possible to calibrate a display accurately with Dolby Vision. Unfortunately it requires hardware support in the source and the display, and a paid license. Of course someone with an 8-bit, Rec.709 display doesn't care about any of this, but in that case upscaling Blu-ray is probably a better idea anyway.

Probably time to get back on topic, I'm sure madshi has better things to do than reading all this in his thread...
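To put rough numbers behind the banding argument: the PQ curve (SMPTE ST 2084) maps code values to absolute luminance very non-linearly, so near black the jump between adjacent 10-bit codes is noticeably larger than between 12-bit codes. A back-of-the-envelope sketch using the published ST 2084 constants follows (full-range code values assumed for simplicity; real video uses narrow range, which makes the 10-bit case slightly worse).

Code:
// Back-of-the-envelope: luminance step between adjacent PQ code values at
// 10 bit vs 12 bit, near 1 cd/m2 (dark content, where banding shows first).
// Uses the published SMPTE ST 2084 constants; full-range quantization assumed.
#include <cmath>
#include <cstdio>

// ST 2084 EOTF: normalized signal E' (0..1) -> absolute luminance in cd/m2.
static double pq_eotf(double e)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    const double p  = std::pow(e, 1.0 / m2);
    const double n  = std::fmax(p - c1, 0.0);
    return 10000.0 * std::pow(n / (c2 - c3 * p), 1.0 / m1);
}

int main()
{
    const int depths[] = { 10, 12 };
    for (int bits : depths)
    {
        const int max_code = (1 << bits) - 1;
        for (int code = 0; code < max_code; ++code)
        {
            const double lum = pq_eotf(static_cast<double>(code) / max_code);
            if (lum >= 1.0) // first code value at or above 1 cd/m2
            {
                const double next = pq_eotf(static_cast<double>(code + 1) / max_code);
                std::printf("%d-bit: step of ~%.4f cd/m2 around 1 cd/m2\n",
                            bits, next - lum);
                break;
            }
        }
    }
    return 0;
}

Four extra codes per 10-bit step is the whole point of the 12-bit argument; whether the remaining 10-bit steps are visible as banding is the part that is open to debate.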
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K
25th February 2016, 09:33 | #36479
Registered User
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Video specs are only half the story in determining whether you can decode in real time or not. The other half is the bitrate. Specs alone don't determine the bitrate; the compression also plays a role. I can screen capture uncompressed 1080p 24fps 8-bit RGB and it will have a higher bitrate than 4K Blu-ray releases. You also have to consider the software decoder implementation and how efficient it is at decoding AVC vs HEVC. There are too many variables.
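The uncompressed-capture comparison is easy to sanity-check with simple arithmetic. A quick sketch follows; the ~100 Mbit/s figure is the commonly quoted ceiling for UHD Blu-ray video streams and is used here only as a rough reference, not something this code measures.

Code:
// Rough arithmetic only: raw 1080p24 8-bit RGB vs the commonly quoted
// ~100 Mbit/s ceiling for UHD Blu-ray video streams (approximate figure).
#include <cstdio>

int main()
{
    const double pixels_per_frame = 1920.0 * 1080.0;
    const double fps = 24.0;
    const double bits_per_pixel = 3 * 8;   // 8-bit RGB, no chroma subsampling
    const double raw_mbps = pixels_per_frame * fps * bits_per_pixel / 1e6;

    std::printf("Uncompressed 1080p24 8-bit RGB: ~%.0f Mbit/s\n", raw_mbps);   // ~1194
    std::printf("UHD Blu-ray video ceiling (approx.): ~100 Mbit/s\n");
    return 0;
}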
My 4.2GHz 5820K can decode the 4K 10-bit 60fps HEVC clip from Samsung that was posted a few pages back with no problems. It hovers at around 50% usage. Most UHD Blu-rays will have a similar bitrate to each other, and the standard has a maximum, so if you can decode that, then you're good to go for Blu-ray. But that doesn't mean you can decode anything: a high-bitrate file encoded at the fastest setting would most likely encourage your CPU to commit suicide.

For UHD, you don't need GPU decoding if you have a fairly modern CPU with a decent overclock. But it would still help, since it's more efficient at the job and doesn't turn your room into an oven and cause fans to ramp up.