25th February 2016, 02:21  #36481
baii

But I don't see 8-bit HEVC being used much, if at all.
So the word, again, is wait.

Sent from my 306SH

25th February 2016, 02:37  #36482
aufkrawall

Agreed, HEVC 8-bit decoding capability seems quite useless to me as well.

25th February 2016, 08:07  #36483
Uoppi

Could someone clarify what "crop black bars" is supposed to do? Should it zoom/scale the bars away or keep them visible (provided the player is not set to zoom and madVR is set to lose no image content)?

25th February 2016, 08:15  #36484
huhn

It's there to remove hard-coded black bars from videos.
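
In principle, detecting hard-coded bars means scanning rows inward from the top and bottom edges of the frame and treating rows that stay near black as bars. A minimal sketch in Python, assuming an 8-bit limited-range luma plane in a NumPy array (this is not madVR's actual implementation, which isn't public):

Code:
import numpy as np

def detect_black_bars(luma, threshold=20.0):
    """Return (top, bottom) counts of rows that look like hard-coded black bars.

    luma: 2D array of 8-bit luma samples (limited range, so black sits near 16).
    threshold: mean-luma cutoff below which a row counts as "black".
    """
    row_means = luma.mean(axis=1)
    half = luma.shape[0] // 2
    top = 0
    while top < half and row_means[top] < threshold:
        top += 1
    bottom = 0
    while bottom < half and row_means[-1 - bottom] < threshold:
        bottom += 1
    return top, bottom

# Example: a 1080-line frame holding a 2.40:1 picture (140-line bars top and bottom)
frame = np.full((1080, 1920), 120, dtype=np.uint8)  # flat mid-gray "picture"
frame[:140] = 16   # top bar at limited-range black
frame[-140:] = 16  # bottom bar
print(detect_black_bars(frame))  # -> (140, 140)

A real implementation also has to tolerate noise, logos and fade-outs, and round the crop to even positions for subsampled chroma.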

25th February 2016, 08:16  #36485
Sunset1982

Quote:
Could someone clarify what "crop black bars" is supposed to do? Should it zoom/scale the bars away or keep them visible (provided the player is not set to zoom and madVR is set to lose no image content)?
And does it provide a performance boost with files that have black bars, when they are cropped by madVR?
__________________
Intel i5 6600, 16 GB DDR4, AMD Vega RX56 8 GB, Windows 10 x64, Kodi DS Player 17.6, MadVR (x64), LAV Filters (x64), XySubfilter .746 (x64)
LG 4K OLED (65C8D), Denon X-4200 AVR, Dali Zensor 5.1 Set

25th February 2016, 08:19  #36486
huhn

Depends.

But in most cases you have to scale the image more, which could be slower - e.g. on a 1080p display, once a 1920x1080 frame is cropped to 1920x800 and the result is zoomed to fill the screen, those 800 lines have to be upscaled instead of the frame being shown 1:1.

And removing black bars without resizing the image is unreliable, so you can't really take advantage of the performance boost.

25th February 2016, 09:29  #36487
Manni

Quote:
Originally Posted by Warner306
No, I'm pretty sure all of the sources are 10-bit. I have seen the official UHD Blu-ray specs. 12-bit sources are listed as a (possible) future upgrade but are not part of the current standard.
Sorry, you are pretty wrong.

UHD Blu-rays encoded with Dolby Vision (all Sony titles, and I think Warner as well; Fox is the only one skipping both Dolby Vision and immersive audio on their initial releases, which is why I'm boycotting them) provide a 10-bit HDR10 layer for compatibility with non-Dolby-Vision displays, plus an optional Dolby Vision layer that adds 12-bit depth using a 1080p layer. Dolby Vision is part of the UHD Blu-ray specs; it's just an optional layer. Only HDR10 is mandatory.

Dolby (who created PQ gamma) states that 12 bits is the minimum bit depth needed to resolve 4K/UHD in PQ gamma/WCG/HDR without any banding. With HDR10, there is still banding in the picture, especially in the low end.
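
To put rough numbers on that banding claim, here is a quick sketch using the published ST 2084 EOTF; the constants are straight from the spec, while the full-range code values and the 0.1 cd/m2 probe point are just illustrative choices:

Code:
def pq_eotf(code):
    """SMPTE ST 2084 EOTF: normalized PQ code value [0..1] -> luminance in cd/m2."""
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 107 / 128, 2413 / 128, 2392 / 128
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def step_near(bits, target_nits):
    """Relative luminance jump (%) between adjacent code values near target_nits."""
    n = 2 ** bits - 1
    k = min(range(1, n), key=lambda i: abs(pq_eotf(i / n) - target_nits))
    return 100 * (pq_eotf((k + 1) / n) / pq_eotf(k / n) - 1)

for bits in (8, 10, 12):
    print(f"{bits}-bit: ~{step_near(bits, 0.1):.1f}% step between codes near 0.1 cd/m2")

Each extra bit roughly halves the step between adjacent codes, so at 10 bits the dark-region steps are about four times larger than at 12 bits - which is exactly where the low-end banding shows up.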

Of course Dolby could be saying this to justify Dolby Vision, which is why I said "according to Dolby".

Bottom line: no meaningful content will be available in 8 bits. UHDTV requires 10-bit Rec. 709 minimum, UHD Blu-ray requires 10 bits as well, WCG or not, and supports 12 bits with Dolby Vision.

Although the new JVCs don't support Dolby Vision, they offer a 12-bit path from the input to the panels, hopefully in preparation for Dolby Vision support. The main value of Dolby Vision at this stage is that unlike HDR10, it's possible to calibrate a display accurately with Dolby Vision. Unfortunately it requires hardware support in the source and the display, and a paid license.

Of course someone with an 8-bit, Rec. 709 display doesn't care about any of this, but in that case upscaling Blu-ray is probably a better idea anyway.

Probably time to get back on topic; I'm sure Madshi has better things to do than read all this in his thread...
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000

25th February 2016, 09:33  #36488
Arm3nian

Video specs are only half the story in determining whether you can decode in real time. The other half is the bitrate. Specs alone do not determine the bitrate; the compression settings also play a role. I can screen-capture uncompressed 1080p 24 fps 8-bit RGB and it will have a higher bitrate than 4K Blu-ray releases. But you also have to consider the software decoder implementation: how efficient is it at decoding AVC vs. HEVC? There are too many variables.
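
The screen-capture comparison is easy to sanity-check with back-of-the-envelope arithmetic; the ~100 Mbit/s video ceiling for UHD Blu-ray is my recollection of the spec, so treat it as approximate:

Code:
# Uncompressed 1080p 24 fps 8-bit RGB vs. a UHD Blu-ray class bitrate
w, h, fps = 1920, 1080, 24
bits_per_pixel = 3 * 8                            # 8 bits each for R, G, B
raw = w * h * bits_per_pixel * fps                # bits per second
print(f"raw capture: {raw / 1e6:.0f} Mbit/s")     # ~1194 Mbit/s
print(f"vs. ~100 Mbit/s UHD BD video: {raw / 100e6:.0f}x higher")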

My 4.2 GHz 5820K can decode the 4K 10-bit 60 fps HEVC clip from Samsung that was posted a few pages back with no problems. It hovers at around 50% usage.

Most UHD Blu-rays will have bitrates similar to each other, and the standard has a maximum, so if you can decode that, you're good to go for Blu-ray. But that doesn't mean you can decode anything: a high-bitrate file encoded at the fastest setting would most likely encourage your CPU to commit suicide.

For UHD, you don't need GPU decoding if you have a fairly modern CPU with a decent overclock. But it would still help, since the GPU is more efficient at it, doesn't turn your room into an oven, and doesn't cause fans to ramp up.

25th February 2016, 09:40  #36489
Arm3nian

Quote:
Originally Posted by Manni
Probably time to get back on topic; I'm sure Madshi has better things to do than read all this in his thread...
Why is this even being discussed here? Let's go flood the LAV thread instead. Onward, brothers!

25th February 2016, 09:41  #36490
nevcairiel

Quote:
Originally Posted by Arm3nian
Why is this even being discussed here? Let's go flood the LAV thread instead. Onward, brothers!
You know the forum allows you to create your own threads for entirely unrelated discussions.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

25th February 2016, 10:18  #36491
Warner306

Quote:
Originally Posted by Manni
UHD Blu-rays encoded with Dolby Vision [...] provide a 10-bit HDR10 layer for compatibility with non-Dolby-Vision displays, plus an optional Dolby Vision layer that adds 12-bit depth using a 1080p layer. [...]
I see. Wouldn't this just be metadata layered on top of the 10-bit source? Or is the entire pipeline 12 bits?

At any rate, is it possible to decode Dolby Vision with madVR, or is this proprietary to select TV manufacturers?

25th February 2016, 12:18  #36492
David

What do I need to watch HDR with madVR if my TV is 10-bit and WCG?

- The latest LAV nightly build
- Setting madVR to e.g. fullscreen + D3D11 + 10 bit, and (?) DCI-P3 if the set is already calibrated
- Maximizing the TV's luminance and contrast settings (?)
- Setting the TV to WCG
- Adjusting madVR's nits

With these settings, I think madVR will send 10-bit DCI-P3 (Rec. 2020) with the dynamic range compressed to the luminance setting. Am I correct?

What about the PQ EOTF (ST 2084)? What does madVR do with it?
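
Roughly speaking, a conversion like that decodes PQ to linear light, compresses highlights above the display's peak, and re-encodes for an SDR gamma. A toy sketch of that shape (not madVR's actual algorithm; the Reinhard-style roll-off and the 300-nit peak are arbitrary illustrative choices):

Code:
def pq_eotf(code):
    """SMPTE ST 2084 EOTF: normalized PQ code value [0..1] -> luminance in cd/m2."""
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 107 / 128, 2413 / 128, 2392 / 128
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def toy_hdr_to_sdr(pq_code, display_nits=300.0):
    """PQ-coded HDR sample -> gamma 2.2 SDR sample for a display_nits-peak screen."""
    y = pq_eotf(pq_code)                    # linear luminance in cd/m2
    y = y / (1 + y / display_nits)          # Reinhard-style highlight roll-off
    return (y / display_nits) ** (1 / 2.2)  # normalize to peak, gamma-encode

# A pixel mastered at ~1000 cd/m2 sits near PQ code value 0.75 (full range):
print(toy_hdr_to_sdr(0.75))  # -> ~0.89: bright, but not clipped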

25th February 2016, 12:23  #36493
nijiko

I haven't watched FHD films in a long time.
Is there some problem with the Intel video card driver for Win10 at the moment (HD 4600 and the newest 4380 driver)?
There is heavy tearing with MPC-HC svn 101 + LAV 67.134 + madVR 90.12 on Win10 10586.104.

Addendum:
With the same settings on Win 8.1 there are no problems.
And on Win10, if I use an NV card, there are no problems either.

25th February 2016, 12:40  #36494
starlight2

Hi guys, I want to buy and use a GTX 970 only for madVR, but my processor is an Intel i3 3110M (dual core, 2.4 GHz). Does madVR use only the GPU? Is this possible, or is it CPU-limited?

25th February 2016, 12:52  #36495
ryrynz

Quote:
Originally Posted by starlight2
Hi guys, I want to buy and use a GTX 970 only for madVR, but my processor is an Intel i3 3110M (dual core, 2.4 GHz). Does madVR use only the GPU? Is this possible, or is it CPU-limited?
Yup. You're very limited using your onboard GPU.

25th February 2016, 13:05  #36496
starlight2

So the CPU is used anyway? I'm currently using a GT 610 graphics card, only at 1080p, with bilinear 100 + AR + SuperRes, and the CPU is only 10% used.

25th February 2016, 13:16  #36497
Sunset1982

Quote:
Hi guys, I want to buy and use a GTX 970 only for madVR
I would wait till summer, when the next GPU generation is released... it will be more power efficient and more powerful, with support for HDMI 2.0a, HDR and HEVC 10-bit decoding...
__________________
Intel i5 6600, 16 GB DDR4, AMD Vega RX56 8 GB, Windows 10 x64, Kodi DS Player 17.6, MadVR (x64), LAV Filters (x64), XySubfilter .746 (x64)
LG 4K OLED (65C8D), Denon X-4200 AVR, Dali Zensor 5.1 Set

25th February 2016, 13:19  #36498
Uoppi

Quote:
Originally Posted by huhn
Depends.

But in most cases you have to scale the image more, which could be slower.

And removing black bars without resizing the image is unreliable, so you can't really take advantage of the performance boost.
I'm seeing an insignificant performance boost, so I've decided to just disable "crop black bars".

Half the time I don't even understand why madVR is scaling, so I guess I'm better off not using that feature, LOL. For example, because of the scaling, some videos even end up playing under a different/unexpected madVR profile (i.e. not the profile that would be used if "crop black bars" were disabled).

25th February 2016, 13:21  #36499
Manni

Quote:
Originally Posted by Warner306
I see. Wouldn't this just be metadata layered on top of the 10-bit source? Or is the entire pipeline 12 bits?

At any rate, is it possible to decode Dolby Vision with madVR, or is this proprietary to select TV manufacturers?
No, as I said, it's not just metadata: Dolby Vision achieves 12 bits on top of the mandatory UHD HDR10 layer with a 1080p enhancement layer, which is used to recreate the original 12-bit UHD bit depth.
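
Purely conceptually, bit-depth scalability amounts to adding a low-bit-depth residual back onto a shifted base layer, as below; Dolby's actual reconstruction is proprietary and far more involved (the enhancement layer is itself compressed 1080p video, not a raw residual):

Code:
import numpy as np

def reconstruct_12bit(base_10bit, residual_2bit):
    """Combine a 10-bit base layer with a 2-bit residual into 12-bit samples."""
    return (base_10bit.astype(np.uint16) << 2) | residual_2bit.astype(np.uint16)

base = np.array([512, 1023], dtype=np.uint16)  # 10-bit HDR10 base samples
resid = np.array([3, 1], dtype=np.uint16)      # 2 extra bits of precision
print(reconstruct_12bit(base, resid))          # -> [2051 4093]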

Dolby Vision is only supported if both the source and the display have the necessary hardware and license. This is what makes third-party HDR calibration possible: the characteristics of every licensed display feature in a Dolby-maintained database, shared with licensed calibration software companies. With HDR10, each manufacturer does what they want - and seems unwilling to share what they do exactly - as there is no standard defined for consumer playback (although the ITU might publish something in the next few months).

As madVR isn't tied to any display (it does what a display manufacturer does, but without a fixed target) and doesn't support HDR passthrough yet, I don't think it's possible to license it for its current HDR-to-SDR conversion feature.

When/if madVR handles HDR metadata passthrough (which Madshi has indicated will not happen until there is HDR support at the GPU/driver/API level, so not until this summer with Arctic Islands at best), it might be possible to do something if the display supports Dolby Vision. But honestly, it looks like a lot of work for minimal benefit for most people.

I'll leave it to Madshi to answer this question in more detail, but it might be difficult for him to do so at this stage, as there is little technical information publicly available on Dolby Vision if you're not a licensee.
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000

25th February 2016, 14:46  #36500
starlight2

Quote:
Originally Posted by Sunset1982
I would wait till summer, when the next GPU generation is released... it will be more power efficient and more powerful, with support for HDMI 2.0a, HDR and HEVC 10-bit decoding...
Thank you, but I want to buy and use one now :P
Guys, help me out: what percentage of CPU usage do you see when working with madVR?