Old 7th December 2015, 11:02   #34501  |  Link
RainyDog
Registered User
 
Join Date: May 2009
Posts: 168
Quote:
Originally Posted by foozoor View Post
Image enhancement is not enough.

The point is to do "downsampling" without a custom resolution or AviSynth.
I am sure madshi has already tried to implement this, and I wanted to hear what he thinks about it.
You wouldn't be downsampling in the way you do for gaming, though.

With gaming, it involves rendering the image and assets at a higher native resolution and then downsampling to your display's resolution.

With 1080p video on a 1080p display, upscaling/line-doubling to a higher resolution and then downscaling back to 1080p is not a good idea at all. You'd just be introducing scaling artifacts in both the upscale and the downscale for no benefit.
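To illustrate why the round trip is lossy, here's a toy NumPy sketch (generic linear resampling, nothing to do with madVR's actual scalers):

Code:
import numpy as np

rng = np.random.default_rng(0)
signal = rng.random(1080)                 # stand-in for one 1080p scanline

x = np.arange(1080)
x_up = np.linspace(0, 1079, 2160)         # "line-double" to 2160 samples
upscaled = np.interp(x_up, x, signal)     # linear upscale
roundtrip = np.interp(x, x_up, upscaled)  # linear downscale back to 1080

print("max round-trip error:", np.abs(roundtrip - signal).max())
# The error is non-zero: the double resample has already altered the data.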
Old 7th December 2015, 11:09   #34502  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 711
Has anyone tried to use madVR in 10-bit FSE mode to display HDR content on an HDR-compatible display?
Is there any chance it would work if the PC was set to RGB 4:4:4 and PC levels?
Or is there no way to get this to work until the GPU, drivers and software handle it properly?
As madVR is currently the only way to display 10-bit content, I guess that's the only candidate on a PC.
I'd like to try demo content such as the Exodus and Life of Pi trailers that are available (HEVC, 10-bit, HDR, UHD).
MPC-BE and madVR seem to struggle a bit more than PowerDVD 15 with HEVC content on my GPU, but as PDV15 doesn't support 10-bit, that rules it out.
Thanks if anyone has tried this or has any ideas/suggestions.
I'd like to avoid having to buy a Sony 4K server just to try this out.
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000
Old 7th December 2015, 11:15   #34503  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,759
You would need to tell the display that you are sending it HDR content - and that is assuming the display can even accept HDR in RGB (in contrast to the YCbCr it would otherwise get from an STB). No software that I know of can tell a display this at this point.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 7th December 2015, 11:16   #34504  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,729
There is currently no HDR support, and 10 bit doesn't change this.

The problem is the different gamma curve: the TV doesn't know about it, nor can madVR correct it and then send it to the TV.
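For anyone unfamiliar with what the different curve means: HDR10 uses the SMPTE ST 2084 (PQ) curve, which maps codes to absolute luminance up to 10,000 nits, instead of a normal power gamma. A rough sketch with the published PQ constants (illustrative only, not madVR code):

Code:
# SMPTE ST 2084 (PQ) EOTF next to a plain 2.2 power gamma. A display left in
# its normal gamma mode will misinterpret PQ-encoded signals badly.
m1, m2 = 2610/16384, 2523/4096*128
c1, c2, c3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_eotf(e):                       # e: encoded value 0..1 -> nits (0..10000)
    p = max(e, 0.0) ** (1/m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3*p)) ** (1/m1)

def gamma_eotf(e, peak=100.0):        # simple 2.2 power curve, SDR-ish peak
    return peak * e ** 2.2

for e in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"code {e:.2f}: PQ -> {pq_eotf(e):8.2f} nits, "
          f"gamma 2.2 -> {gamma_eotf(e):7.2f} nits")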
Old 7th December 2015, 12:21   #34505  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 711
Quote:
Originally Posted by nevcairiel View Post
You would need to tell the display that you are sending it HDR content - and that is assuming the display can even accept HDR in RGB (in contrast to the YCbCr it would otherwise get from an STB). No software that I know of can tell a display this at this point.
Quote:
Originally Posted by huhn View Post
There is currently no HDR support, and 10 bit doesn't change this.

The problem is the different gamma curve: the TV doesn't know about it, nor can madVR correct it and then send it to the TV.
Thanks both, but there are workarounds for this as long as the content is sent properly.

First, you can switch the display manually to HDR mode even if the source doesn't support HDMI 2.0a.

This is what Sony and JVC did at IFA/CEDIA when demoing HDR content using the Sony FMP-X10 as a source, which doesn't support HDMI 2.0a (unlike their HDR displays). They only had to switch the display to HDR mode so that the correct PQ gamma curve would be applied and the levels would be interpreted properly, along with the HDR metadata. The new Sony 520/665ES even does this on its slow HDMI 1.4-bandwidth chipset, so a proper HDMI 2.0 chipset isn't even necessary; you just need the HDMI 2.0a profile to be implemented on the display.

You can also use the HD Fury Integral to make a non-HDR-compatible source send the HDR flag so that the display switches to HDR mode. This is done either via the GUI on a PC or via an Android (and soon iOS) app. They even offer an option to use CEC to hijack the SP/LP keys on a remote and use them to switch HDR on and off. They have lots of really funky options to inject HDR metadata into the chain. They even plan to add HDR support to non-HDR-compatible displays, but that's not implemented yet. Highly recommended if you want to play with this (and get rid of HDCP 2.2 as well, to display HDCP 2.2-protected sources like the Roku 4, the NVidia Shield, the Amazon Fire TV or an upcoming UHD Blu-ray player on non-HDCP 2.2 displays/AVRs).
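For reference, what the Integral (or a real HDMI 2.0a source) sends is a small static-metadata packet (the CTA-861.3 DRM InfoFrame, carrying SMPTE ST 2086 mastering data plus MaxCLL/MaxFALL). A rough sketch of the kind of fields involved; the names and example values below are illustrative, not a real HDMI driver API:

Code:
# Rough sketch of the static HDR metadata a CTA-861.3 DRM InfoFrame carries
# (SMPTE ST 2086 mastering display data plus MaxCLL/MaxFALL). Field names
# are illustrative; this is not an actual driver interface.
from dataclasses import dataclass

@dataclass
class HDRStaticMetadata:
    eotf: int                      # 0 = SDR gamma, 2 = SMPTE ST 2084 (PQ), 3 = HLG
    primaries: tuple               # (R, G, B) chromaticity coordinates x, y
    white_point: tuple             # white point x, y
    max_mastering_nits: int        # peak of the mastering display
    min_mastering_nits: float      # black level of the mastering display
    max_cll: int                   # brightest pixel in the content
    max_fall: int                  # brightest average frame in the content

# Typical values for a 1000-nit HDR10 grade on a BT.2020 mastering display:
example = HDRStaticMetadata(
    eotf=2,
    primaries=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),
    white_point=(0.3127, 0.3290),
    max_mastering_nits=1000,
    min_mastering_nits=0.005,
    max_cll=1000,
    max_fall=400,
)
print(example)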

Anyway, this is why I'm focusing on the PC being able to send the 10-bit HEVC HDR content untouched.

Is this possible with madVR in 10-bit FSE and the current versions of LAV/MPC-BE, and if so, in which mode would you suggest setting the drivers/madVR?

My guess would be RGB 4:4:4 10-bit for the driver, to avoid a conversion to YCbCr before the content even leaves the PC, and then having both the driver and madVR set to PC levels (0-255) and LAV set to untouched, to get a chance to display the correct levels. The display would be set to RGB Enhanced (PC levels) as well, to allow a direct input.
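Since every link in the chain has to agree on levels, here is the generic level math (nothing madVR-specific) showing what a full-to-limited range conversion does to 8-bit values, and why a mismatch lifts or crushes blacks:

Code:
# Converting 8-bit full range (0-255) to limited/video range (16-235) and back.
# If one link in the chain applies this and another doesn't, blacks get lifted
# or crushed.
def full_to_limited(v):
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    return round((v - 16) * 255 / 219)

for v in (0, 16, 128, 235, 255):
    lim = full_to_limited(v)
    print(f"full {v:3d} -> limited {lim:3d} -> back to full {limited_to_full(lim):3d}")

# Mismatch example: a limited-range signal displayed as if it were full range
# shows black at code 16 instead of 0 (washed-out blacks).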

All this should be doable at 23/24p even with a 2.0 level B HDMI output, although my HD7870 switches to 8-bit RGB 4:4:4 in UHD even at 23/24p, so I'm planning to get the upcoming Club3D active DisplayPort 1.2 to HDMI 2.0 adapter to get a full 18Gb/s HDMI port (with HDCP 2.2, not that I will need it) and more flexibility with output modes at various refresh rates.

What do you think?
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 7th December 2015 at 12:30.
Old 7th December 2015, 12:37   #34506  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,729
There is no single correct PQ gamma curve; it can be different for each file. You can choose a different max luma value to use the bits more effectively for what the source contains.

And the next problem is the different types of HDR. Usually a display only supports one type; I've never heard of a display supporting all, or even two, types of HDR yet.
Old 7th December 2015, 12:37   #34507  |  Link
chros
Registered User
 
Join Date: Mar 2002
Posts: 1,428
Quote:
Originally Posted by nevcairiel View Post
That's likely 25 Hz interlaced then. At least that's what it is on most other TVs.
Hm, my 5-year-old TV reports 23p, 24p, 25p, 29p, 59p, 60p and a couple of other interlaced resolutions via at least one HDMI port, and madVR can correctly switch between them.
I don't watch a lot of interlaced content, just occasionally some UK captures that report the interlaced flag, but I disabled that functionality in madVR since I don't see any difference (the interlaced flag is probably just set during broadcast).
Quote:
Originally Posted by Manni View Post
I'd like to try demo content such as the Exodus and Life of Pi trailers that are available (HEVC, 10-bit, HDR, UHD).
Interesting, can you provide links for them?
Thanks
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)
Old 7th December 2015, 12:41   #34508  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 711
Quote:
Originally Posted by huhn View Post
There is no single correct PQ gamma curve; it can be different for each file. You can choose a different max luma value to use the bits more effectively for what the source contains.

And the next problem is the different types of HDR. Usually a display only supports one type; I've never heard of a display supporting all, or even two, types of HDR yet.
I'm talking about HDR10, as none of these files uses Dolby Vision, Technicolor or Philips HDR, and that's what the JVCs support.

Please, can you stick to answering the question? Thanks.

I'm asking whether there is a way to get a file encoded in HDR10 (HEVC, 10-bit, UHD) out to a compatible display that is able to show that content properly, provided it's not destroyed by the player/driver/renderer/decoder. Let me deal with the calibration side of it.

By the way, the new JVCs let you specify the correct value for peak white in their settings, so that they can translate content mastered at 1000 nits to whatever you can get on their projectors (probably closer to 100-150 nits at best). It's a shame HDR at home is mastered for panels and not for projectors. I'd have preferred content mastered at 100 nits peak white, as for HDR cinema, but that wouldn't help much with panels used in a living room with ambient light, which is clearly the main target.
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 7th December 2015 at 12:50.
Old 7th December 2015, 12:43   #34509  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 711
Quote:
Originally Posted by chros View Post
Interesting, can you provide links for them?
Thanks
Sure, http://demo-uhd3d.com/ in the UHD HDR section.
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000
Old 7th December 2015, 12:59   #34510  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,759
Quote:
Originally Posted by Manni View Post
you can switch the display manually to HDR mode even if the source doesn't support HDMI 2.0a.
HEVC HDR encodes include extra metadata that is required to properly interpret the image; without it, it's not possible to show the image properly.
The display needs to get this metadata from the player somehow, otherwise all bets are off. If you want to do that manually through external hardware, go nuts, but that's hardly relevant for madVR at this point.

Also, that still leaves the problem that I seriously doubt HDR is meant to be transferred over RGB, as there is a distinctly different conversion from YUV/YCbCr to RGB for HDR content, and any RGB converted without this would already be mangled and damaged.

A PC cannot output content "untouched", since PCs cannot output YCbCr as-is through consumer hardware/software.
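To illustrate the point about the conversion being different: the YCbCr to RGB matrix depends on the luma coefficients of the colour space, and BT.2020 (used for UHD/HDR) differs from BT.709. A small sketch with the standard coefficients (illustrative only):

Code:
# The YCbCr -> RGB conversion depends on the standard's luma coefficients.
# BT.709 (HD) and BT.2020 (UHD) differ, so RGB converted with the wrong
# matrix is already damaged before the display sees it.
def ycbcr_to_rgb(y, cb, cr, kr, kb):
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

BT709  = (0.2126, 0.0722)
BT2020 = (0.2627, 0.0593)

# The same (normalised) YCbCr triplet decoded with both matrices:
sample = (0.5, 0.1, -0.2)
print("decoded as BT.709 :", ycbcr_to_rgb(*sample, *BT709))
print("decoded as BT.2020:", ycbcr_to_rgb(*sample, *BT2020))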
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 7th December 2015 at 13:02.
Old 7th December 2015, 13:29   #34511  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
FWIW, I'm working on HDR playback as we speak. However, madVR will convert the content to use a normal gamma transfer function. I expect final quality to be better this way than sending the HDR content untouched to the display and letting the display do all the processing.
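Very roughly, the kind of conversion this involves; a bare-bones sketch with a plain clip standing in for real tone mapping, not madVR's actual algorithm:

Code:
# Converting a PQ-encoded HDR value to a normal gamma 2.2 signal for a
# ~120-nit display. The hard part (tone mapping highlights above the
# display's peak) is reduced here to a plain clip, so this is illustrative only.
m1, m2 = 2610/16384, 2523/4096*128
c1, c2, c3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_to_nits(e):
    p = e ** (1/m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3*p)) ** (1/m1)

def hdr_to_sdr_code(e, display_peak_nits=120.0):
    nits = min(pq_to_nits(e), display_peak_nits)    # naive clip, not a real roll-off
    return (nits / display_peak_nits) ** (1/2.2)    # re-encode with gamma 2.2

for e in (0.2, 0.4, 0.5, 0.6, 0.8):
    print(f"PQ code {e:.1f} ({pq_to_nits(e):7.1f} nits) -> "
          f"gamma 2.2 code {hdr_to_sdr_code(e):.3f}")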
Old 7th December 2015, 13:37   #34512  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 711
Quote:
Originally Posted by madshi View Post
FWIW, I'm working on HDR playback as we speak. However, madVR will convert the content to use a normal gamma transfer function. I expect final quality to be better this way than sending the HDR content untouched to the display and letting the display do all the processing.
Thanks for this bit of hope!

Will we get the option to use PQ gamma, or will this be a fixed setting in MadVR?

Any idea when we might be able to test this?

Which GPU/driver/madVR settings will you recommend regarding pixel format/levels?

Let me know if you'd like me to beta test, I should have an X7000 in the next few weeks.

Does this mean you're going to implement HDR support in MadTPG for calibration as well? That would be great!
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 7th December 2015 at 13:51.
Old 7th December 2015, 15:04   #34513  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,759
Quote:
Originally Posted by madshi View Post
FWIW, I'm working on HDR playback as we speak. However, madVR will convert the content to use a normal gamma transfer function. I expect final quality to be better this way than sending the HDR content untouched to the display and letting the display do all the processing.
Couldn't high-end displays do backlight trickery to achieve a much higher contrast/dynamic range when they have all the information?
I'm just guessing here, but it sounds like something that might happen.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 7th December 2015, 16:40   #34514  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Madshi, are you using the HDR clips Manni provided as a reference?
So the HDR metadata is inside the MP4 file, and this is what we might expect in the future: 10-bit, 4:2:0, Rec. 2020, 3840x2160, HEVC?

EDIT:
Playing a little with the HDR demos of Exodus and Pi, I can get a decent image by setting the black level to -30 or so.
It's probably too early to say, but if I had to guess, I think these particular demos are just expanded versions of the original and not mastered scene by scene.
The blacks of the content seem to be somewhere in the middle of the range, but what about the lower range?

How the HEVC HDR encoding utilizes the 10 bits to distribute the shades, and how the TV knows how to decode that and engage certain backlight zones at certain brightness levels, is still a mystery...
What is the difference between the two versions of the same video on that site? One is called "Dynamic Mode"?
Hmmm...

One thing is for certain: all the shadow detail is preserved, because it sits in the middle of the range, unlike on a Blu-ray where it occupies the lowest few steps and looks blocky.
Together with the high bitrate, this creates a very beautiful picture that is many times smoother than a standard Blu-ray.
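As a rough check of where shadow luminances land in the code range (standard PQ constants, full-range codes for simplicity, not tied to these particular demo files):

Code:
# Where given luminance levels land in a 10-bit PQ encode versus an 8-bit
# gamma 2.4 encode (100-nit SDR), to show how much code room PQ gives shadows.
m1, m2 = 2610/16384, 2523/4096*128
c1, c2, c3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_code_10bit(nits):
    y = (nits / 10000.0) ** m1
    return round(1023 * ((c1 + c2*y) / (1 + c3*y)) ** m2)

def gamma_code_8bit(nits, peak=100.0):
    return round(255 * min(nits / peak, 1.0) ** (1/2.4))

for nits in (0.01, 0.1, 1, 10, 100):
    print(f"{nits:6.2f} nits: 10-bit PQ code {pq_code_10bit(nits):4d} / 1023, "
          f"8-bit gamma 2.4 code {gamma_code_8bit(nits):3d} / 255")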

Once madshi does his magic on the new format, even people with standard 1080p Rec. 709 monitors will see a clearly better picture.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 7th December 2015 at 18:08.
Old 7th December 2015, 18:10   #34515  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Quote:
Originally Posted by chros View Post
Hm, my 5-year-old TV reports 23p, 24p, 25p, 29p, 59p, 60p and a couple of other interlaced resolutions via at least one HDMI port, and madVR can correctly switch between them.
Same with both my Samsungs (and 30p support is listed and tested too).

So how would I go about checking whether the 25 Hz is true/native 25 Hz? This is the first time I've heard that 25 Hz might actually be 50 Hz. I don't understand why the TV would report a 25 Hz refresh rate if it's actually in 50 Hz mode.

EDIT: From the TV's manual (under "Supported Video Resolutions"):
1920 x 1080 - Display Format: 25Hz, Horizontal Frequency (KHz): 28.125, Vertical Frequency (Hz): 25.000, Clock Frequency (MHz): 74.250
I don't know whether the above means "true" 25 Hz or not.

Last edited by Uoppi; 7th December 2015 at 18:20.
Old 7th December 2015, 18:39   #34516  |  Link
Thunderbolt8
Registered User
 
Join Date: Sep 2006
Posts: 2,171
Whether 50 or 25 Hz is reported can vary depending on how interlacing is detected and whether deinterlacing is activated automatically. Better to check this manually; AFAIK it cannot be entirely perfect in all cases.
__________________
Laptop Acer Aspire V3-772g: i7-4202MQ, 8GB Ram, NVIDIA GTX 760M (+ Intel HD 4600), Windows 8.1 x64, madVR (x64), MPC-HC (x64), LAV Filter (x64), XySubfilter (x64)
Old 7th December 2015, 18:41   #34517  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by nevcairiel View Post
Couldn't high-end displays do backlight trickery to achieve a much higher contrast/dynamic range when they have all the information?
I'm just guessing here, but it sounds like something that might happen.
Philips tried that with the 9703 range a few years back. The problem is that it causes flashes of color in the picture when there is motion. Those who know the rainbow effect that some people could see on plasmas will know what I'm talking about... It's just like that, only stronger.

edit....

http://www.appliancedesign.com/artic...rightness-9-19

See the above URL for a little info on the spec.

Last edited by Razoola; 7th December 2015 at 18:45.
Old 7th December 2015, 18:48   #34518  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 930
Quote:
Originally Posted by Uoppi View Post
Oh, I thought almost all current TVs do?

Windows reports both my Samsung TVs from 2008 and 2014 as supporting 25 Hz, but it's not necessarily "true" 25 Hz then? The TVs' info OSD also says 25 Hz and the manual lists 25 Hz as a supported refresh rate.
I wasn't talking about what the TV supports or will accept; I'm talking about what the TV will display. As nevcairiel says, "25 Hz" input is likely for interlaced content, but even if you could send your TV 25p content, it'll display it at at least 50 Hz. Likely more, especially if it's a plasma (mine does 100 Hz for 25/50p and 96 Hz for 24p).
__________________
HTPC Hardware: Intel Celeron G530; nVidia GT 430
HTPC Software: Windows 7; MediaPortal 1.19.0; Kodi DSPlayer 17.6; LAV Filters (DXVA2); MadVR
TV Setup: LG OLED55B7V; Onkyo TX-NR515; Minix U9-H
Old 7th December 2015, 19:48   #34519  |  Link
chros
Registered User
 
Join Date: Mar 2002
Posts: 1,428
Quote:
Originally Posted by Uoppi View Post
Same with both my Samsungs (and 30p support is listed and tested too).
So how would I go about checking whether the 25 Hz is true/native 25 Hz?
I don't know whether the above means "true" 25 Hz or not.
There was a long debate about this on AVSForum, and the result was truly confusing.
We drive our TVs from a PC, so the TV acts like a monitor if you set it up that way, which means turning off every image-processing feature that could modify frames. With my LG that means I have to set the scan mode, turn off TruMotion and turn on (!) Real Cinema (it's for 24p content, don't ask why they distinguish it).
Quote:
Originally Posted by DragonQ View Post
I wasn't talking about what the TV supports or will accept; I'm talking about what the TV will display. As nevcairiel says, "25 Hz" input is likely for interlaced content, but even if you could send your TV 25p content, it'll display it at at least 50 Hz.
And they discussed what could happen when you turn TruMotion on and how it would affect frames.
I have one problem with this version of the story: my LG is the EU version, so it's officially 100 Hz (that probably refers to 50 fps content with TruMotion used on it).
But what would happen with 23.976 fps content? You'll never get 50 Hz, nor 100. So will it use 48??? (They state that it'll use 120 Hz with the US version.)
Either way, I think it doesn't matter whether the frames are exactly doubled or tripled; we will get continuous movement without stuttering, and no deinterlacing will happen on our TV.
This is how I understand it.
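A toy sketch of the frame-repetition arithmetic (not how any particular TV is implemented):

Code:
# If the panel rate is an integer multiple of the content rate, every frame is
# shown the same number of times (smooth); otherwise the repeats alternate
# (judder) or the set has to do something else.
from fractions import Fraction

content_rates = {"23.976p": Fraction(24000, 1001), "24p": Fraction(24),
                 "25p": Fraction(25), "50p": Fraction(50)}
panel_rates   = [Fraction(48), Fraction(50), Fraction(100),
                 Fraction(120000, 1001), Fraction(120)]

for name, fps in content_rates.items():
    for hz in panel_rates:
        ratio = hz / fps
        note = (f"each frame shown {ratio}x" if ratio.denominator == 1
                else "non-integer -> judder/repeat pattern")
        print(f"{name:8s} on {float(hz):7.3f} Hz panel: {note}")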
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)

Last edited by chros; 7th December 2015 at 19:51.
Old 7th December 2015, 20:39   #34520  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Quote:
Originally Posted by DragonQ View Post
I wasn't talking about what the TV supports or will accept; I'm talking about what the TV will display.
OK. So if displays only rarely output true 25 Hz, I could just as well omit 25p from madVR's supported refresh rates list, right (because it will be 50 Hz anyway)?

Just wondering: if the TV specs say "Display Format: 25Hz ... Vertical Frequency (Hz): 25.000", shouldn't it be "50.000" instead?

Anyway, I just realized 29 Hz and 59 Hz are not listed for my TV after all, so I'll need to delete them from madVR's refresh rate list.

Last edited by Uoppi; 7th December 2015 at 20:43.