Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
15th February 2019, 13:54 | #54741 | Link | |
Registered User
Join Date: Apr 2018
Location: Paris, France
Posts: 92
|
Quote:
Best example is motion interpolation. Most self-proclaimed purists are very happy to watch 24p movies with judder because "hey, that's what a 24p movie is supposed to look like, it's part of movie language". Except the movie was primarily supposed to be watched in an actual movie theater, where I've never seen any motion judder, probably because the projection tech is different. So, TV or projector, I always activate a tad of motion interpolation (3/10 de-judder on my LG, Motionflow Low on my Sony VP) to achieve what I consider to be the "actual movie theater" look. I'm sure 90% of people here would consider this a blasphemy. Bottom line: random articles on the Internet saying you should turn this setting on or off on your TV or projector are not always right. Just give it a try and see 1) if you see a difference and 2) if you like it. Last edited by Charky; 15th February 2019 at 13:58. |
|
15th February 2019, 13:59 | #54742 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
Look up the term judder.
Both TVs and cinema projectors are able to show the same judder-free 24 Hz motion. The term judder comes from displaying 24 frames at 60 Hz, aka 3:2 judder (there are other types of judder too), which creates a shaking movement and looks abysmal in camera pan shots. |
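The 3:2 cadence behind this kind of judder is easy to show numerically. This is my own illustration, not from the thread: it computes how many panel refreshes each source frame occupies when the refresh rate is not a multiple of the frame rate.

```python
# Why 24 fps at 60 Hz judders: each frame should stay on screen for 1/24 s,
# but a 60 Hz panel can only hold frames for whole multiples of 1/60 s, so
# frames alternate between 2 and 3 refreshes (33.3 ms vs 50 ms) instead of
# a uniform 41.7 ms. At 120 Hz the cadence is uniform, hence no judder.

def pulldown_pattern(source_fps: int, refresh_hz: int, frames: int) -> list:
    """Return how many refreshes each of the first `frames` source frames
    occupies when shown on a display running at `refresh_hz`."""
    counts = []
    shown = 0  # total refreshes emitted so far
    for i in range(1, frames + 1):
        # refreshes that should have elapsed by the end of source frame i
        target = (i * refresh_hz) // source_fps
        counts.append(target - shown)
        shown = target
    return counts

print(pulldown_pattern(24, 60, 8))   # -> [2, 3, 2, 3, 2, 3, 2, 3]
print(pulldown_pattern(24, 120, 4))  # -> [5, 5, 5, 5]
```

The uneven 2/3 alternation is the 3:2 judder being described (real hardware may start the cadence on the 3); the second call shows why 120 Hz displays can present 24p judder-free.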
15th February 2019, 14:03 | #54743 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Watching 24p movies with judder is terrible, and I don't think any video purist would suggest that. You do not need motion interpolation to avoid judder.
I also tested all the processing options of my TV and turned them off because they damage the image, most obviously by converting to 4:2:2 to do the processing. I agree that one should test for themselves, but the processing on many TVs does do a lot of damage to the output from madVR.
__________________
madVR options explained |
15th February 2019, 14:20 | #54744 | Link | ||
Registered User
Join Date: May 2013
Posts: 712
|
Quote:
Yup. I used to be bothered by judder, until I realized camera pan shots look crappy no matter what you do, because it's 24fps.. Where's that damn 48fps Hobbit.. Quote:
madVR is a brute-force sort of processor. Bundled TV processors, well, they have to make a buck. If we add up the cost of a strictly-madVR build: ~$200 GPU + $300 CPU + $150 motherboard + $50 cooler + $50 power supply + $150 RAM + $50 case. That's $950 already, just for the image processor. The whole damn TV is ~$500-2000; how could they possibly match the ability of a $950 image processor, and madshi's immense TALENT?
__________________
Ghetto | 2500k 5Ghz Last edited by tp4tissue; 15th February 2019 at 14:27. |
||
15th February 2019, 16:29 | #54748 | Link | |
Registered User
Join Date: Oct 2016
Posts: 896
|
Quote:
For my part, I am able to read books or even work all day on a computer display from 75 cm very comfortably (as long as its backlight doesn't flicker). Of course I don't do that with my TV, but that's because of living space considerations, not eye comfort considerations.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 |
|
15th February 2019, 17:57 | #54749 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
If they'd just release those 10,000-nit displays they're holding back from us, we wouldn't need any of madVR's magical tone mapping! (Come on, that was hilarious! )
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
15th February 2019, 19:16 | #54750 | Link | |
Registered User
Join Date: May 2013
Posts: 712
|
Quote:
So... we'd need 20,000 nits with the Panasonic light-modulation layer + LCD layer. Then throw in blur-reduction blinking: something like 30-40,000 cd/m²?
__________________
Ghetto | 2500k 5Ghz |
|
15th February 2019, 20:01 | #54752 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
I take it back... we WILL need madVR tone mapping after all. Even at 1000-2000 nits, some of these displays are blinding! My 700 or so nits on my OLED is fine, but I've seen some of these crazy high-nit LCDs, and I would think some people would want to tone map down so as not to need sunglasses!
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
15th February 2019, 23:37 | #54753 | Link | |
Registered User
Join Date: Oct 2018
Posts: 324
|
Quote:
So the reasoning used to determine whether or not a display does proper HDR is based on the range it has to fit the highlights, which for HDR10 is 100-1000 nits and for Dolby Vision 100-4000, and that's very subjective. I've seen comments like the ones you mention, but also reviews on Rtings giving an 8 HDR score to displays with only 450 nits peak. The only serious paper I've seen is one from the EBU about requirements for video monitors in TV production, which sets the lower limit at 600 nits. And the problem with having a simple calibration setup is that not all content is graded the same way. That's why they're using madVR measurements of different averages to dynamically set the most appropriate exposure. |
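The measurement-based idea mentioned above can be sketched roughly. This is a hedged illustration of the general technique, not madVR's actual algorithm, and the function names are mine: measure each frame's real highlight level and only compress when the frame actually exceeds the display's range.

```python
# Sketch of measurement-driven dynamic tone mapping (my own illustration):
# instead of trusting one static calibration for all content, derive the
# tone-mapping source peak from what each frame actually contains.

def measured_peak(frame_nits, percentile=99.5):
    """Approximate a frame's real highlight level from its pixel luminances,
    using a high percentile so a few stray pixels don't dominate."""
    ordered = sorted(frame_nits)
    idx = min(len(ordered) - 1, int(len(ordered) * percentile / 100))
    return ordered[idx]

def mapping_source_peak(frame_nits, display_peak):
    """Luminance level that tone mapping must compress down to the display
    peak. A frame that never exceeds the panel needs no compression."""
    return max(measured_peak(frame_nits), display_peak)

dim_frame = [float(n) for n in range(1, 101)]    # peaks around 100 nits
bright_frame = dim_frame + [4000.0] * 10         # plus specular highlights

print(mapping_source_peak(dim_frame, 600))     # -> 600 (identity mapping)
print(mapping_source_peak(bright_frame, 600))  # -> 4000.0 (compress into 600)
```

The point matches the post: a dim scene on a 600-nit panel needs no tone mapping at all, while a 4000-nit highlight scene needs real compression, so no single static exposure setting can serve both.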
|
15th February 2019, 23:46 | #54754 | Link | |
Registered User
Join Date: Dec 2018
Posts: 207
|
Quote:
|
|
16th February 2019, 00:28 | #54756 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
I understand all HDR movies are not the same, but unless you start off with a base setting, I don't understand how you can set up madVR. The principle with SDR is to set whites and blacks, but with HDR it's not really talked about, so when we talk about the tone mapping of TVs, where is the starting point? Surely we need one? Maybe I am missing something here, as 99% of people will not be using madVR, so how do the rest of the users set up an HDR TV?
__________________
Windows 10-1909 | i5-3570k | GTX 1070 Windforce OC Rev2 8GB : 430.64 | Pioneer VSX-534 | Philips 65PUS6703 - 65" |
|
16th February 2019, 01:36 | #54757 | Link | |
Registered User
Join Date: May 2013
Posts: 712
|
Quote:
The non-madVR plebs just use passthrough or equivalent, letting the TV do the tone mapping. The TV uses the same style of tone mapping as madVR's to get the HDR Blu-ray content to FIT in the range of the TV set. madVR is just more pliable, and NGU (holy aura, angels sing).
__________________
Ghetto | 2500k 5Ghz |
|
16th February 2019, 01:44 | #54758 | Link |
Registered User
Join Date: Oct 2018
Posts: 324
|
Sorry, I misread your post; I thought you were referring to a setup for HDR -> SDR. HDR is an absolute format, so you don't have to do anything; the HDR mode of your display takes care of everything, and it should look the same as on any other HDR display up to the peak limit. You only have to make sure to choose the normal HDR mode, and maybe check that contrast and backlight are at 100% and dynamic contrast is disabled.
Anyway I don't use HDR passthrough and my HDR TV is from 2016 so I'm not the best to explain this. |
16th February 2019, 01:59 | #54759 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
No, this is not even close to reality.
HDR displays do whatever they want, and there is no clear spec on how to fit the higher-range image into the lower range of the TV. Just clipping is not an option: clipping 4000 nits to 600 will not result in anything useful. Heck, what about sub-100-nit projectors? Just clip at 100 nits? |
16th February 2019, 03:14 | #54760 | Link |
Registered User
Join Date: Oct 2018
Posts: 324
|
They don't do whatever they want; they all have to decode the content with the same PQ EOTF up to some point, so the image up to that point is exactly the same in terms of luminance, with the only difference being each display's limitations regarding black point. Above that point each display has its own tone mapping, but this doesn't affect the overall luminance very much.
HDR for 100-nit projectors? They can sell it, but it's just a contradiction. So yes, clipping at 100 nits should be the only reasonable option. |
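For reference, the shared decode step this exchange is arguing about, the SMPTE ST 2084 (PQ) EOTF used by HDR10, is small enough to write out. The EOTF constants below are from the ST 2084 spec; the naive clipping helper is my own illustration of the "just clip at the panel's peak" option being debated.

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """Map a normalized PQ code value e in [0, 1] to absolute luminance
    in cd/m2 (nits). Every HDR10 display starts from this same decode."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

def clip_to_peak(nits: float, peak: float) -> float:
    """'Just clipping': everything above the panel's peak is simply lost."""
    return min(nits, peak)

print(pq_eotf(1.0))                      # -> 10000.0 (format maximum)
print(clip_to_peak(pq_eotf(1.0), 600))   # -> 600, entire highlight range gone
```

Because PQ is an absolute format, the code value alone determines the intended luminance; the disagreement above is only about what a display should do with decoded values beyond its own peak, where clipping discards every gradation above that peak.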