xx2000xx - 7th February 2018, 06:28
Quote: Originally Posted by Asmodian
I strongly recommend against using ALL the trade quality for performance options! Use the default ones (everything from the top down to "scale chroma separately, if it saves performance"), but the others can cause issues.
Also, the list is supposed to be ordered from the biggest performance hit at the top to the smallest at the bottom.

Sorry for the rambling earlier. I was thinking of getting into calibration, but looking at the people with well over $1k worth of equipment (which I take it you have too, to make a proper 3DLUT), I don't think it's worth it just to make SDR look better. If CalMAN 6 goes on sale, or if I can snag it second-hand on eBay, I'd consider it, but I'd probably need more than that.

My Ideal-Lume went in the trash after a ten-year run, so I talked to the guys at biaslighting.com, the only other true 6500K non-DIY bias lighting I know of, and it has made a massive difference over the crappy $10 Asus one that's popular on Amazon, and even over the Lume. Ideal-Lume: http://www.cinemaquestinc.com/ideal_lume.htm - they have an LED version now too. I can't recommend a bias light enough, for the eye strain alone. I know the Sony calibrator from the YouTube vids, the Japanese guy who might be the best in the world, says it's a must for those sets. On my LG it gives the picture more pop and makes it more engulfing, even though it's not actually improving the black level or contrast; at this point it's probably more psychological for me.

The rule of thumb was always 10% of your TV's maximum brightness, but a few weeks ago that got changed to a flat 5 nits, and with HDR even lower. Their light has 11 brightness notches, and their YouTube vid: https://www.youtube.com/watch?v=OAwrN6xiqJg - which is about as cheesy as it gets, is a decent reference for setting it, but I should see if I can get some good test slides.
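Just to put numbers on the two guidelines (the 300-nit peak below is a made-up example, not a measurement):

Code:
# Old 10% rule of thumb vs. the newer flat 5-nit guideline.
# peak_nits is a hypothetical example value; plug in your own TV's peak.
peak_nits = 300.0              # e.g. a typical SDR LCD
old_rule = 0.10 * peak_nits    # old rule: 10% of peak -> 30 nits here
new_rule = 5.0                 # new guidance: flat 5 nits (lower still for HDR)
print(f"10% rule: {old_rule:.0f} nits, new guidance: {new_rule:.0f} nits")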

Let me rattle off a few questions concerning your LG/Nvidia/MadVR setup.

I take it you use adaptive V-sync, as madshi recommended a while back?

Do you use the edge enhancement/noise reduction options in Nvidia's image settings at the bottom of the control panel? I know a popular guide says to use them, but I'm not really sure about anything he posts beyond some decent info on what affects rendering and by how much.

In Manage 3D Settings I'm not sure which options madVR disregards (I'd suspect most, if not all), but besides changing the power management mode to optimal power, do you change anything else?

The two I'm curious about, because you actually turned the default CPU settings down a few notches (which I found odd), are maximum pre-rendered frames and the brand-new option added a few patches ago; both limit the number of frames, from 1-4, that the CPU can prepare before the GPU kicks in. With your amazing GPU, are you setting that low to give all the attention to the video card? I was under the assumption that the two work totally independently.
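The way I picture the pre-rendered frames setting is just a bounded queue sitting between the CPU and the GPU. Here's a toy Python sketch of that mental model; the queue size standing in for the driver setting is my own illustration, not how Nvidia actually implements it:

Code:
import queue
import threading
import time

# Toy model: "max pre-rendered frames" as a bounded queue between the CPU
# (producer) and the GPU (consumer). Purely illustrative.
MAX_PRERENDERED = 2            # hypothetical driver setting (1-4)
frames = queue.Queue(maxsize=MAX_PRERENDERED)

def cpu_thread():
    for n in range(10):
        # put() blocks once the queue is full, so the CPU can never get
        # more than MAX_PRERENDERED frames ahead of the GPU.
        frames.put(n)
        print(f"CPU queued frame {n}")

def gpu_thread():
    for _ in range(10):
        n = frames.get()
        time.sleep(0.01)       # pretend render time
        print(f"GPU rendered frame {n}")

t_cpu = threading.Thread(target=cpu_thread)
t_gpu = threading.Thread(target=gpu_thread)
t_cpu.start(); t_gpu.start()
t_cpu.join(); t_gpu.join()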

The frames-in-advance settings, along with the GPU/CPU queue settings, have always perplexed me, so I'm going to spend a few hours on them over the next few days, and if the results differ much I'll make a big-ass spreadsheet breaking down all the numbers. I've now flashed the BIOS on my MSI GTX 970 for about 20% more headroom (easy stuff), though it's still hobbled by the 3.5 GB memory issue, paired with an i7 Skylake at roughly 4.8 GHz, or higher after I delid it tomorrow.

Lastly, when you calibrated the LG, did you find any of its features useful besides TruMotion, which I assume you use? I take it its sharpness and other processing features would totally conflict with madVR's settings.

I'm 100% on a quest now to dump 2.2 gamma for BT.1886 in SDR, somehow, in a way that works well with madVR but is also set-and-forget for general use at 4:4:4 (see the sketch below for what BT.1886 actually does). Speaking of which, what's up with PC mode? Renaming the input to PC I thought was fine per RTINGS, but if you change the icon to PC as well, everything gets out of whack. For example, 90% of the Technicolor preset's settings are greyed out, so I suspect there's more to it, because I don't remember that on a prior firmware.
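For anyone else chasing this: BT.1886 isn't a flat power curve like 2.2; it's a 2.4 power function with a black lift computed from your display's measured white and black luminance (Lw/Lb). A minimal sketch of the EOTF from ITU-R BT.1886, where the 120 and 0.05 nit values are just example measurements, not mine:

Code:
# BT.1886 EOTF vs. a pure 2.2 power curve (per ITU-R BT.1886, Annex 1).
# lw/lb are hypothetical example measurements; use your own display's values.
GAMMA = 2.4

def bt1886(v, lw=120.0, lb=0.05):
    """Map a normalized signal v (0-1) to luminance in cd/m^2."""
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a * max(v + b, 0.0) ** GAMMA

def power22(v, lw=120.0):
    return lw * v ** 2.2

for v in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"v={v:.2f}  BT.1886: {bt1886(v):7.2f} nits  2.2: {power22(v):7.2f} nits")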
