Go Back   Doom9's Forum > Hardware & Software > Software players

Old 29th January 2018, 06:32   #461  |  Link
cr0n=0sTT88
Registered User
 
cr0n=0sTT88's Avatar
 
Join Date: Aug 2013
Posts: 61
Quote:
Originally Posted by mytbyte View Post
This forum is a tutorial... You can try one thing: for HDR, disable "preserve hue", maybe also "compress highlights"... but I think your GPU is generally the bottleneck; it doesn't have its own memory, it shares (probably DDR3) memory with the rest of the system... your D3D usage is maxing out... how is playback when you disable madVR? Never mind if the picture is washed out.
I've selected Bilinear and DXVA for chroma upscaling, image downscaling, and image upscaling, and despite the fact that the speed improves I have the same problem: the image moves like a robot.



But disabling "preserve hue" works very well!!! Thank you, master!!



thank you!!!!



PS: I'm testing it on a NEC 2K monitor. I haven't tested it on the JVC 4K projector. I don't know if this is important.

Last edited by cr0n=0sTT88; 29th January 2018 at 06:37.
cr0n=0sTT88 is offline   Reply With Quote
Old 29th January 2018, 07:03   #462  |  Link
cr0n=0sTT88
Registered User
 
cr0n=0sTT88's Avatar
 
Join Date: Aug 2013
Posts: 61
Quote:
Originally Posted by Asmodian View Post
Right now Nvidia is better; NGU runs unusually slowly on Polaris (AMD's 480/580). Also, the 480 and 580 are good for mining, so they are often hard to find right now. The 1050 Ti is a great card for madVR, but there are probably a few options you can change for better performance with madVR. HDR -> SDR in high quality is relatively difficult, so you need to change to a low-quality mode with something like an HD4600:
it works! thank you!!

Is a 1050 enough? Is a 1060 better for the scaling algorithms? Does 3GB vs. 6GB of memory make a difference in the results?
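To illustrate why HDR -> SDR is heavy and why quality modes differ: at its core it is a tone-mapping step. A minimal sketch of the idea, using a simple Reinhard-style curve rather than madVR's actual algorithm (the 1000-nit peak and 100-nit SDR white are assumed values):

```python
# Sketch of HDR -> SDR tone mapping with a simple Reinhard-style curve.
# madVR's real algorithm is more sophisticated (highlight compression,
# hue preservation, etc.); the peak/white levels here are assumptions.

def tone_map(nits: float, peak_nits: float = 1000.0,
             sdr_white: float = 100.0) -> float:
    """Map an HDR luminance in nits to relative SDR luminance (0..1)."""
    x = nits / sdr_white                 # scale so SDR white -> 1.0
    x_peak = peak_nits / sdr_white
    # Reinhard curve, normalised so peak_nits maps exactly to 1.0
    return (x / (1.0 + x)) * (1.0 + x_peak) / x_peak

for nits in (10, 100, 500, 1000):
    print(f"{nits:5d} nits -> {tone_map(nits):.3f}")
```

Every pixel goes through a curve like this (plus gamut mapping), which is why doing it per-pixel in high quality costs real GPU time.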
cr0n=0sTT88 is offline   Reply With Quote
Old 29th January 2018, 08:08   #463  |  Link
cr0n=0sTT88
Registered User
 
cr0n=0sTT88's Avatar
 
Join Date: Aug 2013
Posts: 61
One more thing... these movies are 24fps and I'm playing them at 60fps. I think that could overload the GPU.
Is it possible to play at 24fps and use better filters?
cr0n=0sTT88 is offline   Reply With Quote
Old 29th January 2018, 08:41   #464  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
Well, you should switch the GPU output mode to 2160/23p, if it's available for the HD4600, for smooth pans... if you succeed, you can test it yourself, but I don't think 60 Hz output affects performance unless you have enabled smooth motion (and you have not, as far as I can see)...

And you really must test it on a 4K display/projector; it IS very important!
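The smooth-pans point can be made concrete with a little cadence arithmetic (illustrative Python, not anything madVR does internally):

```python
# Why 24 fps content at 60 Hz pans unevenly: each film frame occupies
# either 2 or 3 refreshes (the 3:2 cadence), while at 24 Hz every frame
# occupies exactly one refresh.

def cadence(fps: int, hz: int, n: int) -> list[int]:
    """Refreshes occupied by each of the first n source frames."""
    return [(i + 1) * hz // fps - i * hz // fps for i in range(n)]

print(cadence(24, 60, 8))  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven, judders
print(cadence(24, 24, 8))  # [1, 1, 1, 1, 1, 1, 1, 1] -> smooth pans
```

The uneven 2/3 pattern is what you see as judder in slow camera pans; matching the output refresh to the content rate removes it entirely.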

Last edited by mytbyte; 29th January 2018 at 08:54.
mytbyte is offline   Reply With Quote
Old 29th January 2018, 18:14   #465  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,871
Quote:
Originally Posted by cr0n=0sTT88 View Post
it works! thank you!!

Is a 1050 enough? Is a 1060 better for the scaling algorithms? Does 3GB vs. 6GB of memory make a difference in the results?
A 1080 Ti is better than a 1080 for scaling algorithms.

There are still no GPUs that can completely max out madVR, though the 1080 Ti is pretty close. A 1060 6GB is actually faster than the 3GB (it has more cores), in addition to having more memory, both of which are helpful for madVR. 3GB is less than you want for watching 4K content.

A 4GB 1050 Ti is as low as I can recommend, the higher the better.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 29th January 2018, 18:35   #466  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by Asmodian View Post
though the 1080 Ti is pretty close.
For now... There's always a fancy new algo around the corner, isn't there? (But no, nothing very soon.)
madshi is offline   Reply With Quote
Old 29th January 2018, 18:47   #467  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,871
Your Remove Compression Artifacts algorithm is already a new one that can really put pressure on GPUs. High quality deblocking of 4K content brings my Titan X (Pascal) to its knees.

It is always good to keep software ahead of the hardware.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 29th January 2018, 18:53   #468  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Haha! Can the Titan X do RCA "High" quality for 4Kp60 content?
madshi is offline   Reply With Quote
Old 29th January 2018, 23:17   #469  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,871
Not even if I skip processing the chroma channels.

I get ~18ms for the RCA medium step on only the luma of 4K video (25ms with the chroma). ~59 ms for RCA high on only the luma (89ms with the chroma).

Edit: in fact I cannot run RCA on 4Kp60 at all.
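To put those timings in context: each frame must render within one frame interval, so even luma-only RCA high (~59 ms) blows the ~41.7 ms budget of 24p, and nothing fits in 60p's ~16.7 ms. A quick sketch using the numbers above (illustrative Python, not madVR code):

```python
# Comparing the RCA render times quoted above against the per-frame
# budget: a frame must finish rendering within one frame interval or
# the renderer drops/delays frames.

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at the given frame rate."""
    return 1000.0 / fps

rca_ms = {
    "medium, luma only": 18,
    "medium, luma + chroma": 25,
    "high, luma only": 59,
    "high, luma + chroma": 89,
}

for name, ms in rca_ms.items():
    fits_24 = ms < frame_budget_ms(24)   # ~41.7 ms budget
    fits_60 = ms < frame_budget_ms(60)   # ~16.7 ms budget
    print(f"RCA {name}: {ms} ms -> "
          f"24p {'fits' if fits_24 else 'too slow'}, "
          f"60p {'fits' if fits_60 else 'too slow'}")
```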
__________________
madVR options explained

Last edited by Asmodian; 30th January 2018 at 01:19.
Asmodian is offline   Reply With Quote
Old 29th January 2018, 23:27   #470  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Ouch. Well, obviously the Titan X is not a very fast card...
madshi is offline   Reply With Quote
Old 30th January 2018, 07:52   #471  |  Link
cr0n=0sTT88
Registered User
 
cr0n=0sTT88's Avatar
 
Join Date: Aug 2013
Posts: 61
Quote:
Originally Posted by mytbyte View Post
Well, you should switch the GPU output mode to 2160/23p, if it's available for the HD4600, for smooth pans... if you succeed, you can test it yourself, but I don't think 60 Hz output affects performance unless you have enabled smooth motion (and you have not, as far as I can see)...

And you really must test it on a 4K display/projector; it IS very important!

I'm using JRiver, and for my NEC PA272W monitor over DisplayPort I only have these options available:



Why is the 30 Hz option not available? I don't understand.

I will test it later with my 4K projector.

Quote:
Originally Posted by Asmodian View Post
A 1080 Ti is better than a 1080 for scaling algorithms.

There are still no GPUs that can completely max out madVR, though the 1080 Ti is pretty close. A 1060 6GB is actually faster than the 3GB (it has more cores), in addition to having more memory, both of which are helpful for madVR. 3GB is less than you want for watching 4K content.

A 4GB 1050 Ti is as low as I can recommend, the higher the better.
Thank you, sir!

I will buy a second-hand Nvidia 1060 6GB; new cards are very expensive now (at least in Spain)! Last year they were 250€ and now 350€!
cr0n=0sTT88 is offline   Reply With Quote
Old 30th January 2018, 09:50   #472  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
I don't have a clue about JRiver, but most likely your monitor doesn't support 23.976, 24, or 30 Hz input, so you can try creating a custom resolution/refresh rate and see if the monitor will sync.


Last edited by mytbyte; 30th January 2018 at 10:45.
mytbyte is offline   Reply With Quote
Old 2nd February 2018, 15:41   #473  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 605
Hello, I'm wondering if that part is still up to date:
Quote:
scale in linear light: [Disabled] [...] The chroma must be upscaled to match the luma before downscaling when using this option because linear light is only possible in RGB.
I have enabled "scale chroma separately, if it saves performance", which means chroma is not upscaled when I play 2160p on my 1080p display, but the OSD still shows 'LL' next to the downscaler used, and also a 'ConvertToLinearLight' step when I use ShowRenderSteps.
Is it a 'dummy step' that doesn't actually use linear light in that case, or can madVR now downscale only the luma in linear light?

Thanks.
__________________
HTPC: Windows 10 1909, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti

Last edited by el Filou; 2nd February 2018 at 15:46.
el Filou is offline   Reply With Quote
Old 2nd February 2018, 16:08   #474  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,367
Linear light scaling clearly works with YCbCr right now.
The OSD doesn't matter for this; the image is clearly different.
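For intuition about what 'LL' buys you, a toy comparison of gamma-space vs linear-light averaging, assuming a pure power gamma of 2.2 (not madVR's exact transfer functions):

```python
# Toy example of gamma-space vs linear-light downscaling, assuming a
# pure power gamma of 2.2 (real transfer functions like BT.1886/sRGB
# differ slightly). Averaging gamma-encoded values darkens the result;
# averaging in linear light preserves perceived brightness.

def to_linear(v: float, gamma: float = 2.2) -> float:
    return v ** gamma

def to_gamma(v: float, gamma: float = 2.2) -> float:
    return v ** (1.0 / gamma)

a, b = 0.9, 0.1  # two neighbouring gamma-encoded pixel values

naive = (a + b) / 2                                    # gamma-space average
linear = to_gamma((to_linear(a) + to_linear(b)) / 2)   # linear-light average

print(f"gamma-space average:  {naive:.3f}")   # 0.500
print(f"linear-light average: {linear:.3f}")  # noticeably brighter
```

The difference on high-contrast edges is why the image looks visibly different with LL on, whatever the OSD labels say.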
huhn is offline   Reply With Quote
Old 7th February 2018, 06:28   #475  |  Link
xx2000xx
Registered User
 
Join Date: Jun 2017
Posts: 4
Quote:
Originally Posted by Asmodian View Post
I strongly recommend against using ALL the trade quality for performance options! Use the default ones (everything from the top down to "scale chroma separately, if it saves performance") but the others can cause issues.
Also, the list is supposed to be ordered from the biggest performance hit at the top to the smallest at the bottom.

Sorry for the rambling earlier, but I was thinking of getting into calibration, and looking at the people with well over $1k worth of equipment (which I take it you have too, to make a proper 3DLUT), I don't think it's worth it just to make SDR better. If CalMAN 6 goes on sale, or if I can snag it second-hand on eBay, then I'd consider it, but I'd probably need more than that.

My Ideal-Lume went in the trash after a 10-year run, so I talked to the guys at biaslighting.com, the only other true 6500K non-DIY bias lighting, and it has made a massive difference over the crappy, popular $10 Asus one on Amazon, and even over the Lume. Ideal-Lume: http://www.cinemaquestinc.com/ideal_lume.htm (they have an LED one now too). I can't recommend a bias light enough, just for the eye strain alone. From the YouTube vids of the Japanese guy who might be the best in the world, I know it's a must for the Sonys. My LG has more pop and is more engulfing, even though it's not going to improve the black level or contrast; it's more psychosomatic for me at this point.

The rule of thumb was always 10% of the maximum brightness of your TV, but it just got changed a few weeks ago to 5 nits, and with HDR even lower. They have 11 notches, and their YouTube vid (https://www.youtube.com/watch?v=OAwrN6xiqJg), which is about as cheesy as it gets, is a decent reference, but I should see if I can get some good slides.

Let me rattle off a few questions concerning your LG/Nvidia/MadVR setup.

I take it you use adaptive vsync as Madshi recommended a while back?

Do you use the edge enhancement/noise reduction in Nvidia's image settings at the bottom of the control panel? I know a popular guide says to use it, but I'm not sure about anything he posts, really, besides some decent info on what affects rendering and by how much.

In the Manage 3D Settings I'm not sure what madVR disregards (I would suspect most if not all of it), but besides changing it to optimal power, do you change anything else?

The two I'm curious about, because you actually turned down the default CPU settings a few notches, which I found odd, are the maximum rendering frames and the brand-new option from a few patches ago; both limit the number of pre-rendered CPU frames, ranging from 1-4, before the GPU kicks in. Because of your amazing GPU, are you setting those low to give all the attention to the video card? I was under the assumption that they work totally independently.

The frames in advance, along with the GPU/CPU settings, have always perplexed me, and I'm going to spend a few hours over the next few days testing; if the results are quite a bit different then I'll make a big-ass spreadsheet breaking down all the numbers, now that I flashed the BIOS on my MSI 970 GTX (easy stuff) for a 20% boost, though still F'd by the 3.5GB, along with an i7 Skylake at ~4.8GHz, or after I delid it tomorrow.

Lastly, when you calibrated the LG, did you find any features useful besides true motion, which I assume you use? I take it their sharpness and other features would totally conflict with madVR's settings.

I'm 100% on a quest now to dump 2.2 gamma for BT.1886 somehow using SDR, something that works well with madVR but is also set-and-forget, while also handling general use at 4:4:4. Speaking of which, what's up with PC mode when it comes to putting the PC label on the input? I thought that was just fine per rtings, but if you change the icon to PC as well then it gets all out of whack; for example, 90% of Technicolor's settings are greyed out, so I suspect there's something more to it, because I don't remember that on a prior firmware.

Last edited by xx2000xx; 7th February 2018 at 07:47.
xx2000xx is offline   Reply With Quote
Old 7th February 2018, 07:42   #476  |  Link
xx2000xx
Registered User
 
Join Date: Jun 2017
Posts: 4
Quote:
Originally Posted by madshi View Post
For now... There's always a fancy new algo around the corner, isn't there? (But no, nothing very soon.)
HTPC would be dead and everybody would be watching videos, after signing their lives over, on their fancy Google player or their shi**y 5-inch "smart" phones if it weren't for you. That, or tens of millions more malware-infected computers downloading codec packs, using VLC or whatever should be extinct.

Glad to hear that you're in contact with MS/Nvidia too; I'd be surprised if Nvidia didn't offer you a job working with the driver team, especially since they're going all in on horrendous idiot boxes that will eventually put an end to PCs as we know them in the next 5-10 years. The writing has been on the wall for years now. Just don't anandtech us and say peace out and get some stuffy office job.

Nvidia merges with MS to exclusively work on the Xbox, AMD folds, game over. Intel knew what was up years ago when they started butchering their CPUs with integrated graphics. Small form factor is here, it's where the money is, and it will only gain steam.

Once they finally got the golden-token market locked up after years and years of struggle (the housewives with the disposable income, who they knew would buy their kids the same products, completely cornering the next generation, which has already started), they moved on to brainwashing the public that you need to upgrade your phone every single year, which makes Intel's tick/tock model look like a joke.

The only hope is the i386/x86 architecture, which should have been dead 10 years ago; something new will rise, but I think MS has a little say in how that will go. Enjoy it while it lasts.
xx2000xx is offline   Reply With Quote
Old 8th February 2018, 20:15   #477  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,871
Quote:
Originally Posted by xx2000xx View Post
I take it you use adaptive vsync as Madshi recommended a while back?
No, I use a simple fixed refresh rate and application controlled v-sync.

Quote:
Originally Posted by xx2000xx View Post
Do you use the edge enhancement/noise reduction in Nvidia's image settings at the bottom of the control panel?
Absolutely not!

Quote:
Originally Posted by xx2000xx View Post
In the Manage 3D Settings I'm not sure what madVR disregards (I would suspect most if not all of it), but besides changing it to optimal power, do you change anything else?
Optimal power is a bad option; use adaptive or prefer maximum performance.

Quote:
Originally Posted by xx2000xx View Post
The two I'm curious about, because you actually turned down the default CPU settings a few notches, which I found odd, are the maximum rendering frames and the brand-new option from a few patches ago; both limit the number of pre-rendered CPU frames, ranging from 1-4, before the GPU kicks in. Because of your amazing GPU, are you setting those low to give all the attention to the video card? I was under the assumption that they work totally independently.
I think you are confused about what those options mean. madVR's CPU buffers are simply buffers in system memory for decoded video frames, not pre-rendered frames. I only turn them down a bit because I use a 32-bit video player and don't need more CPU buffers; memory use can go quite high with 4K video and complex subtitles. Leave the Nvidia options on application controlled; you don't want the drivers limiting what madVR wants to do.
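For a sense of scale, the memory pressure can be sketched with simple arithmetic; the NV12 layout and the buffer counts here are assumptions for illustration, not madVR internals:

```python
# Rough arithmetic for decoded-frame buffer memory, assuming NV12
# (8-bit 4:2:0, 1.5 bytes per pixel); buffer counts are illustrative,
# not madVR's actual defaults.

def buffer_mib(width: int, height: int, n_buffers: int,
               bytes_per_pixel: float = 1.5) -> float:
    """Memory in MiB used by n_buffers decoded frames."""
    return width * height * bytes_per_pixel * n_buffers / (1024 ** 2)

print(f"16 x 4K frames:    {buffer_mib(3840, 2160, 16):.1f} MiB")
print(f"16 x 1080p frames: {buffer_mib(1920, 1080, 16):.1f} MiB")
# In a 32-bit player's limited address space, 4K buffers plus the
# decoder's own working set add up quickly.
```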

Quote:
Originally Posted by xx2000xx View Post
Lastly, when you calibrated the LG, did you find any features useful besides true motion, which I assume you use? I take it their sharpness and other features would totally conflict with madVR's settings.
I do not use true motion; I don't like any of the motion interpolation methods. They also increase input lag, a large negative because I also use my TV for games and desktop use.

Quote:
Originally Posted by xx2000xx View Post
Speaking of which, what's up with PC mode when it comes to putting the PC label on the input? I thought that was just fine per rtings, but if you change the icon to PC as well then it gets all out of whack; for example, 90% of Technicolor's settings are greyed out, so I suspect there's something more to it, because I don't remember that on a prior firmware.
I definitely use PC mode, it turns off (almost) all the processing, lowers input lag, and maintains 4:4:4 sampling. Ideal for madVR as well as games/desktop. I can still fix the white point, which is the only processing I want.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 11th February 2018, 15:07   #478  |  Link
cr0n=0sTT88
Registered User
 
cr0n=0sTT88's Avatar
 
Join Date: Aug 2013
Posts: 61
Hi,

I have a NEC PA272W. It's a monitor with a programmable 3D LUT. I don't know how to program my monitor yet.
Is it possible to optimize madVR to get better quality with this type of monitor?
I will buy a GTX 1060/GTX 1050 Ti next week; right now I'm using the HD4600.

Which madVR options can give better quality with my 3D-LUT monitor?
HDR only?

I have a super monitor and super software and a good card (GTX 1060), but I don't know how to configure it!

I have a friend with an i1Display Pro that I can use. Is this good for getting better results?

Last edited by cr0n=0sTT88; 11th February 2018 at 15:10.
cr0n=0sTT88 is offline   Reply With Quote
Old 11th February 2018, 15:19   #479  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,367
madVR has nothing to do with the 3D LUT in your screen (that's just a way to calibrate the screen), but you can blindly assume that the 3D LUT in madVR is of higher quality. To use either, you will need a colorimeter and quite some knowledge.
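For anyone wondering what a 3D LUT actually does, whether in the monitor or in madVR: it is a 3D table mapping input RGB to corrected RGB, interpolated between grid points. A toy sketch under stated assumptions (identity LUT, 17-point grid; a real LUT is built from colorimeter measurements):

```python
# Conceptual sketch of a 3D LUT: an N x N x N table maps input RGB to
# corrected RGB, with trilinear interpolation between grid points.
# Identity LUT shown; N=17 is just a common grid size.
import numpy as np

N = 17
grid = np.linspace(0.0, 1.0, N)
# lut[r, g, b] -> corrected (R, G, B); identity for this sketch
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_lut(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Trilinear lookup of one RGB triplet (components in 0..1)."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo
    out = np.zeros(3)
    for dr in (0, 1):                     # the 8 surrounding grid points
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out

print(apply_lut(np.array([0.25, 0.5, 0.75]), lut))  # identity -> unchanged
```

madVR does this per pixel on the GPU with a measured (non-identity) table, which is why its LUT can be larger and higher precision than a monitor's internal one.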
huhn is offline   Reply With Quote
Old 11th February 2018, 16:49   #480  |  Link
cr0n=0sTT88
Registered User
 
cr0n=0sTT88's Avatar
 
Join Date: Aug 2013
Posts: 61
Quote:
Originally Posted by huhn View Post
madVR has nothing to do with the 3D LUT in your screen (that's just a way to calibrate the screen), but you can blindly assume that the 3D LUT in madVR is of higher quality. To use either, you will need a colorimeter and quite some knowledge.



Are these options for using the monitor's internal 3D-LUT colour processor with madVR?

If I use these options, won't I get better quality with my NEC PA272W?
cr0n=0sTT88 is offline   Reply With Quote