Old 29th April 2019, 04:58   #56041  |  Link
ttnuagmada
Registered User
 
Join Date: Jan 2013
Posts: 14
Quote:
Originally Posted by tp4tissue View Post
Are you using black bar detection? Try turning it off. Something similar happens to mine on certain videos with black bar detection: madVR will freeze, the PC doesn't hard-lock, but I get a black window, no image.
I have not tried messing with that. I'll definitely try it when I get home. Thanks!
ttnuagmada is offline   Reply With Quote
Old 29th April 2019, 09:13   #56042  |  Link
Charky
Registered User
 
Join Date: Apr 2018
Location: Paris, France
Posts: 58
Quote:
Originally Posted by tp4tissue View Post
No one should watch movies without a 3DLUT.
Well, again, that's just your opinion.

Above a certain price tag, most recent displays have at least one mode that's accurate.

Or should I say "accurate enough", because of course you'll answer "it's not accurate enough for me, I need a dE of 0.1 across the whole color spectrum!".

Again, that's OK, it's just not everybody's need. Some of us are not color maniacs and are perfectly fine with the stock colors of the "Cinema" mode, or whatever the best mode is called on their TVs or projectors.
Charky is offline   Reply With Quote
Old 29th April 2019, 10:07   #56043  |  Link
oldpainlesskodi
Registered User
 
Join Date: Apr 2017
Posts: 178
Figured out the cause of the banding on my end.

A stock install of either the latest GeForce or the modified Quadro driver gives no banding at Full-Full-Full. However, if I use madLevelsTweaker or NV_RGBFullRangeToggle to force PC levels, I get heavy banding with exactly the same settings/chain. Not sure what has changed with the latest driver, as I haven't seen this issue with previous ones.

Odd.

Just sharing in case anyone else comes across the same issue.
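
To illustrate why a redundant levels conversion can cause banding at all (just the general idea, not necessarily what the driver is doing in my chain): if two stages each expand limited range (16-235) to full (0-255), the extremes get clipped and the surviving codes spread further apart. A rough Python sketch of that double-expansion effect:

Code:
import numpy as np

# An 8-bit limited-range grayscale ramp (16 = black, 235 = white)
ramp = np.arange(16, 236, dtype=np.float64)

def expand_limited_to_full(y):
    """Map limited-range (16-235) values to full range (0-255) and requantize to 8 bit."""
    return np.clip(np.round((y - 16.0) * 255.0 / 219.0), 0, 255)

once = expand_limited_to_full(ramp)    # one expansion somewhere in the chain: correct
twice = expand_limited_to_full(once)   # a second, redundant expansion: levels mismatch

print("distinct codes after one expansion :", len(np.unique(once)))   # 220, nothing clipped
print("distinct codes after two expansions:", len(np.unique(twice)))  # noticeably fewer survive
print("ramp values crushed to black       :", int(np.sum(twice == 0)))
print("ramp values blown out to white     :", int(np.sum(twice == 255)))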
__________________
Sapphire RX 5700 XT (19.7.5) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1903), Silverstone LC13B-E, Pioneer SC-LX501 Elite D3, Samsung UE55KS8000, Mission M33i speakers, Kodi Dsplayer 17.6 X64
oldpainlesskodi is offline   Reply With Quote
Old 29th April 2019, 12:51   #56044  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 417
Quote:
Originally Posted by Charky View Post
Well, again, that's just your opinion.

Above a certain price tag, most recent displays have at least one mode that's accurate.

Or should I say "accurate enough", because of course you'll answer "it's not accurate enough for me, I need a dE of 0.1 across the whole color spectrum!".

Again, that's OK, it's just not everybody's need. Some of us are not color maniacs and are perfectly fine with the stock colors of the "Cinema" mode, or whatever the best mode is called on their TVs or projectors.
I think I see it the way you see it.
I totally understand the purist's way of looking at things, and I personally am more on that side of the fence than on the side of "regular" people.

On the other hand, this being the official madVR discussion, I think more care should be taken in providing a more complete answer to people asking which GPU is needed to use madVR. It's one thing to say "if you go with a 2080 Ti you're going to get this and that, which will provide unrivaled quality"; it's a whole different beast to state "you *need* a 2080 Ti to use madVR properly".

Just my two cents.
ashlar42 is offline   Reply With Quote
Old 29th April 2019, 13:50   #56045  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 406
Quote:
Originally Posted by Charky View Post
Well, again, that's just your opinion.

Above a certain price tag, most recent displays have at least one mode that's accurate.

Or should I say "accurate enough", because of course you'll answer "it's not accurate enough for me, I need a dE of 0.1 across the whole color spectrum!".

Again, that's OK, it's just not everybody's need. Some of us are not color maniacs and are perfectly fine with the stock colors of the "Cinema" mode, or whatever the best mode is called on their TVs or projectors.
BEFORE madVR graced us with 3DLUT support, the only way to get calibrated was to buy a $500 LUT box. These doohickeys weren't even that reliable outside of their designated applications and hookups. Add $230 for the colorimeter and we were looking at $730. It costs even more in Europe, which doesn't enjoy 'Murica pricing; there they'd be looking at ~$800-900.

When the madVR color enlightenment happened, the price of accurate color dropped to just the cost of the colorimeter. Game changing.

__
Your argument is the same as saying a blind person doesn't need eyes. A blind person can live well in modernity, but he'd still be much better off with eyes, had we the technology to give them to him.

That's exactly how computers are without a colorimeter: BLIND.

Calibration is NOT all about Delta E. It's about color balance and intent. A critical part of movie-making labor is color grading. This is something you already paid for when you acquired any digital media. The director and the colorist sit in a dim gray room, forgoing the vitamin D critical to their health, to bring us fantastic media.

The color subtleties and the director's intent are thoroughly obliterated without color calibration. This is as true of $300 TVs as of $2000 TVs.

The reason we can't get good out-of-the-box performance is PRECISELY general consumer ignorance. Never having SEEN an image remotely close to reference, a crass consumer can only judge by very basic properties: brighter, more saturated, sharper, and bluer. In catering to this crowd, TV makers have no option but to release terribly inaccurate sets.

From the director's point of view, there is no right or wrong color; he's the producer.

From the consumer's point of view, there IS a right color: the color the producer chose. On the viewer's end, it either is that color or it isn't, and neither you nor your PC will ever know right from wrong without a 3DLUT. To choose uncalibrated is to choose ignorance.

Good taste comes at a cost and has to itself be trained. There is of course freedom in choosing a bluer image, but not before establishing a reference.
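
For the curious, here is roughly what "applying a 3DLUT" means at the pixel level. This is a toy sketch of the general technique (a small 3D grid of corrections sampled with trilinear interpolation), not madVR's actual code:

Code:
import numpy as np

def apply_3dlut(pixels, lut):
    """Sample a 3D LUT with trilinear interpolation.
    pixels: (M, 3) array of RGB values in [0, 1].
    lut: (N, N, N, 3) grid mapping input RGB to corrected RGB."""
    n = lut.shape[0]
    pos = np.clip(pixels, 0.0, 1.0) * (n - 1)   # positions in lattice coordinates
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                                # fractional position inside each cell

    out = np.zeros_like(pos)
    # blend the 8 surrounding lattice points with their trilinear weights
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r = hi[:, 0] if dr else lo[:, 0]
                g = hi[:, 1] if dg else lo[:, 1]
                b = hi[:, 2] if db else lo[:, 2]
                w = ((f[:, 0] if dr else 1 - f[:, 0]) *
                     (f[:, 1] if dg else 1 - f[:, 1]) *
                     (f[:, 2] if db else 1 - f[:, 2]))
                out += w[:, None] * lut[r, g, b]
    return out

# Sanity check with an identity LUT: output should equal input
n = 17
axis = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
print(apply_3dlut(np.array([[0.25, 0.5, 0.75]]), identity))   # ~[[0.25 0.5 0.75]]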
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 29th April 2019 at 14:18.
tp4tissue is offline   Reply With Quote
Old 29th April 2019, 14:55   #56046  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 930
Any ideas yet how the GTX 1650 compares to the GTX 1050 Ti in madVR performance? Seems to be a poor replacement for gaming. Still waiting on that GT 1030 replacement with 3+ GiB of VRAM but hey ho.
__________________
HTPC Hardware: Intel Celeron G530; nVidia GT 430
HTPC Software: Windows 7; MediaPortal 1.19.0; Kodi DSPlayer 17.6; LAV Filters (DXVA2); MadVR
TV Setup: LG OLED55B7V; Onkyo TX-NR515; Minix U9-H
DragonQ is offline   Reply With Quote
Old 29th April 2019, 15:19   #56047  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,818
Quote:
Originally Posted by tp4tissue View Post
Calibration is NOT all about Delta E.
It's also about the white point and the gamma curve.

Quote:
Originally Posted by tp4tissue View Post
From the consumer's point of view, there IS a right color, the color the producer chose
And a good TV is perfectly capable of producing that color without an external 3DLUT. Heck, a good modern TV will have color adjustments built in (up to 20-point corrections, which are plenty), so if you have a colorimeter (which you need to create a 3DLUT anyway), you can already tweak the TV to very good levels without needing any modifications on the playback side. And many premium TVs come calibrated out of the box to Delta E < 3 or so, which is already pretty decent.

The only ignorance here is assuming that you know the only solution to a long-solved problem.
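
For anyone wondering what those Delta E numbers actually measure: it's just the distance between a measured color and its reference in the perceptual L*a*b* space. A quick sketch using the simple CIE76 formula (reviews usually quote the more elaborate dE2000, but the idea is the same), with made-up measurement values:

Code:
import math

def delta_e_cie76(lab_ref, lab_meas):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_meas)))

# Hypothetical reading of a 75% gray patch (numbers invented for illustration)
reference = (75.0, 0.0, 0.0)    # target: neutral gray
measured = (74.2, 1.1, -1.8)    # what a meter might report off the screen
print(round(delta_e_cie76(reference, measured), 2))   # ~2.26, i.e. under the "dE < 3" mark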
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 29th April 2019 at 15:25.
nevcairiel is offline   Reply With Quote
Old 29th April 2019, 15:55   #56048  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 406
Quote:
Originally Posted by nevcairiel View Post

And a good TV is perfectly capable of producing that color without an external 3DLUT. Heck, a good modern TV will have color adjustments built in (up to 20-point corrections, which are plenty), so if you have a colorimeter (which you need to create a 3DLUT anyway), you can already tweak the TV to very good levels without needing any modifications on the playback side. And many premium TVs come calibrated out of the box to Delta E < 3 or so, which is already pretty decent.

The only ignorance here is assuming that you know the only solution to a long-solved problem.
You're talking about an acceptable range. And I concur, there is a range.

But the goal is to narrow it, given the consumer's available resources.

It isn't that TVs can't produce a certain color; they simply don't produce the right colors, by choice of the maker, influenced by the ignorance of the consumer.

What has not been solved is the end-to-end chain, and the only way to solve it is consumer measurement.

For example, just in this thread, we have red-hue/green-hue oddities caused by simple driver differences. What assurance do we have that a Blu-ray player or some other piece of electronics isn't rendering or outputting something equally anomalous? How would the user know without measuring?

That 20-point gamma adjustment, do it by eye? That is the solution?

Yes, we have a better picture today than in years past, but that last stretch towards the viewer has always been the missing component.

Factory calibration is good, but it has drifted after 1,000 hours and is worthless after 5,000 hours. That could be within 1-2 years.

Factory settings are also done with respect to the factory's patch/tone generator, NOT the consumer's output hardware.

For example, Dell released a prosumer wide-gamut monitor in 2009, the U2410. Its factory-calibrated sRGB mode was set to 5800K, and you couldn't adjust this in the regular menu or custom mode, so for a long time people couldn't get accurate sRGB. When calibrated, it would clamp the contrast ratio to 500:1, while the panel itself was capable of ~770:1. It wasn't until much later, when people got access to the service menu, that this could be rectified.

Calibration and RE-calibration are critical.
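
To give an idea of what "measuring" actually involves, here is a toy sketch of estimating a display's effective gamma from a grayscale sweep. The luminance numbers are invented for illustration; a real run would use readings taken off the screen with a meter:

Code:
import numpy as np

# Stimulus levels (fraction of full signal) and the luminance a meter reported for each,
# normalized to peak white. These values are invented for illustration only.
stimulus = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
luminance = np.array([0.006, 0.029, 0.071, 0.133, 0.218, 0.325, 0.456, 0.612, 0.793])

# For a pure power-law display, L = s ** gamma, so log(L) = gamma * log(s):
# the slope of a least-squares fit in log-log space is the effective gamma.
gamma = np.polyfit(np.log(stimulus), np.log(luminance), 1)[0]
print(f"estimated display gamma: {gamma:.2f}")   # ~2.2 for these made-up readings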
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 29th April 2019 at 16:12.
tp4tissue is offline   Reply With Quote
Old 29th April 2019, 16:05   #56049  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 406
Quote:
Originally Posted by DragonQ View Post
Any ideas yet how the GTX 1650 compares to the GTX 1050 Ti in madVR performance? Seems to be a poor replacement for gaming. Still waiting on that GT 1030 replacement with 3+ GiB of VRAM but hey ho.
It has fewer CUDA cores than a 1060, and the 1060 is already kind of not enough for 4K HDR.

I'd still recommend a used 1070 for now.

If you're not worried about the current driver hassle, the 2060/2070/2080 are all there to future-proof, as madshi has dropped hints that there may be something useful in that new architecture. That's not certain, though, just hints.

You don't want a 1050 Ti unless you're CERTAIN that you won't be needing 4K HDR.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 29th April 2019 at 16:08.
tp4tissue is offline   Reply With Quote
Old 29th April 2019, 16:54   #56050  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 537
Quote:
Originally Posted by DragonQ View Post
Any ideas yet how the GTX 1650 compares to the GTX 1050 Ti in madVR performance?
Should only be around 15% faster. Maybe 20% if it can clock a bit higher.
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
el Filou is offline   Reply With Quote
Old 29th April 2019, 17:05   #56051  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,906
It should be more in the region of 30-50% faster.
It's not built on the inferior Samsung process, so it should clock ~100 MHz higher.

Just the paper numbers make the card ~37% faster, and Turing has added some juicy goodies which may or may not ever play a role.
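
Rough back-of-the-envelope from the paper specs, where the core counts and reference boost clocks below are quoted from memory (treat them as assumptions; the exact percentage shifts a few points depending on which clocks you plug in):

Code:
# Paper-spec scaling sketch. Core counts / reference boost clocks quoted from memory --
# treat them as assumptions; retail cards clock higher, as noted above.
cards = {
    "GTX 1050 Ti": {"cuda_cores": 768, "boost_mhz": 1392},
    "GTX 1650":    {"cuda_cores": 896, "boost_mhz": 1665},
}

def paper_gflops(card):
    # FP32 throughput: cores * clock (MHz) * 2 ops per cycle (fused multiply-add) / 1000
    return card["cuda_cores"] * card["boost_mhz"] * 2 / 1000.0

old, new = paper_gflops(cards["GTX 1050 Ti"]), paper_gflops(cards["GTX 1650"])
print(f"GTX 1050 Ti ~{old:.0f} GFLOPS, GTX 1650 ~{new:.0f} GFLOPS")
print(f"paper speedup: ~{100 * (new / old - 1):.0f}%")   # ~40% with these clock assumptions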
huhn is offline   Reply With Quote
Old 29th April 2019, 17:08   #56052  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,818
Quote:
Originally Posted by tp4tissue View Post
It isn't that TVs can't produce a certain color; they simply don't produce the right colors, by choice of the maker, influenced by the ignorance of the consumer.
That's why they have several presets. Some cater to people who like a warmer image, some to those who like it colder, and some to those who like it accurate. It's easy enough to read various TV reviews; measuring pre- and post-calibration color is something many of the better reviews do these days (with post-calibration done using the TV's integrated calibration controls, so that already gives you some information about how well those work as well).

Quote:
Originally Posted by tp4tissue View Post
What has not been solved is the end-to-end chain, and the only way to solve it is consumer measurement.

For example, just in this thread, we have red-hue/green-hue oddities caused by simple driver differences. What assurance do we have that a Blu-ray player or some other piece of electronics isn't rendering or outputting something equally anomalous? How would the user know without measuring?

That 20-point gamma adjustment, do it by eye? That is the solution?

Yes, we have a better picture today than in years past, but that last stretch towards the viewer has always been the missing component.

Factory calibration is good, but it has drifted after 1,000 hours and is worthless after 5,000 hours. That could be within 1-2 years.

[...]

Calibration and RE-calibration are critical.
I never said you should not measure. You can measure and then calibrate the colors in the TV's controls as far as it lets you. In fact, even if you use a 3DLUT, you should do that, since the TV controls can often correct in directions that a 3DLUT simply cannot go (i.e. if the gamut is too narrow, a 3DLUT cannot expand it, it can only ever narrow it further, while TV controls can potentially expand it).

With a good TV with good calibration controls, this can get you to levels close to perfect, beyond visible differences. This is how TV calibration has worked for years, without any fancy external boxes. Some people even pay a professional for a one-time calibration (which is often cheaper than buying a colorimeter if you only do it once, never mind needing to know wtf you are doing).
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 29th April 2019 at 17:12.
nevcairiel is offline   Reply With Quote
Old 29th April 2019, 17:20   #56053  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 537
Ah, you're right, it seems to clock at 1905-1920 MHz. Still, 30% faster is doable then, but I'd be very surprised if it were more than that.
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
el Filou is offline   Reply With Quote
Old 29th April 2019, 21:09   #56054  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA
Posts: 79
Quote:
Originally Posted by oldpainlesskodi View Post
However, if I use madLevelsTweaker or NV_RGBFullRangeToggle to force PC Levels, I get heavy banding with exactly the same settings/chain
What's your reason for using this with an Nvidia card/driver that can already do this by itself? It confuses me almost as much as the Quadro modding, which you felt shouldn't be discussed in here. I'm sure a lot of us would like to hear of ways to improve our viewing experience.
__________________
Henry

LG OLED65C7P | VIZIO M70-C3 | Denon AVR-X3500H | Elac Uni-Fi | Elac Debut 2.0 SUB3030
NVIDIA GeForce GTX 960 | LAV Filters | madVR | MPC-HC | Plex | X-Rite i1Display Pro | DisplayCAL | HCFR
VBB is offline   Reply With Quote
Old 29th April 2019, 21:18   #56055  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 537
Nothing; it's snake oil, at least for madVR and all other consumer apps.
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
el Filou is offline   Reply With Quote
Old 29th April 2019, 21:53   #56056  |  Link
oldpainlesskodi
Registered User
 
Join Date: Apr 2017
Posts: 178
Quote:
Originally Posted by VBB View Post
What's your reason for using this with an Nvidia card/driver that can already do this by itself? It confuses me almost as much as the Quadro modding, which you felt shouldn't be discussed in here. I'm sure a lot of us would like to hear of ways to improve our viewing experience.
Nvidia uses limited range with its default settings, but with either tool I mentioned, the default changes to full.

As for the Quadro vs. GeForce debate, well, there are plenty of places out there for that.
__________________
Sapphire RX 5700 XT (19.7.5) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1903), Silverstone LC13B-E, Pioneer SC-LX501 Elite D3, Samsung UE55KS8000, Mission M33i speakers, Kodi Dsplayer 17.6 X64
oldpainlesskodi is offline   Reply With Quote
Old 29th April 2019, 22:03   #56057  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA
Posts: 79
Quote:
Originally Posted by oldpainlesskodi View Post
Nvidia uses limited range with its default settings, but with either tool I mentioned, the default changes to full.
But why not just change it in the control panel?
__________________
Henry

LG OLED65C7P | VIZIO M70-C3 | Denon AVR-X3500H | Elac Uni-Fi | Elac Debut 2.0 SUB3030
NVIDIA GeForce GTX 960 | LAV Filters | madVR | MPC-HC | Plex | X-Rite i1Display Pro | DisplayCAL | HCFR
VBB is offline   Reply With Quote
Old 29th April 2019, 22:08   #56058  |  Link
katodevin
Registered User
 
Join Date: Sep 2007
Posts: 27
Just wanted to report that the new Nvidia driver does not fix the stuttering issues with Turing cards (specifically the 1660), though it may lessen their frequency. I was able to watch 2-3 movies since throwing my 1660 back into my system with the new drivers (drivers cleaned between swaps). When watching GoT (not even 4K) last night, the stutters came back about halfway through. I switched over to the Plex player and finished watching the episode.

Power management is set to adaptive in the control panel.

I'm putting my 1060 back into the system. I'd advise against anyone buying a Turing card with the expectation of using it with madVR/MPC.
katodevin is offline   Reply With Quote
Old 29th April 2019, 22:12   #56059  |  Link
oldpainlesskodi
Registered User
 
Join Date: Apr 2017
Posts: 178
Quote:
Originally Posted by VBB View Post
But why not just change it in the control panel?
With the newer Nvidia drivers, selecting 12-bit does not survive a reboot; it reverts to 8-bit. If, however, you force PC levels (on previous drivers) and leave the rest at default, then using DSPlayer with madVR in FSE it outputs 12-bit at every refresh rate that supports 12-bit (everything below 30 Hz), in full range.

But, like I mentioned, something has changed in the new drivers (WDDM 2.6 perhaps?) that results in very bad banding (Spears_Munsil_Quantazation_Test_2160p becomes a banding mess, for example), even though madVR reports full-range 12-bit with this previously successful workaround.

Hope that makes sense.

Update: I've thrown in the towel on the latest driver and gone back to one of the latest WDDM 2.5 drivers for now.
__________________
Sapphire RX 5700 XT (19.7.5) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1903), Silverstone LC13B-E, Pioneer SC-LX501 Elite D3, Samsung UE55KS8000, Mission M33i speakers, Kodi Dsplayer 17.6 X64

Last edited by oldpainlesskodi; 29th April 2019 at 23:17.
oldpainlesskodi is offline   Reply With Quote
Old 29th April 2019, 22:52   #56060  |  Link
famasfilms
Registered User
 
Join Date: May 2015
Posts: 17
Was this week's Game of Thrones highly pixelated/blocky for anyone else?

Is there a way to fix that?
famasfilms is offline   Reply With Quote