Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 17th May 2015, 22:54   #30141  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,348
Quote:
Originally Posted by 6233638 View Post
Though the 10-bit output was working, I had to reduce this from 16 to 12 on my system to stop the queues being almost empty in the 10-bit DX11 FSE mode.
Only 10-bit was affected, 8-bit DX11 FSE worked just fine at 15. (16)

Windows 8.1 x64, GTX570 (350.12 and 347.88 tested), 32-bit madVR 0.88.8

EDIT: Actually this didn't fix the problem.
After 20-30 seconds of playback, every 5 seconds or so it started switching between full queues and almost-empty queues again, and stuttering while reporting a presentation glitch.
Reducing this even further (8) may have fixed the issue but I'd have to watch something reasonably long to confirm it.

EDIT2: After 30 minutes of playback, the queues remained full and there were no presentation glitches.
That may not solve it on everyone's system, but seems to have fixed the problem for me.
Did you monitor GPU memory usage? The 570 didn't have a whole lot of memory if I remember correctly, and 10-bit surfaces need more memory than 8-bit ones do.
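As a rough sanity check of the memory angle, here is a back-of-envelope sketch. The surface formats are assumptions for illustration only (8-bit RGBA at 4 bytes/pixel, 10-bit data held in 16-bit-per-channel surfaces at 8 bytes/pixel); madVR's real internal formats may differ.

```python
# Back-of-envelope VRAM estimate for a presentation queue of rendered
# frames. bytes_per_pixel is an assumption: 4 for 8-bit RGBA, 8 if
# 10-bit data lands in 16-bit-per-channel surfaces.
def queue_mb(width, height, frames, bytes_per_pixel):
    """VRAM in MiB for a queue of rendered frames."""
    return width * height * bytes_per_pixel * frames / 1024**2

print(queue_mb(1920, 1080, 16, 4))  # 8-bit 1080p queue of 16 frames -> 126.5625
print(queue_mb(1920, 1080, 16, 8))  # 10-bit queue, twice as large -> 253.125
```

Under these assumptions a 16-frame 10-bit queue costs roughly an extra 125 MiB over 8-bit, which matters on a 1280 MB card that is already near its limit.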
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 18th May 2015, 00:04   #30142  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by nevcairiel View Post
Did you monitor GPU memory usage? The 570 didn't have a whole lot of memory if I remember correctly, and 10-bit surfaces need more memory than 8-bit ones do.
I thought it might be that, but it was only using 929/1280MB.
My testing was only using very basic settings too: 1080p displayed 1:1 with Bicubic 75 chroma, so the render times were around 5ms.
It looks like I can maybe get away with presenting 10 frames in advance, and 12 initially looked like it would be OK too, but I might just leave it at 8 since that's working.
I finished a 2-hour movie with it set to 8 and the buffer was always full, with no presentation glitches.

Quote:
Originally Posted by huhn View Post
do you have some screenshots that are not from pixel-art 320x200 games?
I don't off-hand, but can probably get some tomorrow.
It was mainly 720p footage of more modern games that I was testing that with.
NNEDI3 really seems to clean up icons using fine lines or colored text.
Old 18th May 2015, 00:28   #30143  |  Link
webs0r
Registered User
 
Join Date: Jun 2007
Posts: 68
Hello, 2 things I wanted to check from this 0.88 version -

(1) Visual artifacts (little blocks) appearing in BT.601 vids
I found that with the new madVR v0.88 there are visual glitches appearing - like little black squares in videos. It seems to be only BT.601/SMPTE C videos. They disappear and re-appear in different places.

I first noticed it while playing the smallramp.ytp from madtestpatternsource. Also occurs while the video is paused.
It happens regardless of the madvr calibration setting (3dlut or disable).
madVR d3d9 windowed mode (8 bit), all settings set to don't flush (doesn't seem to matter), exclusive/windowed mode doesn't matter, still happens.

(2) MadTPG & Argyll - how do I get madTPG to default to the largest "Image Area"?
When I run dispcal and/or dispread the image area keeps resetting to the smallest size possible, even after I drag the slider. I had a quick look but can't find how to alter this behaviour.

System details:
Win7 x64
nvidia GTX680, driver 350.12. Also occurred on v347.
mpc-hc 1.7.8.162 beta
Argyll 1.7.0

All advice appreciated!!
Old 18th May 2015, 00:33   #30144  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 339
Don't uncheck trade quality for performance defaults, and validation

Quote:
Originally Posted by 6233638 View Post
When I compare 8-bit and 10-bit, there is a clear improvement.
However, with 8-bit using Error Diffusion I was actually seeing that some tests seemed to be smoother in 8-bit rather than 10-bit.
After having caught up on this topic now (a lot has happened in the last two weeks!) it would seem that what I'm probably seeing is the difference between Error Diffusion and Ordered Dither.
So although you may expect there to be no visible differences, in some specific tests - and perhaps only with displays of a certain type, over a certain size, or contrast ratio - it seems as though the type of dither in use may still matter.
What are you basing your opinion on: your eyes, or something else?
What is the display you are using to make that conclusion? (Screen size, DPI, resolution, etc.)
I would vote against it being in the trade quality for performance screen (which is already packed) unless others can validate your findings. Too many people seem to jump into that screen and, not knowing any better, uncheck everything in there thinking it will give them a better picture (and that their system can handle it), and that is rarely ever true. Then they come here complaining about dropped frames and high render stats (among other things).
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Old 18th May 2015, 00:37   #30145  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by webs0r View Post
Hello, 2 things I wanted to check from this 0.88 version -

(1) Visual artifacts (little blocks) appearing in BT.601 vids
I found that with the new madVR v0.88 there are visual glitches appearing - like little black squares in videos. It seems to be only BT.601/SMPTE C videos. They disappear and re-appear in different places.

I first noticed it while playing the smallramp.ytp from madtestpatternsource. Also occurs while the video is paused.
It happens regardless of the madvr calibration setting (3dlut or disable).
madVR d3d9 windowed mode (8 bit), all settings set to don't flush (doesn't seem to matter), exclusive/windowed mode doesn't matter, still happens.

(2) MadTPG & Argyll - how do I get madTPG to default to the largest "Image Area"?
When I run dispcal and/or dispread the image area keeps resetting to the smallest size possible, even after I drag the slider. I had a quick look but can't find how to alter this behaviour.

System details:
Win7 x64
nvidia GTX680, driver 350.12. Also occurred on v347.
mpc-hc 1.7.8.162 beta
Argyll 1.7.0

All advice appreciated!!
there is a bug in the current nvidia driver that produces issues if you use NNEDI3. it sounds like you have encountered this bug.

you can work around it by installing the new nvidia driver and using madVR x64 before x86. it sounds strange, but this works.
Old 18th May 2015, 01:29   #30146  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by Anime Viewer View Post
What are you basing your opinion on: your eyes, or something else?
What is the display you are using to make that conclusion? (Screen size, DPI, resolution, etc.)
I would vote against it being in the trade quality for performance screen (which is already packed) unless others can validate your findings. Too many people seem to jump into that screen and, not knowing any better, uncheck everything in there thinking it will give them a better picture (and that their system can handle it), and that is rarely ever true. Then they come here complaining about dropped frames and high render stats (among other things).
As I said: without the option to use error diffusion with the 10-bit output, or a shortcut to change the type of dither being used without exiting FSE, it's very difficult to be entirely certain.

But while 10-bit is mostly a big improvement over 8-bit on my TV (undithered, there are clearly four times as many steps), there were some tests that I set up which appeared to be slightly better with 8-bit.
Since 10-bit generally looked best in the majority of my tests, I suspect that the difference in dither being used (ED when outputting 8-bit, OD when outputting 10-bit) could be the cause.

Of course it may not be. But if I select Error Diffusion, I would like madVR to use Error Diffusion.
It is easy enough to select another algorithm, or to create a profile which switches to something else when entering FSE mode.

But for me, it is FSE mode (10-bit) where I would be most likely to use Error Diffusion.
With an 8-bit output, Error Diffusion can actually be too smooth, and some banding may remain simply due to the limited number of steps.
Random Dither does a better job hiding banding at the cost of some additional noise with an 8-bit output.
With a 10-bit output that should not be necessary.

If performance is bad, I would say that the user needs to change their settings, not that the settings should lie about what they do.
Perhaps the default should change, or there could be presets that users could select, but I don't like the options saying that one thing is selected while doing something else.
Nor do I like removing options which have a high performance cost, because someone may actually use them.
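For anyone unfamiliar with the two dither families being compared here, a minimal 1-D sketch of each, quantizing a smooth 0..1 gradient down to a handful of levels. This is purely illustrative; madVR's actual ordered-dither pattern and error-diffusion kernel are far more sophisticated.

```python
import math

def ordered_dither(xs, levels):
    # Fixed repeating threshold pattern (a 1-D stand-in for a Bayer
    # matrix): each pixel gets a deterministic offset before truncation.
    pattern = [0.0, 0.5, 0.25, 0.75]
    return [math.floor(x * (levels - 1) + pattern[i % len(pattern)]) / (levels - 1)
            for i, x in enumerate(xs)]

def error_diffusion(xs, levels):
    # 1-D error diffusion: the quantization error of each pixel is
    # carried into the next, so the local average tracks the source.
    out, err = [], 0.0
    for x in xs:
        v = round((x + err) * (levels - 1)) / (levels - 1)
        err = x + err - v
        out.append(min(max(v, 0.0), 1.0))
    return out

gradient = [i / 99 for i in range(100)]
print(ordered_dither(gradient, 5)[:8])
print(error_diffusion(gradient, 5)[:8])
```

The key difference the sketch shows: error diffusion keeps the running average exact at the cost of correlated "worm" patterns in 2-D, while ordered dither trades that for a fixed, predictable texture.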
Old 18th May 2015, 04:41   #30147  |  Link
XMonarchY
Guest
 
Posts: n/a
Is there any reason to still include the madLevels Tweaker, now that nVidia has a fully functional setting that provides full range 0-255 over HDMI in videos and games?

Another question I had: does MPC-HC override madVR's refresh rate settings? For example, my TV supports 23Hz and 24Hz modes, but they are not exactly 23.976Hz because of my video card. If I create a custom 23.976Hz mode in MPC-HC x64 1.7.8 build 191, will that 23.976Hz apply, or will madVR just use my TV's default 23Hz mode? I also tried to make a 23.976Hz custom resolution in the nVidia CP, but it creates a 24.000Hz resolution instead. How can I force it to use exactly 23.976Hz?
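For context on why the exact rate matters, a quick bit of arithmetic (the 23.972 Hz figure below is an illustrative example, not a measured value):

```python
# The "23 Hz" TV mode is meant to be the NTSC film rate, which is
# exactly 24000/1001 Hz.
source_hz = 24000 / 1001
print(round(source_hz, 6))  # -> 23.976024

# If the GPU actually outputs a slightly different rate, the renderer
# must repeat (or drop) a frame each time the accumulated drift reaches
# one frame duration:
def seconds_per_glitch(display_hz, src_hz=24000 / 1001):
    return 1.0 / abs(display_hz - src_hz)

print(seconds_per_glitch(23.972))  # one repeated frame roughly every 4 minutes
```

So even a few millihertz of mismatch between the display mode and the 24000/1001 source rate produces a visible repeated frame every few minutes.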

Last edited by XMonarchY; 18th May 2015 at 05:02.
Old 18th May 2015, 06:34   #30148  |  Link
nijiko
Hi-Fi Fans
 
Join Date: Dec 2008
Posts: 222
Quote:
Originally Posted by huhn View Post
then play it safe and stick to 8 bit + dithering. but 10 bit should work fine too.
Oh, I see. Thank you!
Old 18th May 2015, 07:41   #30149  |  Link
Scyna
Registered User
 
Join Date: Jul 2013
Posts: 33
Not sure if it's just me, but when I play something on my TV (1080p through HDMI) I can't fill the render queues. When I use my 1080p monitor through DVI it works fine. This bug only happens with the DX11 renderer. This is on an AMD 290.
Old 18th May 2015, 13:08   #30150  |  Link
webs0r
Registered User
 
Join Date: Jun 2007
Posts: 68
Quote:
Originally Posted by huhn View Post
there is a bug in the current nvidia driver that produce issue if you use nnedi3. sound like you have encounter this bug.

you can work around that bug by installing new nvidia driver and using madVR x64 before x86 sounds strange but this works.
Ah thanks huhn... This worked a treat.

Also I figured this out:
Quote:
(2) MadTPG & Argyll - how do I get madTPG to default to the largest "Image Area"?
When I run dispcal and/or dispread the image area keeps resetting to the smallest size possible, even after I drag the slider. I had a quick look but can't find how to alter this behaviour.
I previously had tried the -P switch (for dispcal) but didn't use big enough values and didn't notice the change!
Use -P0.5,0.5,10,10 (10 means 100%, 1 means 10%). This fills the image area fully just in case anyone was wondering.
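For reference, the full command shape looks something like the following; the display number and output name are hypothetical examples, and the "10 = 100%" scaling is as described in the post above (check `dispcal -?` in your ArgyllCMS version for the exact `-P` syntax).

```shell
# Center madTPG's test window and scale it to fill the image area:
# -P h,v,hsize,vsize with position 0.5,0.5 (center) and size 10,10
# (10 corresponding to 100% of the image area here, 1 to 10%).
dispcal -d2 -P0.5,0.5,10,10 -v MyDisplay
```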
Old 18th May 2015, 13:10   #30151  |  Link
Skwelcha
<(ovO)>
 
Join Date: Jun 2011
Location: Bremen, Germany
Posts: 50
nvm: a restart fixed it, apparently.
__________________
MadVR 0.92.9 x64, MPC-HC, LAV 0.70.2, GV-N1060G1 GAMING-6GD @ 385.41, LG OLED65B7D, Denon AVR-1912, Windows 10 Pro 1709 x64, i7-6700k

Last edited by Skwelcha; 18th May 2015 at 13:34.
Old 18th May 2015, 13:40   #30152  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by Scyna View Post
Not sure if it's just me, but when I play something on my TV (1080p through HDMI) I can't fill the render queues. When I use my 1080p monitor through DVI it works fine. This bug only happens with the DX11 renderer. This is on an AMD 290.
Is Smooth Motion running in a different state (on or off) on one compared to the other? How about refresh rate: is one reporting a different screen Hz than the other while running the video? Does it have the same problem when running in both the video's normal resolution and in full screen?
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Old 18th May 2015, 14:32   #30153  |  Link
6ari8
Registered User
 
Join Date: Jun 2012
Posts: 43
Anyone else getting a black screen in "D3D11 exclusive (10 bit)" mode in Windows 10TP with the new 352.84 Nvidia drivers? It's fine when using 8 bit but 10 bit gives a black screen.

I was using the Windows 8.1 350.12 drivers before this and had no issues...
Old 18th May 2015, 14:55   #30154  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by 6ari8 View Post
Anyone else getting a black screen in "D3D11 exclusive (10 bit)" mode in Windows 10TP with the new 352.84 Nvidia drivers? It's fine when using 8 bit but 10 bit gives a black screen.

I was using the Windows 8.1 350.12 drivers before this and had no issues...
i get a picture with 10 bit d3d11, using a 760 and 352.84 on windows 10 build 10074.
8 bit crashed my system after exit.
Old 18th May 2015, 14:57   #30155  |  Link
avinab
Registered User
 
Join Date: May 2015
Posts: 14
Not able to change to NNEDI3

I am using the latest madVR version. I am on Windows 10, with an nvidia GTX 980 and the latest (Windows 10) driver, version 352.84.
I am not able to switch to NNEDI3 in chroma upscaling or use image doubling; it crashes. Can anybody confirm or look into it?
I am using PotPlayer 64-bit.

Last edited by avinab; 18th May 2015 at 15:28. Reason: Changes required
Old 18th May 2015, 15:05   #30156  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by GTPVHD View Post
http://www.geforce.com/whats-new/art...river-released
New driver for Win7 SP1/Win 8.1 users.
Huh, these drivers now let you select between an 8-bit and a 12-bit output as well.

I wonder - does that mean that madVR's 10-bit output will be converted to 12-bit?
My display does not report the bit-depth of the incoming signal.

Last edited by 6233638; 18th May 2015 at 15:14.
Old 18th May 2015, 15:06   #30157  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by avinab View Post
I am using the latest madVR version. I am on Windows 10, with an nvidia GTX 980 and the latest (Windows 10) driver.
I am not able to switch to NNEDI3 in chroma upscaling or use image doubling; it crashes. Can anybody confirm or look into it?
works fine here. which driver version?
Old 18th May 2015, 15:09   #30158  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by 6233638 View Post
Huh, these drivers now let you select between an 8-bit and a 12-bit output as well.

I wonder - does that mean that madVR's 10-bit output will be upsampled to 12-bit?
My display does not report the bit-depth of the incoming signal.
it should just add some zeros, but yes. for some reason they "skipped" 10 bit. not sure if they use the highest possible, or can't output 10 bit for a screen that supports only 10 bit (DP 1.2 UHD 60 Hz, for example).
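Widening 10-bit code values to 12 bit by padding with zeros can be sketched as below; the bit-replication variant is a common alternative, not necessarily what the driver does.

```python
# Two common ways to widen 10-bit code values (0..1023) to 12 bit
# (illustrative sketch, not nvidia's actual conversion).
def widen_zero_pad(v10):
    # plain zero-padding: full white 1023 -> 4092, short of 12-bit 4095
    return v10 << 2

def widen_bit_replicate(v10):
    # replicating the top bits maps the endpoints exactly:
    # 0 -> 0 and 1023 -> 4095
    return (v10 << 2) | (v10 >> 8)

print(widen_zero_pad(1023), widen_bit_replicate(1023))  # -> 4092 4095
```

Either way the relative spacing of the 10-bit levels is preserved, which is why a 12-bit link carrying 10-bit content loses nothing.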
Old 18th May 2015, 15:18   #30159  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,348
Quote:
Originally Posted by huhn View Post
it should just add some zeros, but yes. for some reason they "skipped" 10 bit. not sure if they use the highest possible, or can't output 10 bit for a screen that supports only 10 bit (DP 1.2 UHD 60 Hz, for example).
It offers 10-bit just fine as an option over DP for me. Didn't try HDMI yet.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 18th May 2015, 16:10   #30160  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Windows 7 x64 here.
I have 8 and 10 bit over DisplayPort with my screen, no 12.
When switching to 10bit, my monitor actually goes into 10bit mode, so it really works.
madVR FSE D3D11 10bit switches instantly, so that's another sign that all is well.

I have tested and observed the following:
In 10bit mode, "Adjust Desktop Color Settings" in the nvidia control panel now actually works in 10bit, so moving the sliders generates no banding. Hurray!
Operating system Color Management (2DLUT) is still in 8bit because DWM (Desktop Window Manager) is also still in 8bit, so gradation and banding are generated when I switch on my ICC profile.
Windowed mode in madVR also still appears to be in 8bit even if it is set to 10bit in madVR; this is again because it has to go through DWM.
Overlay mode is also in 8bit.
In short, EVERYTHING (the whole OS) is still in 8bit, although the output is in 10bit.

I guess a high bit depth DWM is a privilege of Windows 10 only, and all the older DWM versions of Windows (7, 8, 8.1) are 8bit only.
Or is it? Does anyone care to try windowed mode in 10bit with madVR set to 10bit and report whether the gradients are visible?

EDIT:
I may be wrong; maybe madshi will be able to request 10bit in windowed mode if the desktop is already in 10bit...
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 18th May 2015 at 18:10.
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling

