Old 11th March 2011, 23:39   #5941  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Mark_A_W
Anyway, I've had terrible trouble getting madVR to play even remotely smoothly - it's just been a jumpy mess, with the dropped and delayed frame counts jumping by 20 or more every few seconds.

It turns out it is the "upload frames in render thread" option. With that disabled I get rock-solid results.
Interesting. On my integrated NVidia GPU, playback seems to run better with the option enabled.

Quote:
Originally Posted by soulkeeper
In the settings page there is no setting for chroma downscaling (for luma downscaling I've selected spline 4 taps)...
What is the chroma downscaling method?
The chroma algorithm you choose in the settings dialog is applied to both upsampling and downsampling.

Quote:
Originally Posted by soulkeeper
Also, is it normal for my fps to take 15-20 minutes to stabilize at 60fps (actually 60.02fps as per the MPC-HC statistics)... shouldn't that be exactly 60fps?
Which exact "fps" statistic are you talking about?
Old 12th March 2011, 23:31   #5942  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
For future madVR development I need YOUR HELP.

http://madshi.net/madNV12Test.zip

This is a little test program I've written. It runs a few texture/surface related tests on your GPU and writes the results to a text file called "madNV12.txt" on your desktop. If you want to help out with future madVR development, please run this tool on your PC and send the "madNV12.txt" file to me via PM or email. Please don't post the results in this thread; otherwise there would be too many posts here. I need test results from as many OS/GPU combinations as possible.

Is there anybody with an Intel GPU who can run this, too? I'm asking specifically because I know that most madVR users are on NVidia and ATI. Of course I also need test results from as many different NVidia and ATI GPUs as possible, both integrated and dedicated.

Thanks for your help!

Just in case you're wondering, here are the test results from my HTPC:

Code:
Windows 7 x64 Service Pack 1
NVIDIA GeForce 9400

D3D9 Surface StretchRect:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (16-235)
D3DFMT_X8R8G8B8: lossy (16-235)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: creating GPU texture failed
D3DFMT_A2R10G10B10: luma lossless (16-236), chroma lossy
D3DFMT_A2B10G10R10: luma lossless (16-236), chroma lossy
D3DFMT_A16B16G16R16: StretchRect failed
D3DFMT_A16B16G16R16F: luma lossless (16-235), chroma lossy
D3DFMT_A32B32G32R32F: StretchRect failed

D3D9 Surface VideoProcessor:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (16-235)
D3DFMT_X8R8G8B8: lossy (16-235)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: creating GPU texture failed
D3DFMT_A2R10G10B10: luma lossless (16-236), chroma lossy
D3DFMT_A2B10G10R10: luma lossless (16-236), chroma lossy
D3DFMT_A16B16G16R16: luma lossless (16-235), chroma lossy
D3DFMT_A16B16G16R16F: luma lossless (16-235), chroma lossy
D3DFMT_A32B32G32R32F: luma lossless (16-235), chroma lossy

DXVA Surface StretchRect:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (16-190)
D3DFMT_X8R8G8B8: lossy (16-190)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: creating GPU texture failed
D3DFMT_A2R10G10B10: lossy (16-191)
D3DFMT_A2B10G10R10: lossy (16-191)
D3DFMT_A16B16G16R16: StretchRect failed
D3DFMT_A16B16G16R16F: lossy (16-178)
D3DFMT_A32B32G32R32F: StretchRect failed

DXVA Surface VideoProcessor:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (16-235)
D3DFMT_X8R8G8B8: lossy (16-235)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: creating GPU texture failed
D3DFMT_A2R10G10B10: luma lossless (16-236), chroma lossy
D3DFMT_A2B10G10R10: luma lossless (16-236), chroma lossy
D3DFMT_A16B16G16R16: luma lossless (16-235), chroma lossy
D3DFMT_A16B16G16R16F: luma lossless (16-235), chroma lossy
D3DFMT_A32B32G32R32F: luma lossless (16-235), chroma lossy

D3D9 Surface speed test:
NV12: upload 458 fps, download 27 fps, trick download failed
YV12: upload 278 fps, download 26 fps, trick download failed
A8R8G8B8: upload 164 fps, download 96 fps, trick download failed

DXVA Surface speed test:
NV12: upload 188 fps, download 252 fps, trick download failed
YV12: upload 278 fps, download 26 fps, trick download failed
A8R8G8B8: upload 160 fps, download 12 fps, trick download failed

A8R8G8B8 Texture speed test:
default: upload 81 fps, download 101 fps
dynamic: upload 112 fps, download 29 fps, trick download 74 fps
As you can see, it's just a bunch of tests. In the first sections I'm testing which methods the GPU/driver supports for converting NV12 surfaces to RGB textures and whether the conversion is "lossless" (meaning whether the conversion is fully reversible). In the last part of the tests I'm doing some speed tests to find out how fast GPU textures and GPU surfaces can be read from and written to. The test results will help me choose the right algorithms for future development.
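
Roughly speaking, the StretchRect conversion test boils down to something like the following sketch (a simplified illustration, not the tool's actual source; error handling and cleanup are omitted). It fills an NV12 surface with a 16-235 luma ramp and neutral chroma, lets the driver convert it to RGB via StretchRect, then reads back which range the gray ramp landed in:

Code:
// Simplified sketch, not madNV12Test's real code: check whether the driver's
// NV12 -> RGB StretchRect conversion leaves a 16-235 luma ramp untouched.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3d9.lib")

int main() {
    const UINT W = 256, H = 16;
    const D3DFORMAT FMT_NV12 = (D3DFORMAT)MAKEFOURCC('N','V','1','2');
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    // NV12 source: luma plane holds a 16..235 ramp, chroma plane is neutral gray.
    IDirect3DSurface9* nv12 = NULL;
    dev->CreateOffscreenPlainSurface(W, H, FMT_NV12, D3DPOOL_DEFAULT, &nv12, NULL);
    D3DLOCKED_RECT lr;
    nv12->LockRect(&lr, NULL, 0);
    BYTE* p = (BYTE*)lr.pBits;
    for (UINT y = 0; y < H; y++)
        for (UINT x = 0; x < W; x++)
            p[y * lr.Pitch + x] = (BYTE)(16 + x * 219 / (W - 1));
    memset(p + H * lr.Pitch, 128, (H / 2) * lr.Pitch);   // interleaved CbCr plane
    nv12->UnlockRect();

    // RGB render target; the driver does the NV12 -> RGB conversion in StretchRect.
    IDirect3DTexture9* tex = NULL;
    dev->CreateTexture(W, H, 1, D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8,
                       D3DPOOL_DEFAULT, &tex, NULL);
    IDirect3DSurface9* rt = NULL;
    tex->GetSurfaceLevel(0, &rt);
    dev->StretchRect(nv12, NULL, rt, NULL, D3DTEXF_POINT);  // may fail on some GPUs

    // Read back and report which range the gray ramp came out in.
    IDirect3DSurface9* sys = NULL;
    dev->CreateOffscreenPlainSurface(W, H, D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &sys, NULL);
    dev->GetRenderTargetData(rt, sys);
    sys->LockRect(&lr, NULL, D3DLOCK_READONLY);
    const BYTE* q = (const BYTE*)lr.pBits;
    BYTE lo = 255, hi = 0;
    for (UINT x = 0; x < W; x++) {                       // green channel of row 0
        BYTE g = q[x * 4 + 1];
        if (g < lo) lo = g;
        if (g > hi) hi = g;
    }
    sys->UnlockRect();
    printf("(%d-%d)\n", lo, hi);  // (16-235) = untouched, (0-255) = driver stretched
    return 0;
}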

For comparison, here are the results from my development PC:

Code:
Windows XP Service Pack 3
AMD Radeon HD 6800 Series

D3D9 Surface StretchRect:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (0-255)
D3DFMT_X8R8G8B8: lossy (0-255)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: lossy (0-255)
D3DFMT_A2R10G10B10: lossless (0-255)
D3DFMT_A2B10G10R10: lossless (0-255)
D3DFMT_A16B16G16R16: lossless (0-255)
D3DFMT_A16B16G16R16F: lossless (0-255)
D3DFMT_A32B32G32R32F: lossless (0-255)

D3D9 Surface VideoProcessor:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (0-0)
D3DFMT_X8R8G8B8: lossy (0-0)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: VideoProcessBlt failed
D3DFMT_A2R10G10B10: VideoProcessBlt failed
D3DFMT_A2B10G10R10: VideoProcessBlt failed
D3DFMT_A16B16G16R16: VideoProcessBlt failed
D3DFMT_A16B16G16R16F: VideoProcessBlt failed
D3DFMT_A32B32G32R32F: VideoProcessBlt failed

DXVA Surface StretchRect:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (0-255)
D3DFMT_X8R8G8B8: lossy (0-255)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: lossy (0-255)
D3DFMT_A2R10G10B10: lossless (0-255)
D3DFMT_A2B10G10R10: lossless (0-255)
D3DFMT_A16B16G16R16: lossless (0-255)
D3DFMT_A16B16G16R16F: lossless (0-255)
D3DFMT_A32B32G32R32F: lossless (0-255)

DXVA Surface VideoProcessor:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (0-0)
D3DFMT_X8R8G8B8: lossy (0-0)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: VideoProcessBlt failed
D3DFMT_A2R10G10B10: VideoProcessBlt failed
D3DFMT_A2B10G10R10: VideoProcessBlt failed
D3DFMT_A16B16G16R16: VideoProcessBlt failed
D3DFMT_A16B16G16R16F: VideoProcessBlt failed
D3DFMT_A32B32G32R32F: VideoProcessBlt failed

D3D9 Surface speed test:
NV12: upload 181 fps, download 4 fps, trick download failed
YV12: upload 184 fps, download 4 fps, trick download failed
A8R8G8B8: upload 105 fps, download 2 fps, trick download failed

DXVA Surface speed test:
NV12: upload 178 fps, download 4 fps, trick download failed
YV12: upload 181 fps, download 4 fps, trick download failed
A8R8G8B8: upload 107 fps, download 2 fps, trick download failed

A8R8G8B8 Texture speed test:
default: upload 58 fps, download 174 fps
dynamic: upload 139 fps, download 2 fps, trick download 169 fps
As you can see, the test results vary quite a bit. I'm not sure how much of the difference is GPU related and how much is OS related.
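
For reference, an upload "fps" number like the ones above boils down to a timed lock/copy/unlock loop, roughly like this sketch (again simplified and illustrative only, not the tool's actual code):

Code:
// Rough sketch of a surface upload speed test: lock the surface, copy a
// frame row by row, unlock, and count how often that fits into one second.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#include <cstring>
#include <vector>
#pragma comment(lib, "d3d9.lib")

int main() {
    const UINT W = 1920, H = 1080;                       // a typical video frame
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    IDirect3DSurface9* surf = NULL;
    dev->CreateOffscreenPlainSurface(W, H, D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &surf, NULL);

    std::vector<BYTE> frame(W * H * 4, 0x80);            // dummy source frame in RAM
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);
    int frames = 0;
    do {
        D3DLOCKED_RECT lr;
        surf->LockRect(&lr, NULL, 0);
        for (UINT y = 0; y < H; y++)                     // row-by-row copy honours the pitch
            memcpy((BYTE*)lr.pBits + y * lr.Pitch, &frame[y * W * 4], W * 4);
        surf->UnlockRect();
        frames++;
        QueryPerformanceCounter(&t1);
    } while (t1.QuadPart - t0.QuadPart < freq.QuadPart); // run for about one second
    printf("A8R8G8B8 upload: %d fps\n",
           (int)(frames * freq.QuadPart / (t1.QuadPart - t0.QuadPart)));
    return 0;
}
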
Old 12th March 2011, 23:42   #5943  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
I have sent the results for my GTX 295 via PM.
Old 12th March 2011, 23:47   #5944  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Thanks in advance to everyone who helps out. Please understand if I don't reply to everyone... And please don't "announce" your PM/email in this thread; it's not necessary, as I regularly check my PMs and emails.
Old 12th March 2011, 23:58   #5945  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
I've noticed something in a report I just received. The first section reads like this:

Code:
Windows 7 x64 Service Pack 1
NVIDIA GeForce GTX 260

D3D9 Surface StretchRect:
D3DFMT_R8G8B8: creating GPU texture failed
D3DFMT_A8R8G8B8: lossy (0-255)
D3DFMT_X8R8G8B8: lossy (0-255)
D3DFMT_A8B8G8R8: creating GPU texture failed
D3DFMT_X8B8G8R8: creating GPU texture failed
D3DFMT_A2R10G10B10: lossy (0-255)
D3DFMT_A2B10G10R10: lossy (0-255)
[...]
If you have an NVidia GPU and the test results in the "D3D9 Surface StretchRect" section say "(0-255)", that means your NVidia GPU outputs video levels, forcefully stretching all data behind madVR's back. This can result in banding artifacts. The fix for this problem is to create custom resolutions. You can double-check whether the fix works by running madNV12Test again: if the "D3D9 Surface StretchRect" section now reads "(16-235)", you're getting proper output levels for best madVR quality.

Please note: This only applies to NVidia!

Edit: It seems that the "Adjust video color settings - Advanced - Dynamic Range" setting also affects these results. You should probably set it to "Video Player Settings" if you want to find out whether you get proper output levels with madVR. My advice above is probably only valid in that case.

Last edited by madshi; 13th March 2011 at 00:53.
Old 13th March 2011, 09:02   #5946  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
OK, time for some first conclusions.

upload speeds:
ATI: good (win7); acceptable (XP)
NVidia: good
Intel: acceptable

texture download speeds:
ATI: good (win7); acceptable (XP)
NVidia: good
Intel: acceptable

surface download speeds:
ATI: catastrophic
NVidia: fantastic (newer hardware/drivers); catastrophic (XP with older hardware/drivers)
Intel: catastrophic

reliability of NV12 -> RGB conversion:
ATI: good (pretty much always the same)
NVidia: bad (influenced by various video options, sometimes weird results)
Intel: good (pretty much always the same)

lossless reversibility of NV12 -> RGB conversion (a toy illustration follows below):
ATI: good
NVidia: bad (no way to stop driver from doing chroma upsampling, sometimes weird results for luma)
Intel: very bad (even luma is never reversible)
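
To illustrate what "lossless reversibility" means here, a toy round trip (illustrative only, not code from the test tool): convert YCbCr triples to 8-bit RGB with the usual BT.601 coefficients and back again; any triple that doesn't survive was damaged by rounding or gamut clipping in the 8-bit RGB intermediate.

Code:
// Toy round trip: BT.601 YCbCr -> 8-bit RGB -> YCbCr, counting triples
// that don't survive. Out-of-gamut triples get clipped and are lost.
#include <algorithm>
#include <cmath>
#include <cstdio>

static int clip8(double v) { return std::min(255, std::max(0, (int)std::lround(v))); }

int main() {
    long long tested = 0, lossy = 0;
    for (int y = 16; y <= 235; y++)
        for (int cb = 16; cb <= 240; cb += 4)            // coarse chroma grid, for speed
            for (int cr = 16; cr <= 240; cr += 4) {
                int r = clip8(1.164 * (y - 16) + 1.596 * (cr - 128));
                int g = clip8(1.164 * (y - 16) - 0.392 * (cb - 128) - 0.813 * (cr - 128));
                int b = clip8(1.164 * (y - 16) + 2.017 * (cb - 128));
                int y2  = clip8( 0.257 * r + 0.504 * g + 0.098 * b + 16);
                int cb2 = clip8(-0.148 * r - 0.291 * g + 0.439 * b + 128);
                int cr2 = clip8( 0.439 * r - 0.368 * g - 0.071 * b + 128);
                tested++;
                if (y2 != y || cb2 != cb || cr2 != cr) lossy++;
            }
    printf("%lld of %lld sampled triples don't survive the round trip\n", lossy, tested);
    return 0;
}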

-------

It seems to me that ATI users might benefit from moving to Win7. For NVidia users XP seems to do just fine (in these tests, at least), as long as you don't combine old hardware with old drivers. Intel is really bad, even on Win7.

P.S.: Just to clarify: most of these test results don't affect the current madVR version.

Last edited by madshi; 13th March 2011 at 09:11.
Old 13th March 2011, 09:09   #5947  |  Link
namaiki
Registered User
 
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
madshi, did you get any results from users with Intel graphics on Sandy Bridge? I think it's called Intel HD Graphics 3000.
Old 13th March 2011, 09:15   #5948  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
I don't know for sure. I have two reports with "Intel(R) HD Graphics", one with "Mobile Intel(R) 4 Series Express Chipset Family" and one with "Mobile Intel(R) 965 Express Chipset Family".
Old 13th March 2011, 09:30   #5949  |  Link
namaiki
Registered User
 
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
Darn. Looking at the driver INF, it seems that the latest-generation Intel Sandy Bridge graphics will come up as "Intel(R) HD Graphics Family".
Old 13th March 2011, 10:10   #5950  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
I noticed that the report created by your program does not include the graphics card driver version. It might be good to have this reported, at least in the case of NVidia, given the issue you have already found.
Old 13th March 2011, 10:26   #5951  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Yeah, maybe. I don't have the time (or the motivation) to add that right now, though.

FWIW, I have enough results from ATI users now, but some more NVidia and Intel results would be nice.
Old 13th March 2011, 10:34   #5952  |  Link
renq
Registered User
 
Join Date: Aug 2009
Posts: 51
GTX460 results sent.
FW 267.05, in case it matters.

Last edited by renq; 13th March 2011 at 10:38.
Old 13th March 2011, 10:48   #5953  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by madshi
Yeah, maybe. I don't have the time (or the motivation) to add that right now, though.

FWIW, I have enough results from ATI users now, but some more NVidia and Intel results would be nice.
I also have an issue because I have two GPUs: trying to run the test program on the secondary GPU is a pain, given that its display is in a different room from the PC. I managed to do it, but it was a real headache. If you do update the tool to add driver details, adding a switch to choose the monitor to run the test on could be really helpful.

This result suffers from the color problem you mention, although I already have the suggested fixes in place. I want to point out that this GPU outputs via HDMI. Will send the result now.

Last edited by Razoola; 13th March 2011 at 11:04.
Old 13th March 2011, 10:52   #5954  |  Link
DottorLeo
Registered User
 
Join Date: Jan 2010
Location: Treviso (Italy)
Posts: 15
ATI 5770 results sent!
Old 13th March 2011, 11:44   #5955  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by madshi
If you have an NVidia GPU and the test results in the "D3D9 Surface StretchRect" section say "(0-255)", that means your NVidia GPU outputs video levels, forcefully stretching all data behind madVR's back. This can result in banding artifacts. The fix for this problem is to create custom resolutions. You can double-check whether the fix works by running madNV12Test again: if the "D3D9 Surface StretchRect" section now reads "(16-235)", you're getting proper output levels for best madVR quality.
I have custom resolutions on my HTPC, yet it still says 0-255 in the output of your tool. I have not seen any evidence of banding, however. You wouldn't happen to have an easy-to-use sample file? :d
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 13th March 2011, 11:52   #5956  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by nevcairiel
I have custom resolutions on my HTPC, yet it still says 0-255 in the output of your tool. I have not seen any evidence of banding, however. You wouldn't happen to have an easy-to-use sample file? :d
I previously had the Dynamic Range configured to 0-255, because my TV was supposed to get PC-level output. If I set it to "Video Player controlled", I get the proper values in your tool.

I suppose that's how it's meant to be configured to let madVR expand it to PC levels, right?
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 13th March 2011, 11:52   #5957  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by nevcairiel
I have custom resolutions on my HTPC, yet it still says 0-255 in the output of your tool. I have not seen any evidence of banding, however. You wouldn't happen to have an easy-to-use sample file? :d
I have been playing with this too since madshi brought it up, and I have not seen any banding either. The issue I have, however, is that if I change the NVidia HDMI setting to output desktop levels (it was set to "full screen videos"), I get a washed-out picture on my plasma. I think this setting is only available when using HDMI out from the graphics card; it is found at the bottom of the 'adjust desktop color settings' page in the NVidia control panel.

Having this set to RGB and full screen video gives me the best full-contrast picture.
Old 13th March 2011, 11:53   #5958  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by nevcairiel
I have custom resolutions on my HTPC, yet it still says 0-255 in the output of your tool. I have not seen any evidence of banding, however.
Make sure you set "Adjust video color settings - Advanced - Dynamic Range" to "Video Player Settings" in your NVidia control panel. Otherwise the "0-255" vs. "16-235" output of my tool doesn't say much about whether you'll get banding or not.

Quote:
Originally Posted by nevcairiel
You wouldn't happen to have an easy-to-use sample file? :d
Sure! Scale the "smallramp.ytp" test pattern (from the madTestPatternSource folder) up to full screen resolution. That should produce a very smooth grayscale.
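
If you don't have the madVR test patterns at hand, a stand-in gray ramp is easy to generate (this is just a suggested substitute, unrelated to the actual smallramp.ytp file): write one raw NV12 frame with a horizontal 16-235 luma ramp and neutral chroma, then feed it to any tool that accepts raw NV12 input (ffmpeg's rawvideo demuxer, for instance).

Code:
// Stand-in gray-ramp pattern, not smallramp.ytp itself: writes a single raw
// NV12 frame. Upscaled to full screen, a clean chain shows a smooth gradient;
// undithered level conversions show bands.
#include <cstdio>
#include <cstdint>
#include <vector>

int main() {
    const int W = 1920, H = 1080;
    std::vector<uint8_t> luma(W * H);
    std::vector<uint8_t> chroma(W * H / 2, 128);         // interleaved CbCr, neutral gray
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            luma[y * W + x] = (uint8_t)(16 + x * 219 / (W - 1));
    FILE* f = fopen("ramp_1920x1080.nv12", "wb");
    fwrite(luma.data(), 1, luma.size(), f);              // Y plane
    fwrite(chroma.data(), 1, chroma.size(), f);          // CbCr plane, half height
    fclose(f);
    return 0;
}
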
Old 13th March 2011, 11:55   #5959  |  Link
namaiki
Registered User
 
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
Banding may be visible in EVR and any of the other system renderers (VMR and Overlay) if you set your graphics card to expand from TV to PC levels without dithering. (The nVidia 9600M GT and Intel 4500MHD that I tested do not dither. Not saying you didn't already know this.)
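
The arithmetic behind that banding is easy to check (a quick illustration, nothing from the test tool): an undithered 16-235 to 0-255 expansion spreads 220 input codes over 256 output codes, so some output codes can never occur and the step sizes become uneven.

Code:
// How many 8-bit output codes does an undithered TV-to-PC levels
// expansion (16-235 -> 0-255) actually produce?
#include <cmath>
#include <cstdio>
#include <set>

int main() {
    std::set<int> used;
    for (int y = 16; y <= 235; y++)
        used.insert((int)std::lround((y - 16) * 255.0 / 219.0));
    // 220 inputs over 256 outputs: ~36 output codes are never hit, so a
    // smooth ramp becomes a staircase of 1- and 2-code steps -> banding.
    printf("distinct output codes: %d of 256, skipped: %d\n",
           (int)used.size(), 256 - (int)used.size());
    return 0;
}
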
Old 13th March 2011, 11:58   #5960  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by madshi
Sure! Scale the "smallramp.ytp" test pattern (from the madTestPatternSource folder) up to full screen resolution. That should produce a very smooth grayscale.
That does look pretty smooth to me.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders