Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 11th April 2009, 12:51   #101  |  Link
jj666
Registered User
 
Join Date: May 2003
Posts: 153
Quote:
Originally Posted by yesgrey3 View Post
So the "video elite" wants to run madVR on such old graphics cards?
lol ;-)

Quote:
Originally Posted by yesgrey3 View Post
It would be great if people with more recent cards would test it and post the results, so we could see whether it's just our cards that cannot handle it.
A single 9800GX2 works fine; I observed no "tearing" or anything else out of the ordinary.

Thanks a lot Madshi!

Cheers,

-jj-
Old 11th April 2009, 13:24   #102  |  Link
Casshern
Registered User
 
Join Date: Apr 2007
Posts: 220
No fundamental problems here. XP SP3, ATI 2600 PRO 512 MB, Catalyst 9.2, Zoomplayer. No tearing; smoothness is on par with overlay and VMR7 when using ReClock. Just some small things:
1) The options do not get saved, and the dialog looks weird (white boxes)
2) The picture looks a tad too light. According to the 3dlut employed it should expand levels just as the ATI drivers do, but it seems different. It's not a 16 vs. 0 difference, more like a 2 vs. 0. Could it be that the ATI level expansion is not very accurate? Or is there still a small inaccuracy in this renderer?
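The 16 vs. 0 comparison above refers to the standard limited-to-full range ("video levels" to "PC levels") expansion; a minimal sketch of that mapping (my own illustration, not madVR's actual code):

```python
def video_to_pc(y: int) -> int:
    """Expand a limited-range ("video levels", 16-235) value
    to full range ("PC levels", 0-255)."""
    return round((y - 16) * 255 / 219)

# Video black (16) should land exactly at PC 0 and video white (235) at 255;
# a renderer whose black ends up at ~2 instead of 0 is expanding slightly off.
print(video_to_pc(16), video_to_pc(235))  # → 0 255
```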

Is there a way to save the resulting RGB frame? Then one could easily compare with the real values...

bye,

Casshern


Last edited by Casshern; 11th April 2009 at 13:28.
Old 11th April 2009, 13:31   #103  |  Link
Mark_A_W
3 eyed CRT supporter
 
Join Date: Jan 2008
Location: Or-strayl-ya
Posts: 563
Quote:
Originally Posted by yesgrey3 View Post
So the "video elite" wants to run madVR on such old graphics cards?
Please remember that madVR needs a lot of shader processing power, and old cards have very few shader processing units... My GF 8600GT only has 16, and it barely keeps up. So, if we really want to use madVR, we should start thinking about upgrading... Of course madshi may be able to optimize the code, but in the end it will always be a question of shading power. You either have it or you don't.
It would be great if people with more recent cards would test it and post the results, so we could see whether it's just our cards that cannot handle it.


I can help you with the 3dlut files creation, but you have to be more specific with your problems... Which source, which other renderers are you comparing? video levels or PC levels?

madshi,
are you ok with using this thread for 3dlut discussions, or will you prefer that we discuss 3dlut stuff in another thread?
Yeah... that 2-year-old junk

I guess we need to figure out the minimum required card for 1080p video, if it is these cards that are the issue. I was looking at video card prices and for the ~$200 or so I paid for the 2600XT, I can get a 4850 or somesuch.

What's Madshi got (cue re-reading this thread..)? That would be a good place to start!


And yes please, can you create a thread with the basics of creating a 3dlut?

Currently I run a custom gamma curve (hand-crafted with VideoEqualiser, which a CRT owner in the AVS CRT forum wrote) on a CRT calibrated with HCFR (which was in turn calibrated against ColorFacts with the help of a friend).

Can I use HCFR to help generate a 3dlut? Does it/can it replace my custom gamma curve?

Please answer in a new thread!

Thanks

Mark
Old 11th April 2009, 13:42   #104  |  Link
Mark_A_W
3 eyed CRT supporter
 
Join Date: Jan 2008
Location: Or-strayl-ya
Posts: 563
Quote:
Originally Posted by madshi View Post
Ok, thanks for the feedback. Improving motion smoothness is at the top of my priority list. However, the non-smoothness might also be caused by the HD2600 being too old/slow. Compared to my (entry-level) HD3850, the HD2600 has only 1/3 of the shader power and only half the memory bandwidth. You can try activating some of the "trade quality for performance" options. Maybe that helps smoothness for you?

I tested disabling all of the performance options, and it still tears.

The tearing is not very noticeable in most scenes - it can be missed at a casual glance, but the ReClock tearing test reveals all. There are about 5 minor tears, spaced up the screen.

Normally tearing is one big tear towards the top of the screen. This is different.

I take your point about the HD2600 series cards. It may be time for an upgrade.
Old 11th April 2009, 14:07   #105  |  Link
yesgrey
Registered User
 
Join Date: Sep 2004
Posts: 1,295
Quote:
Originally Posted by Casshern View Post
the dialog looks weird (white boxes)
I think you (like me) might be using another color for your window backgrounds instead of the default white...
Quote:
Originally Posted by Casshern View Post
Is there a way to save the resulting RGB frame? Then one could easily compare with the real values...
Yes. Use your keyboard's PrtSc key and paste into an image editor. I use Microsoft Photo Editor and "paste as new image...".

Quote:
Originally Posted by Mark_A_W View Post
I was looking at video card prices and for the ~$200 or so I paid for the 2600XT, I can get a 4850 or somesuch.
It's better to wait a little... The new ATI 4770 should be released soon (in May?); it should be only slightly slower than the 4850 and cost ~$100...


Quote:
Originally Posted by Mark_A_W View Post
And yes please, can you create a thread with the basics of creating a 3dlut?
Does it/can it replace my custom gamma curve?
I'm currently adding the custom gamma curves. I will create the thread once it's ready.
Old 11th April 2009, 14:27   #106  |  Link
Jaja1
Registered User
 
Join Date: Aug 2007
Posts: 59
First of all, thanks for another great project Madshi.

I tested some AVC Blu-rays on two machines. I'm using the same software setup on both: XP+SP3, Zoomplayer, Haali splitter and CoreAVC.

On the NVidia 7600GT machine I get 90-100% CPU usage and the image stutters like hell, so I can't conclude whether the vidcard lacks power, since the CPU usage alone would cause serious stuttering.

On an NVidia 9500GT, on the other hand, CPU usage is normal (about 20-25%) and at first sight the video is smooth. Not very reliable, since this is a small PC monitor; I will test this later with my PJ on a large screen.

It would be great if your effort to create smooth pans applied not only to 24p but also to 59.94 Hz displays. My projector unfortunately cannot do 24p. Of course you can't resolve the 3:2 judder, but right now there are pans from time to time that are so bad they are hard to watch.

There might be an annoying problem with custom timings on modern vidcards. PowerStrip doesn't support NVidia after the GeForce 7 series, and the NVidia drivers are in my experience very unreliable at creating custom timings. I don't know about ATI, though.
Old 11th April 2009, 14:29   #107  |  Link
Mike5
Registered User
 
Join Date: Feb 2007
Location: Palermo (Italy)
Posts: 67
Quote:
Originally Posted by yesgrey3 View Post
It would be great if people with more recent cards would test it and post the results...
I'm going to spend the afternoon trying madVR on an HD 4650, with a projector and a 2.50 m wide screen. I'll post the results.
Old 11th April 2009, 14:46   #108  |  Link
Casshern
Registered User
 
Join Date: Apr 2007
Posts: 220
I noticed no tearing on an AGP 2600 Pro 512 MB passive (MSI). I have set Catalyst to activate vsync synchronization if the application does not specify it, and I use Zoomplayer with ReClock (but without vsync correction). I will check again tonight.

Old 11th April 2009, 14:51   #109  |  Link
Mark_A_W
3 eyed CRT supporter
 
Join Date: Jan 2008
Location: Or-strayl-ya
Posts: 563
Quote:
Originally Posted by Casshern View Post
I noticed no tearing on an AGP 2600 Pro 512 MB passive (MSI). I have set Catalyst to activate vsync synchronization if the application does not specify it, and I use Zoomplayer with ReClock (but without vsync correction). I will check again tonight.
Ditto with ZP and Reclock.

I tried the CCC V-sync settings. No difference; the ReClock tearing test still shows tearing.

The video is playing on my secondary monitor; that could be an issue (it is with Haali if Aero is enabled, which causes tearing... Aero gets confused with two monitors).
Old 11th April 2009, 14:51   #110  |  Link
yesgrey
Registered User
 
Join Date: Sep 2004
Posts: 1,295
Quote:
Originally Posted by Jaja1 View Post
I'm using the same software setup on both machines XP+SP3, Zoomplayer, Haali splitter and CoreAVC.
Please try CoreAVC with and without CUDA. With my 8600GT I have to disable CUDA.
Old 11th April 2009, 15:30   #111  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,463
Quote:
Originally Posted by yesgrey3 View Post
Please try CoreAVC with and without CUDA. With my 8600GT I have to disable CUDA.
I'll try on my GF9600 and report back; it all works like a charm in HR at 23.976@48Hz... except for that damn jitter that requires many reseeks to catch the VSYNC flip time properly

it all works fine w/ 25/29.97fps MKV and any other container; just 23.976 MKV craps out constantly... I think HR/HMS are too tied together, and there's some glitch in one of them that apparently will NEVER be fixed

madVR might very well be the answer to all this ReClock hiccup and offer perfect smoothness OOTB... and if it could have a "compatibility" mode that uses the GUID of another renderer (VMR7/EVR/HR?) it could even work in *ANY* player (PDVD9/TMT3/KMPlayer)

2009 started well w/ CoreAVC CUDA and all; it might look even better w/ madVR... hopefully we'll also get an english.ini for PotPlayer (the new player from the original KMP coder) soon enough

Last edited by leeperry; 11th April 2009 at 15:32.
Old 11th April 2009, 15:47   #112  |  Link
yesgrey
Registered User
 
Join Date: Sep 2004
Posts: 1,295
Quote:
Originally Posted by leeperry View Post
I'll try on my GF9600 and will report back, it all works like a charm in HR
But HR does not make such intensive use of the GPU as madVR does, and madVR's memory use is very high due to the 3dlut (96 MB) and all the intermediate buffers at 16 bits per component... That's why we need to redefine our working minimums.
With madVR doing everything on the GPU, we would have more CPU power for software AVC decoding...
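The 96 MB figure is easy to sanity-check: assuming a 256³ LUT lattice (a guess based on the 8-bit input depth) with three 16-bit components per entry, the arithmetic works out exactly:

```python
# 3dlut size estimate. The 256^3 lattice is an assumption inferred from the
# 8-bit input depth; the 3 x 16-bit entry size matches the 16bpc discussion.
entries = 256 ** 3               # one entry per possible 8-bit RGB/YCbCr triplet
bytes_per_entry = 3 * 2          # three components, 2 bytes (16 bits) each
size_mib = entries * bytes_per_entry / (1024 * 1024)
print(size_mib)                  # → 96.0
```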
Old 11th April 2009, 16:21   #113  |  Link
Egh
Registered User
 
Join Date: Jun 2005
Posts: 630
Quote:
Originally Posted by yesgrey3 View Post
But HR does not make such intensive use of the GPU as madVR does, and madVR's memory use is very high due to the 3dlut (96 MB) and all the intermediate buffers at 16 bits per component... That's why we need to redefine our working minimums.
With madVR doing everything on the GPU, we would have more CPU power for software AVC decoding...
lolz. The CPU, if anything, is the bottleneck here for madVR. Remember I reported that it maxed out one of my 3.1 GHz cores whilst paused.

As for the memory requirement and so on, with the 512 MB I have on my grafix card I should expect no problems.


Quote:
Originally Posted by Egh View Post
This renderer potentially is the top #1 choice for the video elite
7900GTX (fastest 7xxx GeForce)
Quote:
Originally Posted by Mark_A_W View Post
On an ATi 2600XT
Quote:
So the "video elite" wants to run madVR on such old graphics cards?
Please remember that madVR needs a lot of shader processing power, and old cards have very few shader processing units... My GF 8600GT only has 16, and it barely keeps up. So, if we really want to use madVR, we should start thinking about upgrading... Of course madshi may be able to optimize the code, but in the end it will always be a question of shading power. You either have it or you don't.
It would be great if people with more recent cards would test it and post the results, so we could see whether it's just our cards that cannot handle it.
Well, the thing is that the 7900GTX, even though it's definitely an older architecture, still pulls most contemporary DX9 games at medium and sometimes even high graphics settings. It renders about 100 fps in a Source-engine game whilst consuming only slightly more CPU power than madVR does displaying a still frame.

We definitely need some clarification from the developer regarding what shader version it uses and how many shader units are employed simultaneously.

I'm looking into upgrade options and this renderer would fast-track them. However, I'm looking for a card able to produce 10-bit colour, as one of my monitors is a CRT which is theoretically able to reproduce deep colour (few LCD panels can do it, and those that do cost a lot).

Last edited by Egh; 11th April 2009 at 16:28.
Old 11th April 2009, 16:27   #114  |  Link
Betsy25
Registered User
 
Join Date: Sep 2008
Location: Holland, Belgium
Posts: 330
Congratulations Madshi!

Some more bug fixing and it's definitely a renderer to keep and use!

Can you please update the links in your top post to the latest version?

Thanks a bunch!
Old 11th April 2009, 16:38   #115  |  Link
yesgrey
Registered User
 
Join Date: Sep 2004
Posts: 1,295
Quote:
Originally Posted by Egh View Post
However I'm looking for a card able to produce 10-bit colour
It's not the card, it's the OS. It seems that only Vista and up support the 10-bit mode... I tried it with XP SP3 and it did not work.
Quote:
Originally Posted by Egh View Post
one of my monitors is a CRT which is theoretically able to reproduce deep colour (few LCD panels can do it, and those that do cost a lot)
I'm not so sure about that... I always thought that, being analog, a CRT would be capable of more than 8-bit depth, but recently I have read that the CRT phosphors' limit is 8 bits... and even if you feed it more you will not be able to see any difference...
It seems we can only achieve >8-bit colour with digital displays...
Old 11th April 2009, 16:39   #116  |  Link
Brazil2
Registered User
 
Join Date: Jul 2008
Posts: 479
Quote:
Originally Posted by Snowknight26 View Post
MPC-HC freezes and maxes out a single core while its frozen. RAM usage keeps climbing but nothing else happens.
I'm randomly getting the same issue: MPC-HC seems to freeze, but in fact it's playing, as I can hear the sound; I just see no video, and after several seconds the memory usage rises to about 700 MB and then the video is displayed.
I can't find a way to reproduce this every time, but it happens when I'm successively playing different video formats (XviD, MPEG2, H264, VC-1) in different containers (AVI, MPG, MP4, MKV, WMV) and at different resolutions (from 512*288 up to 1920*1080).


There is also something wrong when madVR is used in GraphEdit/GraphStudio.
The caption bar of the madVR window has no controls (no system menu, no buttons) and it crops the video, because the whole madVR window is at the video's native resolution, caption bar included.

Screenshots (not reproduced here): with the default VMR the whole window is 520*315 and the video part is 512*288, which is the resolution of the video; with madVR the whole window is 512*288 including the caption bar, which ends up with a cropped video. In these screenshots you can also notice the difference between the colors.

I got the cr3dlut v2 tool but I'm not sure which parameters to use to create a new 3dlut for SD with madVR (say, for DVD and DVB MPEG2 SD). Please, could anyone post a 3dlut file for this?
Old 11th April 2009, 17:13   #117  |  Link
yesgrey
Registered User
 
Join Date: Sep 2004
Posts: 1,295
Quote:
Originally Posted by Brazil2 View Post
I got the cr3dlut v2 tool but I'm not sure which parameters to use to create a new 3dlut for SD with madVR (say, for DVD and DVB MPEG2 SD). Please, could anyone post a 3dlut file for this?
Here is one for PAL, with video-levels to PC-levels expansion:
Code:
Input_Bit_Depth          8
Input_Video_Format       PAL_DVD      YCbCr
Output_Bit_Depth         16
Output_Video_Format      PAL_DVD      RGB_PC
For NTSC use "NTSC_DVD", and if you don't want video-levels to PC-levels expansion use "RGB_Video".
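Putting both substitutions together, an NTSC LUT without levels expansion would look like this (assuming cr3dlut accepts the same parameter names as in the PAL example):
Code:
Input_Bit_Depth          8
Input_Video_Format       NTSC_DVD     YCbCr
Output_Bit_Depth         16
Output_Video_Format      NTSC_DVD     RGB_Video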
Soon I will open a thread for cr3dlut help and discussion...
Old 11th April 2009, 17:31   #118  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,463
Quote:
Originally Posted by yesgrey3 View Post
I always thought that, being analog, a CRT would be capable of more than 8-bit depth, but recently I have read that the CRT phosphors' limit is 8 bits... and even if you feed it more you will not be able to see any difference...
It seems we can only achieve >8-bit colour with digital displays...
ArgyllCMS has a tool to check the LUT accuracy; of course it's 8-bit over DVI, but it's 10-bit over VGA on nvidia cards (and 9 on ATi).
But because of the D/A > A/D conversions, and the fact that CRT monitors have legacy onboard ICs... prolly it doesn't really matter.
As for sending pure 10-bit, well, for the same reasons I'd be rather dubious... or maybe w/ 5-BNC connectors on professional broadcast equipment.
Quote:
Originally Posted by yesgrey3 View Post
But HR does not make such an intensive use of GPU as madVR, and also the memory use in madVR is very high due to the 3dlut (96MB) and all the intermediate buffers at 16bit per component... That's why we need to redefine our working minimuns.
With madVR doing everything in the GPU, we would have more CPU power for software AVC decoding...
oh sure, doing the RGB32 conversion in 32-bit float, plus applying your LUTs and scaling with spline, is a HUGE plus (the YUY2 coeffs in HR are completely off)

my GF9600GSO is actually a rebadged 8800GS and it's got 96 SPs (the regular 9600GT only has 64), so w/ an o/c'd Q6600 it should hopefully take the load
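The "coeffs" remark is about the YCbCr-to-RGB conversion matrix; done in float with the textbook BT.601 constants it looks roughly like this (a generic sketch, not taken from HR's or madVR's actual code):

```python
def ycbcr601_to_rgb(y: int, cb: int, cr: int) -> tuple:
    """Limited-range BT.601 YCbCr -> normalized full-range RGB, in float.
    Textbook constants; not any renderer's actual shader code."""
    y_ = (y - 16) / 219.0            # expand luma to 0..1
    pb = (cb - 128) / 224.0          # center and scale chroma
    pr = (cr - 128) / 224.0
    r = y_ + 1.402 * pr
    g = y_ - 0.344136 * pb - 0.714136 * pr
    b = y_ + 1.772 * pb
    # clamp to the displayable range
    return tuple(min(1.0, max(0.0, c)) for c in (r, g, b))

# Video black and white should map exactly to RGB 0.0 and 1.0:
print(ycbcr601_to_rgb(16, 128, 128), ycbcr601_to_rgb(235, 128, 128))
```

Using the wrong matrix (e.g. BT.709 constants on SD material, or integer-truncated coefficients) shifts every colour slightly, which is the kind of error being complained about here.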

Last edited by leeperry; 11th April 2009 at 17:45.
Old 11th April 2009, 17:44   #119  |  Link
Mike5
Registered User
 
Join Date: Feb 2007
Location: Palermo (Italy)
Posts: 67
Ok, I tried with an HD 4650, Win 7, Catalyst 9.3, MPC-HC and ReClock (my display doesn't support 24Hz, so I need it to stretch 23.976/24fps to 25Hz), on a 1280x720 projector with a 2.5 m wide screen.

HD MKV, mostly H.264 720p, some 1080p
No problems of any type. No crashes, no tearing at all (the ReClock tearing test is perfect), no glitches.
Quality is very high compared to EVR(CP), especially the colours. Particularly in "Speed Racer" the difference in colours between the two renderers is impressive. With madVR there is a sort of "higher colour resolution".
Smoothness can be improved. Like other renderers, it depends on where you start playback.

WMV-HD
720p OK. 1080p gives a distorted and greenish image.

DVD
Impossible to play .ifo files from disk or a DVD from the drive; it gives "DVD: Macromedia Fail". I can't understand what Macromedia has to do with the renderer, but I'll wait for some guru to explain it to me. No problems with single .vob or .mpg files. Quality is approximately on the same level as EVR plus YV12 -> RGB conversion by ffdshow.

Also tried AVI, DivX, XviD and everything else I have, with no problems.
Old 11th April 2009, 17:46   #120  |  Link
Egh
Registered User
 
Join Date: Jun 2005
Posts: 630
Quote:
Originally Posted by leeperry View Post
ArgyllCMS has a tool to check the LUT accuracy; of course it's 8-bit over DVI, but it's 10-bit over VGA on nvidia cards (and 9 on ATi).
But because of the D/A > A/D conversions, and the fact that CRT monitors have legacy onboard ICs... prolly it doesn't really matter.
As for sending pure 10-bit, well, for the same reasons I'd be rather dubious... or maybe w/ 5-BNC connectors on professional broadcast equipment.
Fail -- a) the DVI standard supports >8-bit accuracy (read about dual-link, for instance); b) there are monitors with HDMI; c) DisplayPort :win:

As for CRTs, there may be some issues, but note that normal people don't use cheap CRTs nowadays, so hopefully high-end monitors may display colours a bit better in 10-bit mode. I never tried it myself so I cannot guarantee anything; however, I do opt for high-end grafix with deep colour and HDMI/DisplayPort for the next upgrade.

The Nvidia RAMDAC in the 9xxx series is 10-bit, I think.