Old 19th July 2013, 17:17   #19621  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by xabregas View Post
Now my problem is I have a 60Hz screen with no 24p option, and I have 23.976 and 25 fps material. Do I need to use ReClock PAL speed-down when watching 25 fps material with madVR smooth motion on?

I tested both: with ReClock I got flickering, and without ReClock minor ghosting, though more than I get with 23.976 material.
Some people are reporting ghosting. I don't know why, I don't see it myself. Either I'm just not as sensitive to this kind of artifact, or maybe your display uses a different refresh rate than 60Hz. E.g. if your display internally uses 70Hz, then it will probably repeat some frames to get from 60Hz to 70Hz. If those repeated frames happen to be madVR blended frames, you will see obvious ghosting. But I don't really know if that's what's happening on your PC. Can your display handle 72Hz in native display resolution? If so, try that.
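
To make that concrete, here's a toy sketch (an illustration only, not madVR's actual code) of which refreshes end up repeating a frame when 60fps output lands on a panel that internally runs at 70Hz:

Code:
# Toy illustration (not madVR code): map each display refresh to the
# source frame visible at that instant, then find the repeated ones.
def repeat_cadence(content_fps, display_hz, seconds=1.0):
    shown = [int(v / display_hz * content_fps)
             for v in range(int(display_hz * seconds))]
    return [v for v in range(1, len(shown)) if shown[v] == shown[v - 1]]

# 60 fps on a 70 Hz panel: 10 refreshes per second show a repeated frame.
# If those happen to be smooth-motion blended frames, ghosting is visible.
print(repeat_cadence(60, 70))  # [1, 8, 15, 22, 29, 36, 43, 50, 57, 64]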

Reclock is not needed for smooth motion to work correctly.

Quote:
Originally Posted by xabregas View Post
But still one option in "trade quality for performance" is checked (store custom pixel shader results in 16bit buffers instead of 32bit). Wonder what it does.
If your PC is fast enough, uncheck all those "trade quality for performance" options.

Quote:
Originally Posted by DragonQ View Post
madshi, if I play a 1080i/30 file which is really 1080p/24 and then turn on IVTC, it plays back properly at 24 fps. However, smooth motion doesn't enable (60Hz monitor) and "movie" is still reported as 29.970 fps. A bug?
Do you happen to use native DXVA decoding? IVTC doesn't work in that case.

Quote:
Originally Posted by dansrfe View Post
Is BFI planned for any future madVR build? Just curious...
Was discussed before. You should be able to find it via search.

Quote:
Originally Posted by sneaker_ger View Post
So, the first official/public XySubFilter beta has been released, and since it is - to my knowledge - the only filter making use of madVR's new subtitle interface, it might be of interest to some.
There will be an official thread for it here on doom9 soon. FWIW, the new subtitle interface is not "mine", although I played a role in defining it and the header is hosted on my server. Still, it's a BSD-licensed open specification and can be used by anyone who's interested. An MPC-HC dev already hinted that maybe they could use it for the MPC-HC internal renderers, too, but that's far from a done deal. In any case, it's not "madVR's new subtitle interface", but just the "new subtitle interface", and madVR is simply one of the first renderers to support it. Just to clarify the situation...

Quote:
Originally Posted by Soukyuu View Post
Here's the first one: I have the latest MPC-HC + madVR + XySubFilter beta. The .ass subtitles are rendered in an opaque black box instead of being blended onto the video; does anyone else have this problem?
The way the new subtitle interface works, madVR needs the GPU to perform subtitle alpha blending - and in 16bit. Unfortunately some older GPUs only support alpha blending in 8bit. For such GPUs you're likely to see an opaque black box instead of smoothly blended subtitles. I'm sorry, but at this point I don't see how I could fix it. It's a hardware limitation of your GPU and the only solution will probably be to upgrade your GPU to a newer model. I believe all newer GPUs can do alpha blending in 16bit without any problems. At least my HD4000 can.

Maybe I'll find a workaround in the long run. But in the short run I don't think I can solve this for old GPUs like yours...
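
For what it's worth, here's a back-of-envelope illustration (my own numbers, purely about blending precision - the opaque black box itself is the GPU refusing 16bit blending, not a rounding issue) of why madVR wants the blend done in 16bit in the first place:

Code:
# out = src*alpha + dst*(1 - alpha), rounded to the buffer's bit depth
def blend(src, dst, alpha, bits):
    levels = 2 ** bits - 1
    return round((src * alpha + dst * (1 - alpha)) * levels) / levels

# faint antialiased subtitle edge (3% alpha) over dark video
print(blend(1.0, 0.2004, 0.03, 16))  # ~0.22439: kept in a 16bit buffer
print(blend(1.0, 0.2004, 0.03, 8))   # ~0.22353: snapped to the nearest 8bit step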

Quote:
Originally Posted by xabregas View Post
So I use MPC's internal subtitle filter. Why should I change to xy-VSFilter?
You may want to change to XySubFilter, not to xy-vsfilter. XySubFilter allows you to render subtitles in the same quality as MPC's internal subtitle filter - but with less CPU consumption, with correct colors and with correctly scaled 3D effects. The MPC internal subtitle filter is slower, shows incorrect colors and has problems scaling certain ASS effects correctly.
Old 19th July 2013, 17:23   #19622  |  Link
Mark Rejhon
Chief Blur Buster
 
Join Date: Sep 2002
Location: Toronto
Posts: 38
I realized I neglected to *directly* answer a very important question about video renderers that add BOTH interpolation and BFI:
Quote:
Originally Posted by turbojet View Post
Users that are asking for bfi, is it to replace or complement frc?
Either or both.
Some existing HDTVs do this already by combining a scanning backlight (the hardware equivalent of BFI) with interpolation.

The problem is that to do both in software, you really need a very high native refresh rate:
(1) 60fps -> 120fps (via interpolation) -> 240fps (via black frames).
(2) 60fps -> 240fps (via interpolation) -> 960fps (via black frames). This is how some expensive Sony "960Hz TVs" work.
Those rates can only be achieved inside the TV's own hardware; they cannot be driven over a video cable. That's why combining BFI and interpolation is mainly done in hardware inside a 240Hz HDTV - to do this sort of thing in software, you need a high native refresh rate.

The modern fast-refresh monitors are the 120Hz and 144Hz computer monitors. That headroom is enough to do software-based BFI, and it gives you an opportunity to combine interpolation and BFI, or do some very interesting tricks (see the sketch after these lists):
(1) For example, you can interpolate a movie 24fps -> 72fps (interpolation) -> 144fps (BFI) -> 144Hz monitor
(2) Or you can do 30fps -> 60fps (interpolation) -> 120fps (BFI) -> 120Hz monitor

BFI can also be used by itself.
(1) You can do 60fps video -> 120fps (BFI) -> 120Hz monitor
(2) Or simulate an old double-strobe flicker movie projector via 24fps -> 48fps (duplicate frame) -> 144fps (BFI at 2:1 ratio) -> 144Hz monitor
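
A minimal sketch of such a pipeline (hypothetical - not a feature any renderer necessarily implements this way), for chains where each interpolated frame gets a whole number of refreshes:

Code:
# Hypothetical sketch: interpolate to an intermediate rate, then pad each
# frame with black refreshes to reach the monitor's refresh rate.
def bfi_schedule(interp_fps, display_hz):
    assert display_hz % interp_fps == 0
    per_frame = display_hz // interp_fps      # refreshes per source frame
    out = []
    for frame in range(interp_fps):           # one second of output
        out.append("frame %d" % frame)        # one visible refresh...
        out += ["black"] * (per_frame - 1)    # ...the rest are black
    return out

# 24fps -> 72fps (interpolation) -> 144Hz monitor (1 visible : 1 black)
print(bfi_schedule(72, 144)[:6])  # ['frame 0', 'black', 'frame 1', 'black', ...]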

There are a lot of great uses for BFI, but software-based BFI is mainly useful with 120Hz monitors because otherwise it flickers too much. I do, however, recommend adding this feature at the video renderer level (e.g. ffdshow, madVR, etc.), along with a warning that BFI is mainly useful for 120Hz computer monitors, that it requires very accurate synchronization with the monitor (fast system, good performance), and that it can severely darken the image - but also that it can be valuable when motion blur reduction is critical. It noticeably reduces motion blur, and it looks much better on 120Hz monitors; on 60Hz monitors it looks terrible.
__________________
Thanks,
Mark Rejhon
www.blurbusters.com

Last edited by Mark Rejhon; 19th July 2013 at 17:34.
Old 19th July 2013, 17:28   #19623  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 930
Quote:
Originally Posted by madshi View Post
Do you happen to use native DXVA decoding? IVTC doesn't work in that case.
No, software. Stepping through frames shows IVTC to 24p is working properly (no repeated frames or combing) but smooth motion doesn't turn on unless I force it in the settings.
__________________
HTPC Hardware: Intel Celeron G530; nVidia GT 430
HTPC Software: Windows 7; MediaPortal 1.19.0; Kodi DSPlayer 17.6; LAV Filters (DXVA2); MadVR
TV Setup: LG OLED55B7V; Onkyo TX-NR515; Minix U9-H
Old 19th July 2013, 17:36   #19624  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by Mark Rejhon View Post
If you have a 120Hz monitor, you can also do software-based BFI to have blur-free 60fps@120Hz, and the 60Hz flicker is far less annoying.
Ideally 240Hz would be great for a software BFI implementation, but HDMI can't handle that... Even 120Hz isn't supported by the vast majority of TVs out there. Some computer monitors can do 120Hz, but who wants to watch movies on a computer monitor? Because of these limitations I'm not really eager to look into BFI at the moment. I don't want to spend time on something which only works on computer monitors...

Quote:
Originally Posted by Mark Rejhon View Post
Ideally, hardware-based BFI at 120Hz or 240Hz is even better. That eliminates the flicker annoyance for most. Some 120Hz monitors with LightBoost actually already do this, by turning off the backlight while waiting for LCD pixels to finish refreshing (high speed video), and strobing between the refreshes. They flicker like a 120Hz CRT, but that allows the CRT motion clarity effect during 120fps@120Hz videogames, as well as zero-motion-blur playback of 120fps videos (e.g. pre-interpolated first with SmoothVideo, etc).
Turning the backlight off while waiting for LCD pixels to finish refreshing is something you can't really do with a software based solution. So it's a good thing having that in hardware. However, I think probably the hardware BFI implementation in many displays is far from optimal and a good software implementation could potentially improve on that. If only TVs would support high refresh rates via HDMI. <sigh>

Quote:
Originally Posted by DragonQ View Post
No, software. Stepping through frames shows IVTC to 24p is working properly (no repeated frames or combing) but smooth motion doesn't turn on unless I force it in the settings.
Have you added a bug entry to the bug tracker about this? If so, there's no need to discuss it here in the forum any further. No need to post a link to your tracker issue, either. I'll get to that when I find the time...
Old 19th July 2013, 17:38   #19625  |  Link
Soukyuu
Registered User
 
Soukyuu's Avatar
 
Join Date: Apr 2012
Posts: 169
Quote:
Originally Posted by madshi View Post
The way the new subtitle interface works, madVR needs the GPU to perform subtitle alpha blending - and in 16bit. Unfortunately some older GPUs only support alpha blending in 8bit. For such GPUs you're likely to see an opaque black box instead of smoothly blended subtitles. I'm sorry, but at this point I don't see how I could fix it. It's a hardware limitation of your GPU and the only solution will probably be to upgrade your GPU to a newer model. I believe all newer GPUs can do alpha blending in 16bit without any problems. At least my HD4000 can.

Maybe I'll find a workaround in the long run. But in the short run I don't think I can solve this for old GPUs like yours...
Aww man. Is my GTX 260 really that old? I'm still mostly satisfied with its performance, but I guess it was just a matter of time before hardware compatibility became a problem =/
Old 19th July 2013, 17:41   #19626  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
To be honest, I've no clue which GPU generations are 16bit alpha blend capable and which are not. That's not a spec that's advertised anywhere. I don't think there's even a "CAP" bit for it in Direct3D, so I can't even ask Direct3D or the GPU itself whether it can do it. We'll have to collect information from users to see exactly which GPU generations support XySubFilter+madVR and which don't...
Old 19th July 2013, 17:52   #19627  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 930
Quote:
Originally Posted by madshi View Post
Have you added a bug entry to the bug tracker about this? If so, there's no need to discuss it here in the forum any further. No need to post a link to your tracker issue, either. I'll get to that when I find the time...
Will do, just wanted to see if it was already known.
__________________
HTPC Hardware: Intel Celeron G530; nVidia GT 430
HTPC Software: Windows 7; MediaPortal 1.19.0; Kodi DSPlayer 17.6; LAV Filters (DXVA2); MadVR
TV Setup: LG OLED55B7V; Onkyo TX-NR515; Minix U9-H
Old 19th July 2013, 18:02   #19628  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by Soukyuu View Post
Aww man. Is my GTX 260 really that old? I'm still mostly satisfied with its performance, but I guess it was just a matter of time before hardware compatibility became a problem =/
Just tried with my old NVidia 9400 mainboard. I can reproduce the problem there. *However*, by enabling the option "use 10bit image buffer instead of 16bit" the problem goes away. So that should be a good workaround for older GPUs.
Old 19th July 2013, 18:09   #19629  |  Link
Mark Rejhon
Chief Blur Buster
 
Join Date: Sep 2002
Location: Toronto
Posts: 38
I'm the operator of the Blur Busters Blog, and creator of TestUFO motion tests, so I've become an expert in motion blur...
First: I agree this is a low-priority issue for now, but it's an early canary warning that BFI is slowly going to become a more important feature.
However, I would like to inform you of a few things:

Quote:
Originally Posted by madshi View Post
Ideally 240Hz would be great for a software BFI implementation, but HDMI can't handle that... Even 120Hz isn't supported by the vast majority of TVs out there.
Fortunately, that is gradually becoming easier with 2013-model HDTVs. See HDTV Refresh Rate Overclocking HOWTO: True 120Hz from PC to TV. Several Panasonic, Sony and Vizio models worked. Some TVs, such as the new SEIKI 4K HDTVs (4K@30Hz), natively support 120Hz@1080p, so on those models it's no longer overclocking. Did you hear about the new $699 SEIKI 39" 4K HDTV? The manufacturer officially supports 120Hz from a computer on that one.

Quote:
Some computer monitors can do 120Hz, but who wants to watch movies on a computer monitor? Because of these limitations I'm not really eager to look into BFI at the moment. I don't want to spend time on something which only works on computer monitors...
Agreed, but see above.

Quote:
Turning the backlight off while waiting for LCD pixels to finish refreshing is something you can't really do with a software based solution. So it's a good thing having that in hardware.
You can combine hardware BFI (the black-frame equivalent done by the backlight) and software BFI (an actual inserted black frame). That's how a custom version of MAME does it: LightBoost only strobes at 100-120Hz and cannot strobe at 60Hz, so software BFI is needed in addition.

Software based BFI (60fps on 120Hz) would do this on normal 120Hz LCD monitors:
--> 8.3ms visible:8.3ms dark == 50% reduction of motion blur

However, doing 60fps @ 120Hz software BFI on a LightBoost display, something rather interesting happens: less motion blur on LCD than on plasma (LCD colors are still terrible, though).
LightBoost flashes the backlight for as little as 1.4ms, so over two frames (one 8.3ms visible, one 8.3ms black, a 16.7ms cycle in total) you are now getting:
--> 1.4ms visible + 6.9ms backlight off + 1.4ms black frame + 6.9ms backlight off

So effectively, having combined hardware-based BFI (simulated via the backlight) and software-based BFI (an actual black frame), you are now getting:
--> 1.4ms visible + 15.3ms black (combined hardware+software)

This is a 92% reduction in motion blur compared to a 60Hz LCD (1.4ms of sample-and-hold blur instead of 16.7ms; see this web animation of sample-and-hold blur), which produces CRT-quality motion from 60Hz sources on LightBoost LCD displays - less motion blur than on plasma displays. Plasmas have about 5ms of motion blur, due to red/green phosphor decay, and it becomes noticeable during very fast pans (e.g. when viewing the TestUFO: Moving Photo test at 1440 pixels/sec), while the same test remains crystal sharp on a LightBoost display, especially at LightBoost=10%.

I should point out that at 1440 pixels/sec, 1ms of sample-and-hold motion blur equals about 1.4 pixels of blur, so 5ms adds up to quite a lot of motion blur during fast high-def pans where there is no source-based blur (e.g. soft focus, long exposure, overcompression). For 1080p 60fps high-bitrate video (especially 4K 60fps downconverted to 1080p@60fps, which creates ultra-sharp high-contrast edges), it is possible to begin noticing 1ms differences in motion blur during screen-width-per-second pans (1920 pixels/sec), because 1ms there adds 1/1000th of 1920 = about 2 pixels of blur. Very subtle, but human eyes can see 2-pixel differences at 1080p if you're not too far from the display. So it pays to expand the ratio of blackness to visible frame and shave milliseconds off the visible frame, and that's where combining hardware BFI with software BFI comes in - it bypasses a hardware BFI limitation (e.g. LightBoost not functioning at 60Hz).
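
The arithmetic is easy to check (a back-of-envelope of my own, using the same assumption as above: hold-type motion blur roughly equals pan speed times the time each frame stays visible):

Code:
# sample-and-hold blur (in pixels) ~= pan speed * time the frame stays lit
def blur_px(pan_px_per_sec, visible_ms):
    return pan_px_per_sec * visible_ms / 1000.0

print(blur_px(1440, 16.7))  # plain 60Hz hold: ~24 px of smear
print(blur_px(1440, 8.3))   # 60fps + software BFI @ 120Hz: ~12 px (50% less)
print(blur_px(1440, 1.4))   # LightBoost strobe + software BFI: ~2 px (~92% less)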

The 92% reduction in blur is also confirmed by motion measurement tests (see graph, as well as various testimonials). I agree that most people won't be watching movies on a computer monitor, but it bears pointing out that LightBoost computer monitors have recently leapfrogged televisions in achievable motion resolution, even though their color quality is terrible by comparison.

Quote:
However, I think probably the hardware BFI implementation in many displays is far from optimal and a good software implementation could potentially improve on that.
That CAN be true, but properly done hardware BFI (especially backlight assisted, not just via panel) can generally be superior to software BFI:

(1) You don't need to rapidly switch the LCD pixel values on/off/on/off, which can reduce contrast severely. Backlight-assisted hardware BFI (as opposed to the LCD panel method) doesn't necessarily need to switch LCD pixels at all; it just turns the backlight on and off at precisely synchronized intervals (or a segment of the backlight at a time, as in scanning backlights). Although the terms shouldn't be used interchangeably, BFI, scanning backlights and strobe backlights are all roughly equivalent at the human-eye level - each approach simply adds a dark period between refreshes to reduce the sample-and-hold effect (animation demo of the sample-and-hold effect - it looks very different on a CRT than on an LCD).
(2) BFI at the LCD panel level (the only method available to software) also potentially causes some nasty inversion artifacts on some panels, due to interference with the panel's positive/negative driving voltages (lagom pixel-walk, techmind inversion explanation), so fine single-pixel checkerboard texture patterns can appear during BFI.
(3) Some LED backlights are designed to pulse brighter when on, so BFI dimness can be offset with brighter backlight flashes.
Hardware-based BFI avoids these problems by using the backlight instead, so the LCD pixels never need to go back and forth.

Quote:
If only TVs would support high refresh rates via HDMI. <sigh>
Good news. This is changing already. Some of them now do, and sometimes even semi-officially (e.g. SEIKI 4K).

I agree it's a low-priority issue for now, but consider this message an early canary warning that BFI is slowly going to become a more important feature.
__________________
Thanks,
Mark Rejhon
www.blurbusters.com

Last edited by Mark Rejhon; 19th July 2013 at 18:33.
Old 19th July 2013, 18:09   #19630  |  Link
Soukyuu
Registered User
 
Soukyuu's Avatar
 
Join Date: Apr 2012
Posts: 169
Quote:
Originally Posted by madshi View Post
Just tried with my old NVidia 9400 mainboard. I can reproduce the problem there. *However*, by enabling the option "use 10bit image buffer instead of 16bit" the problem goes away. So that should be a good workaround for older GPUs.
Yes, that fixes it for me as well. A friend with a 560 doesn't have this issue, so I guess nVidia GPUs from the 400 series and up are not affected.

I couldn't see any difference in the image between the 10bit and 16bit buffers - what does it really affect?
Old 19th July 2013, 18:28   #19631  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by Mark Rejhon View Post
Fortunately, that is gradually becoming easier with 2013-model HDTVs. See HDTV Refresh Rate Overclocking HOWTO: True 120Hz from PC to TV. Several Panasonic, Sony and Vizio models worked. Some TVs, such as the new SEIKI 4K HDTVs (4K@30Hz), natively support 120Hz@1080p, so on those models it's no longer overclocking. Did you hear about the new $699 SEIKI 39" 4K HDTV? The manufacturer officially supports 120Hz from a computer on that one.
I'm not a fan of overclocking: it probably means components run at a higher temperature, which can be fine in the short run but could result in hardware failure in the long run. Anyway, it's good to hear that things are improving and that some TVs now officially support 1080p120. I hadn't heard about that SEIKI yet, but I'm in Europe, and that SEIKI is probably not available here, I would guess...

Quote:
Originally Posted by Mark Rejhon View Post
hardware BFI can generally be superior to software BFI because:
(1) You don't need to rapidly switch the LCD pixel values on/off/on/off, which can reduce contrast severely. Backlight-assisted hardware BFI doesn't need to switch LCD pixels; it just turns the backlight on and off (or a segment at a time, as in scanning backlights). Although the terms shouldn't be used interchangeably, BFI, scanning backlights and strobe backlights are all roughly equivalent at the human-eye level - each approach simply adds a dark period between refreshes to reduce the sample-and-hold effect (animation demo of the sample-and-hold effect - it looks very different on a CRT than on an LCD).
(2) BFI at the LCD panel level (the only method available to software) also causes some nasty inversion artifacts, due to interference with the panel's positive/negative driving voltages (lagom pixel-walk, techmind inversion explanation), so fine single-pixel checkerboard texture patterns can appear during BFI.
(3) Some LED backlights are designed to pulse brighter when on, so BFI dimness can be offset with brighter backlight flashes.
Hardware-based BFI avoids these problems by using the backlight instead, so the LCD pixels never need to go back and forth (the LCD is slow, the backlight is faster).
Hopefully OLED takes care of all these problems. Hopefully they'll get lifetime and image retention issues under control. Then BFI for OLED could be a nice solution.

Quote:
Originally Posted by Mark Rejhon View Post
I agree it's a low-priority issue for now, but consider this message an early canary warning that BFI is slowly going to become a more important feature.
I've had BFI on my "long term" list since I started developing madVR. However, I'd need a 120Hz capable display to actually develop something, and I think I'm going to wait for OLED.

Quote:
Originally Posted by Soukyuu View Post
I couldn't see any difference in the image between the 10bit and 16bit buffers - what does it really affect?
Well, it's quite a bit lower accuracy. It might eventually result in slightly less smooth gradients...
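
Roughly speaking (my own back-of-envelope, not madVR's actual pipeline), the spacing between representable levels shows why the difference rarely matters in practice:

Code:
# spacing between representable levels at a given buffer bit depth
def step(bits):
    return 1.0 / (2 ** bits - 1)

print(step(10) / step(8))  # ~0.25: a 10bit step is a quarter of an 8bit output step
print(step(16) / step(8))  # ~0.004: 16bit rounding error is negligible by comparison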

-------

So here's the new official xy-VSFilter / XySubFilter thread:

http://forum.doom9.org/showthread.php?t=168282

Please post subtitle-specific questions there - I mean things like incorrectly formatted subtitles etc. If you have subtitle questions that are more madVR-related than XySubFilter-related, you can post them here instead, e.g. if you run into GPU performance problems with XySubFilter or things like that...

Last edited by madshi; 19th July 2013 at 18:30.
Old 19th July 2013, 18:33   #19632  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,463
Quote:
Originally Posted by Mark Rejhon View Post
I'm the operator of the Blur Busters Blog
Hi Mark, good to see you around

Apparently Samsung TVs don't do full-frame BFI; just like a CRT, it's more like a half or quarter of the vertical height or so... Anyway, yes, by the time madVR reaches 1.0 we will see whether there are more HDMI 2/DP 120Hz+ displays on the market.
Old 19th July 2013, 19:06   #19633  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
I've made an official thread for the xy-VSFilter Project, which I'd encourage everybody to use for any XySubFilter discussions not directly related to madVR:

xy-VSFilter Project (High Performance VSFilter Compatible Subtitle Filters)
Old 19th July 2013, 19:09   #19634  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Quote:
Originally Posted by madshi View Post
Just tried with my old NVidia 9400 mainboard. I can reproduce the problem there. *However*, by enabling the option "use 10bit image buffer instead of 16bit" the problem goes away. So that should be a good workaround for older GPUs.
Can you detect such GPUs and do this automatically?
Old 19th July 2013, 19:21   #19635  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 119
Quote:
Originally Posted by madshi View Post
Some people are reporting ghosting. I don't know why, I don't see it myself. Either I'm just not as sensitive to this kind of artifact, or maybe your display uses a different refresh rate than 60Hz. E.g. if your display internally uses 70Hz, then it will probably repeat some frames to get from 60Hz to 70Hz. If those repeated frames happen to be madVR blended frames, you will see obvious ghosting. But I don't really know if that's what's happening on your PC. Can your display handle 72Hz in native display resolution? If so, try that.

Reclock is not needed for smooth motion to work correctly.


If your PC is fast enough, uncheck all those "trade quality for performance" options.

You may want to change to XySubFilter, not to xy-vsfilter. XySubFilter allows you to render subtitles in the same quality as MPC's internal subtitle filter - but with less CPU consumption, with correct colors and with correctly scaled 3D effects. The MPC internal subtitle filter is slower, shows incorrect colors and has problems scaling certain ASS effects correctly.
Thanks for the answer madshi. I was testing without the Catalyst crap installed and had a lot of ghosting, but I installed CCC again and disabled all the crap settings except the deinterlacing method and pulldown detection. Now I have almost zero ghosting with smooth motion on.

As for the refresh rate, I'm not sure what the correct refresh rate of my screen is, but Ctrl+J gives me 60.0014Hz on my Sony TV and 59.9475Hz on my Dell desktop monitor.

Not sure what this means, but I have no ghosting whatsoever on my desktop monitor, while on my Sony TV I do.

I've just installed XySubFilter but I don't know where the option is to put subtitles in the bottom black bar on 2.35:1 movies.

TIA
Old 19th July 2013, 19:31   #19636  |  Link
Soukyuu
Registered User
 
Soukyuu's Avatar
 
Join Date: Apr 2012
Posts: 169
Currently, madVR sets both the decoding queue and the subtitle queue via the CPU queue parameter. Could we have those separated in the future (if it makes sense internally)?

edit: About the only reason I'd want this is more responsive seeking, since the decoding queue never falls below 63/64 while the subtitle queue does drop to 11/64 on some heavily styled subs (and that's on a 4GHz X4 970BE CPU...)

Last edited by Soukyuu; 19th July 2013 at 19:37.
Old 19th July 2013, 20:57   #19637  |  Link
dansrfe
Registered User
 
Join Date: Jan 2009
Posts: 1,212
What exactly is the goal in setting queue sizes? Should we make them as low as possible or as high as possible? Does increasing the queue size add lag to the initial render steps and/or to skipping around the video? How do we know the optimal queue sizes for the machine we use?
Old 19th July 2013, 21:00   #19638  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,495
Quote:
Originally Posted by madshi View Post
There will be an official thread for it here on doom9 soon. FWIW, the new subtitle interface is not "mine", although I played a role in defining it and the header is hosted on my server. Still, it's a BSD-licensed open specification and can be used by anyone who's interested. An MPC-HC dev already hinted that maybe they could use it for the MPC-HC internal renderers, too, but that's far from a done deal. In any case, it's not "madVR's new subtitle interface", but just the "new subtitle interface", and madVR is simply one of the first renderers to support it. Just to clarify the situation...
Yeah, sorry, my wording was indeed misleading.
I didn't want to open a new thread myself because I was hoping for an official one. It's better if a team member does it, so he can always update the start post with fresh info straight from the source.
Old 19th July 2013, 21:27   #19639  |  Link
Soukyuu
Registered User
 
Soukyuu's Avatar
 
Join Date: Apr 2012
Posts: 169
Quote:
Originally Posted by dansrfe View Post
What exactly is the goal in setting queue sizes? Should we make them as low as possible or as high as possible? Does increasing the queue size add lag to the initial render steps and/or to skipping around the video? How do we know the optimal queue sizes for the machine we use?
The lower the queue, the faster it can be filled (= less delay on seeking, especially with the "delay playback until queue full" option), but if your PC lacks processing power, you will get dropped frames once playback hits a demanding scene.

So to find out whether your queues need adjusting, play a video of your choice and watch the queue numbers on the OSD. If you get dropped frames and a queue shows as empty (or dangerously low), try increasing it by a few steps. I imagine the default values are sufficient in the majority of scenarios; there's a back-of-envelope sketch of the seek-delay trade-off below.

So far I have only seen ONE video with .ass subtitles that dropped frames on default settings (the same one I had to increase my CPU queue to 64 for). Even subs that used to be regarded as "very heavy" with VSFilter (not xy-VSFilter) decode without problems on my machine at default settings.
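
As a back-of-envelope for the seek-delay side (a toy model of my own, not madVR's actual behaviour): with "delay playback until queue full" enabled, the wait after a seek is roughly the queue size divided by how fast the decoder can refill it:

Code:
# toy model: seek delay ~= queue size / decoder throughput while refilling
def seek_delay_s(queue_frames, decode_fps):
    return queue_frames / decode_fps

print(seek_delay_s(16, 200.0))  # fast decode, small queue: ~0.08s
print(seek_delay_s(64, 30.0))   # heavy .ass subs, big queue: ~2.1s per seek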
Old 19th July 2013, 21:29   #19640  |  Link
dukey
Registered User
 
Join Date: Dec 2005
Posts: 560
For DVDs, if the renderer draws the menus over the video and you have large render queues, the latency on the menu buttons is hilariously bad.