Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 26th November 2017, 06:32   #47321  |  Link
ryrynz
Registered User
 
Join Date: Mar 2009
Posts: 3,226
Quote:
Originally Posted by madshi View Post
Good testing. But annoying test results, because the source code changes between 0.91.11 and 0.92.1 are pretty large.

What is the easiest way to reproduce this problem?
Play anything, pause, set screen power-off to 1 minute and wait. Let the screen blank, then wait a bit; most of the time video (but not audio) playback is frozen upon resuming.
For anyone else having this issue (is it Nvidia only?), try setting LAV to software decoding (hardware decoding "none") until a fix is found.
Old 26th November 2017, 06:46   #47322  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
Quote:
Originally Posted by heiseikiseki View Post
Some Kaby Lake-G parts use 4 GB of HBM2 memory for the GPU, so I think it would be OK for 4K.

Here is some leaked benchmark information:
https://wccftech.com/intel-kaby-lake...pecifications/

The performance is about 3.3 TFLOPs.

It seems at least much stronger than a GTX 750 Ti or GTX 960.
It is still not clear whether the iGPU gets replaced by the AMD GPU or whether both are on the chip. I don't have to remind anyone here how terrible Nvidia "Optimus" is.

And most importantly, these chips are going to be expensive;
HBM2 is very expensive.
It's a faster 560, which is decent, no question. But really, wait until they are out; there is still so much missing information.
Old 26th November 2017, 06:57   #47323  |  Link
heiseikiseki
Registered User
 
Join Date: Jan 2015
Posts: 37
Hello madshi,
I have a small problem with the OSD that has confused me for a long time; I don't know whether it is a madVR bug or not.
I thought it was caused by my computer being an Optimus setup, but I recently bought a new desktop with a GTX 1060 and the problem is still there.

Here are some screenshots.

First I set presented frames to 16.

And here is the OSD.

The decoder queue and upload queue seem to work well; they are at about 17-18 (I set 18).

Some questions:

1. The render queue never reaches 18.
2. I set the present queue to 16, but the maximum shown is 15 (if I set N, it shows N-1).
3. I set the hardware decoder to D3D11 in LAV, but the OSD shows DXVA11.

Is this all normal?
Old 26th November 2017, 07:37   #47324  |  Link
Nyago123
Registered User
 
Join Date: Jan 2016
Posts: 5
My notes on nVidia driver 388.31 + madVR 0.92.9 & playing HDR on a Win10 1709+LG E6+GTX 1080+Denon X4300H AVR:

Originally, after the Fall Creators Update, the default Windows-included nVidia drivers wouldn't set HDR for me (MPC-BE). Given the reports here on the problems, I went with a 385.xx version, which was OK.

But I decided to play around with the most recent driver release (388.31). What I found is:

1. HDR works correctly with Zoom Player - it kicks in when starting the player, and disables when the player exits at least 90% of the time. One time it started with incorrect colors and one time the player hung. For the former, I just double toggled HDR in the Win 10 Display Settings; for the latter I just killed the process and tried again (to be honest, ZP freezes from time to time on me in general - nothing to do with HDR - so I can't tell if this was just "one of those times").

2. HDR works with MPC-HC but stays on when the player exits. I have to use the Win 10 Display Settings dialog and toggle on/off to get back to SDR.

3. HDR doesn't start at all with MPC-BE

I should probably spend a little more time looking at how LAV might be playing into this, but for now Zoom Player is my player of choice for HDR content until nVidia+MS can get this fixed.
Old 26th November 2017, 10:47   #47325  |  Link
chros
Registered User
 
Join Date: Mar 2002
Posts: 1,503
Quote:
Originally Posted by madshi View Post
For SDR screenshots, please compare F5 with PrintScreen. Is there a difference? You can configure F5 behaviour in the new madVR "screenshots" settings page.
Yes, there is, using F5 with the different HDR -> SDR conversions:
- pixel shader conversion is OK
- 3dlut conversion is not (the image is much darker)

PrintScreen in windowed mode works fine for both of them.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)
Old 26th November 2017, 11:48   #47326  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by ABDO View Post
That would hurt my GTX 150 Ti Boost; I really do not know.
Quote:
Originally Posted by cork_OS View Post
Please don't remove this option, RCA high is too slow for GTX 1060 and below.
Quote:
Originally Posted by Werewolfy View Post
Unfortunately, I don't think so, because the GPU load is quite a bit higher with high quality. Even on my GeForce GTX 1080 I can see a difference, so I suspect weaker GPUs will struggle with this.
Ok!

Quote:
Originally Posted by Fabulist View Post
Thank you for taking the time to reply. Does Sharpen Edges really do any bloating on 1080p+ sources? I mean objectively and in technical terms, by a specific amount or %?

I ran tests on various sets and sources and I am unable to see any kind of bloating while running Sharpen Edges at 4 (like other sharpeners do) on up to 65 inches. Is there something I am missing, or am I not looking at what I should be looking at? Does it bloat in a different way than other sharpeners do, one which I cannot identify, like a different kind of video distortion?
Sharpen Edges internally does some supersampling, which reduces bloating. It's all somewhat subjective; please trust your eyes and pick values that look good to you. This is all not really scientific. If you want scientific, you'd have to use deconvolution instead of sharpening, but even then, which deconvolution kernel would you use? Gaussian or another? Linear light or gamma light?
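The linear-vs-gamma-light question above is easy to demonstrate. Here is a toy numpy sketch of my own (not madVR code; the kernel size, unsharp amount, and gamma 2.2 are all assumptions) showing that sharpening the same signal in gamma-coded values versus linear light gives different results:

```python
import numpy as np

def gaussian_blur_1d(x, sigma=1.0, radius=3):
    """Blur a 1-D signal with a small Gaussian kernel (edge-clamped)."""
    t = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (t / sigma) ** 2)
    k /= k.sum()
    padded = np.pad(x, radius, mode="edge")
    return np.convolve(padded, k, mode="valid")

def unsharp(x, amount=0.5):
    """Classic unsharp mask: x + amount * (x - blur(x))."""
    return x + amount * (x - gaussian_blur_1d(x))

# A simple gamma-coded ramp (values in 0..1, assumed gamma 2.2 encoding).
gamma_signal = np.linspace(0.1, 0.9, 16)

# Sharpen directly on gamma-coded values ("gamma light")...
sharp_gamma = unsharp(gamma_signal)

# ...versus converting to linear light first, sharpening, converting back.
sharp_linear = unsharp(gamma_signal ** 2.2) ** (1 / 2.2)

# The two results differ: the size and placement of over/undershoot depends
# on the working space, which is what makes the choice matter for halos.
print(np.max(np.abs(sharp_gamma - sharp_linear)))
```

The same ambiguity applies to deconvolution: the "right" answer depends on which space (and which kernel) you assume the blur happened in.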

Quote:
Originally Posted by Blackwalker View Post
I'll try with a 385.2x driver and check that "HDR and Advanced Color" is turned off.
Hope it will work for you!

Quote:
Originally Posted by yukinok25 View Post
It's strange, though; I am actually using NGU Sharp at medium for chroma, so why would the crash report NNEDI3?

Is it because I left a "let madVR decide" option on somewhere?
Let madVR decide will never switch to NNEDI3 if you haven't selected it. In your bug report there were 2 crashes. One was not clear. The other one pointed to a crash in the Nvidia OpenCL driver. madVR uses OpenCL only for NNEDI3, so my conclusion was that you probably used NNEDI3. Please check the OSD (Ctrl+J) to confirm that NNEDI3 is really not used, neither in chroma upscaling nor image upscaling. If you're really not using NNEDI3, then I'm at a loss why the Nvidia OpenCL driver crashed!

Quote:
Originally Posted by Gopa View Post
Newbie question: I have a 65" 4K TV, so upscaling 720p anime results in a heavy GPU load: direct x4 luma upscaling to 2880p and then downscaling to 2160p. Why is it not possible to just upscale 720p to 2160p (x3 instead of x4)?
It's possible, but difficult to do, and I'm not sure it would actually produce competitive results. It might look worse than x4 + downscaling. Furthermore, it would probably be only barely faster than x4 upscaling + downscaling. Anyway, x3 is still on my list of things to look at, but it's a low priority atm, because I doubt its usefulness.
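For a rough sense of the pixel counts involved in the x4-then-downscale path versus a hypothetical direct x3, here is some illustrative arithmetic (pixel counts only; this is not madVR's actual cost model, since per-pixel cost differs between doubling and arbitrary-ratio scaling):

```python
# 720p source, 2160p target, as in the question above.
src_w, src_h = 1280, 720
target_w, target_h = 3840, 2160

# NGU-style doubling works in powers of two: 2x then 2x gives 4x.
x4 = (src_w * 4) * (src_h * 4)   # 5120 x 2880 intermediate, then downscaled
x3 = (src_w * 3) * (src_h * 3)   # 3840 x 2160, hitting the target directly

print(src_w * 4, src_h * 4)      # 5120 2880
print(x4 / x3)                   # the 4x intermediate has ~1.78x the pixels
```

So the x4 intermediate produces roughly 78% more pixels than a direct x3 would, plus a downscaling pass; whether a dedicated x3 network would actually be faster or better in practice is exactly the open question in the reply above.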

Quote:
Originally Posted by heiseikiseki View Post
Some Kaby Lake-G parts use 4 GB of HBM2 memory for the GPU, so I think it would be OK for 4K.
Yes, that looks promising, actually! We'll have to wait and see how it works in reality, but it *could* be suitable for madVR!

Quote:
Originally Posted by huhn View Post
It is still not clear whether the iGPU gets replaced by the AMD GPU or whether both are on the chip. I don't have to remind anyone here how terrible Nvidia "Optimus" is.
FWIW, the key problem with Optimus is that there are really 2 totally separate GPUs, each of which has its own HDMI/DVI driver. And what is worse: the actual driving of the display (and HDMI output) is done by the Intel GPU, while the rendering is done by the Nvidia GPU, so the 2 GPUs have to work together. Basically, Nvidia has to transport the rendered frames to the Intel GPU to be sent to the display. Practically, this means both drivers have to work together. That's like driver bugs ^ 2!

If the new Intel CPUs with integrated AMD GPUs only have one GPU instead of two, most (or even all) of the Optimus problems should not occur.

Quote:
Originally Posted by ryrynz View Post
Play anything, pause, set screen power-off to 1 minute and wait. Let the screen blank, then wait a bit; most of the time video (but not audio) playback is frozen upon resuming.
For anyone else having this issue (is it Nvidia only?), try setting LAV to software decoding (hardware decoding "none") until a fix is found.
So it only occurs with hardware decoding? Does it occur with DXVA2 copyback *and* native? How about D3D11?

Quote:
Originally Posted by heiseikiseki View Post
Hello madshi,
I have a small problem with the OSD that has confused me for a long time; I don't know whether it is a madVR bug or not.
I thought it was caused by my computer being an Optimus setup, but I recently bought a new desktop with a GTX 1060 and the problem is still there.

The decoder queue and upload queue seem to work well; they are at about 17-18 (I set 18).

Some questions:

1. The render queue never reaches 18.
2. I set the present queue to 16, but the maximum shown is 15 (if I set N, it shows N-1).
3. I set the hardware decoder to D3D11 in LAV, but the OSD shows DXVA11.

Is this all normal?
That's all perfectly normal. The queues don't have to be 100% full. As long as they're nearly full, everything's fine. The key thing to look at is that you don't get frame drops/repeats or presentation glitches increasing all the time during playback.
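As an aside, the N-1 behaviour of the present queue has a plausible mechanical explanation: one slot's frame is always "on the glass" being scanned out, so at most N-1 frames can ever be queued and waiting. The following toy model is my own illustration of that idea, not madVR's actual presentation code:

```python
from collections import deque

def max_waiting_frames(queue_size, frames=100):
    """Simulate a present queue: the renderer appends frames; whenever the
    queue is full, the display takes one frame out to show it on screen."""
    waiting = deque()
    peak = 0
    for f in range(frames):
        waiting.append(f)            # renderer presents a frame
        if len(waiting) == queue_size:
            waiting.popleft()        # display consumes one: now on screen
        peak = max(peak, len(waiting))
    return peak

print(max_waiting_frames(16))  # -> 15, matching the "set N, see N-1" report
```

Under this model the OSD reading of 15 for a configured 16 is steady-state behaviour, not a shortfall.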

Quote:
Originally Posted by Nyago123 View Post
My notes on nVidia driver 388.31 + madVR 0.92.9 & playing HDR on a Win10 1709+LG E6+GTX 1080+Denon X4300H AVR:

Originally, after the Fall Creators Update, the default Windows-included nVidia drivers wouldn't set HDR for me (MPC-BE). Given the reports here on the problems, I went with a 385.xx version, which was OK.

But I decided to play around with the most recent driver release (388.31). What I found is:

1. HDR works correctly with Zoom Player - it kicks in when starting the player, and disables when the player exits at least 90% of the time. One time it started with incorrect colors and one time the player hung. For the former, I just double toggled HDR in the Win 10 Display Settings; for the latter I just killed the process and tried again (to be honest, ZP freezes from time to time on me in general - nothing to do with HDR - so I can't tell if this was just "one of those times").

2. HDR works with MPC-HC but stays on when the player exits. I have to use the Win 10 Display Settings dialog and toggle on/off to get back to SDR.

3. HDR doesn't start at all with MPC-BE

I should probably spend a little more time looking at how LAV might be playing into this, but for now Zoom Player is my player of choice for HDR content until nVidia+MS can get this fixed.
Interesting. But why not stick to 385.xx until the issue is fixed?

Quote:
Originally Posted by chros View Post
Yes, there is, using F5 with the different HDR -> SDR conversions:
- pixel shader conversion is OK
- 3dlut conversion is not (the image is much darker)

PrintScreen in windowed mode works fine for both of them.
Can you give me a few more details? How is the 3dlut conversion not OK? Is the 3dlut not applied for the screenshot? Or is it applied incorrectly? What happens if you disable the 3dlut? Are F5 and PrintScreen identical then for SDR movies?
Old 26th November 2017, 11:53   #47327  |  Link
ryrynz
Registered User
 
Join Date: Mar 2009
Posts: 3,226
Quote:
Originally Posted by madshi View Post
So it only occurs with hardware decoding? Does it occur with DXVA2 copyback *and* native? How about D3D11?
It occurs with all of those; I haven't been able to reproduce it on Intel.
Old 26th November 2017, 12:08   #47328  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
In case you guys wonder how NGU Sharp compares to mpv's latest FSRCNN(X), here's a little comparison:

Blu-Ray screenshot
downscaled (PNG) | (JPG, 100 quality)
latest FSRCNN32
latest FSRCNNX32
NGU Sharp - High
NGU Sharp - Very High

To make things as fair as possible I've downscaled the image with Bicubic/Catrom, which is exactly what FSRCNN and FSRCNNX were trained for.
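For reference, "Catrom" is the Catmull-Rom case of the bicubic family (B=0, C=0.5). Here is a small sketch of that weight function (my own illustration of the standard kernel, not the exact resampler either renderer uses):

```python
def catmull_rom(x):
    """Catmull-Rom cubic kernel (bicubic with B=0, C=0.5)."""
    x = abs(x)
    if x < 1:
        return 1.5 * x**3 - 2.5 * x**2 + 1
    if x < 2:
        return -0.5 * x**3 + 2.5 * x**2 - 4 * x + 2
    return 0.0

# Interpolating kernel: weight 1 at the sample itself, 0 at integer
# neighbours, so it reproduces the source pixels exactly at their centres.
print(catmull_rom(0.0), catmull_rom(1.0), catmull_rom(2.0))  # 1.0 0.0 0.0
```

Because the kernel interpolates (and its weights at any phase sum to 1), downscaling with it is a well-defined, reproducible reference, which is presumably why it makes a fair training/testing target here.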

Here are benchmark numbers, for 720p doubling:

Code:
Nvidia 1070:
FSRCNN16: 15.270 ms
FSRCNNX16: 26.397 ms
FSRCNN32: 46.290 ms
FSRCNNX32: ? (estimated: 80.021 ms)
NGU-Sharp High: 3.940 ms
NGU-Sharp Very High: 11.800 ms
Code:
AMD 560:
FSRCNN16: 14.289 ms
FSRCNNX16: 24.412 ms
FSRCNN32: 45.235 ms
FSRCNNX32: ? (estimated: 77.282 ms)
NGU-Sharp High: 12.970 ms
NGU-Sharp Very High: 37.100 ms
These are very weird benchmark results, to say the least. We know that NGU doesn't run as well as it should on AMD Polaris GPUs. But FSRCNN(X) running (ever so slightly) faster on my AMD 560 than on my Nvidia 1070 is just plain weird.
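To put the millisecond numbers above into playback terms, here is a quick conversion to the maximum frame rate each render time would allow (upscaling cost only; real playback adds decoding and presentation overhead, so actual headroom is smaller):

```python
# Nvidia 1070 numbers from the benchmark above, for 720p doubling.
def max_fps(render_ms):
    """Upper bound on sustainable frame rate if each frame costs render_ms."""
    return 1000.0 / render_ms

for name, ms in [("FSRCNNX32 (est.)", 80.021),
                 ("FSRCNN32", 46.290),
                 ("NGU-Sharp Very High", 11.800),
                 ("NGU-Sharp High", 3.940)]:
    print(f"{name}: {max_fps(ms):.1f} fps max")
```

At 23.976 fps each frame has a ~41.7 ms budget, so on these numbers FSRCNN32 and FSRCNNX32 cannot even sustain movie frame rates on a 1070, while both NGU Sharp settings leave headroom for 60 fps content.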

Last edited by madshi; 26th November 2017 at 13:29.
Old 26th November 2017, 12:26   #47329  |  Link
ABDO
Registered User
 
Join Date: Dec 2016
Posts: 64
Quote:
Originally Posted by madshi View Post
In case you guys wonder how NGU Sharp compares to mpv's latest FSRCNN(X)
Well, no surprise here: NGU Sharp looks best, crisp and closest to the Blu-ray.

edit:
I see that FSRCNN introduces some colour shift.

Last edited by ABDO; 26th November 2017 at 12:32.
Old 26th November 2017, 12:33   #47330  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 50
Quote:
Originally Posted by Gopa View Post
Always open to suggestions for alternative settings.
What do you think about NGU Sharp without any extra sharpness? IMHO:
NGU Sharp looks very sharp by default; edges look thin, sharp and in focus. It's a different situation when you use NGU AA for image upscaling. NGU AA + post-sharpening can be better than NGU Sharp for low-quality sources, because NGU AA can fix some edges where NGU Sharp/Standard just sharpens them.

Try something like this:

Artifact removal - Debanding: Low/Low or Med/Med
Artifact removal - Deringing: Off
Artifact removal - RCA: What you like
Image enhancements: Off
Chroma: NGU AA High
Image doubling: NGU Sharp High (it looks like you prefer a sharp image)
Chroma doubling: Off / let madVR decide
Upscaling refinement: Off
Dithering: Ordered

edit:
Quote:
Originally Posted by ABDO View Post
Well, no surprise here: NGU Sharp looks best, crisp and closest to the Blu-ray.
+

Last edited by thighhighs; 26th November 2017 at 13:37.
Old 26th November 2017, 13:20   #47331  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
Quote:
Originally Posted by madshi View Post
In case you guys wonder how NGU Sharp compares to mpv's latest FSRCNN(X), here's a little comparison:

Blu-Ray screenshot
downscaled (PNG) | (JPG, 100 quality)
latest FSRCNN32
latest FSRCNNX32
NGU Sharp - High
NGU Sharp - Very High

To make things as fair as possible I've downscaled the image with Bicubic/Catrom, which is exactly what FSRCNN and FSRCNNX were trained for.
Are you sure everything is correct? I mean, FSRCNN has chroma bleeding...

In terms of anti-aliasing (ignoring chroma) I give FSRCNN more points, but I would almost certainly use NGU AA on this source.
I'm not a fan of NGU Sharp on anime in general; lines can look really strange.
Old 26th November 2017, 13:34   #47332  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by huhn View Post
Are you sure everything is correct? I mean, FSRCNN has chroma bleeding...
Well, I don't know how to activate FSRCNN(X) for chroma doubling. I don't know if it's possible. If anyone knows, let me know and I'll redo the FSRCNN(X) screenshots.

Quote:
Originally Posted by huhn View Post
In terms of anti-aliasing (ignoring chroma) I give FSRCNN more points, but I would almost certainly use NGU AA on this source.
I'm not a fan of NGU Sharp on anime in general; lines can look really strange.
Can you point me to where in the image exactly you see FSRCNN having lower aliasing than NGU Sharp?
Old 26th November 2017, 13:38   #47333  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
Quote:
Originally Posted by madshi View Post
FWIW, the key problem with Optimus is that there are really 2 totally separate GPUs, each of which has their own HDMI/DVI driver inside. And what is worse: The actual driving of the display (and HDMI output) is done by the Intel GPU, while the rendering is done by the Nvidia GPU, so the 2 GPUs have to work together. Basically Nvidia has to transport the rendered frames to the Intel GPU, to be sent to the display. Practically this means, both drivers have to work together. That's like driver bugs ^ 2!

If the new Intel CPUs with integrated AMD GPUs only have one GPU instead of two, most (or even all) of the Optimus problems should not occur.
Sadly, using the Intel iGPU for the desktop and so on makes a lot of sense; the whole system is clearly made for battery life.
I mean, the price of HBM is about 3x that of GDDR5, and currently the major benefit of HBM is power consumption.
AMD even has a zero-power GPU function, and not using it would be quite a waste.
I'm just guessing here; whatever the case, it will take some time until we see these devices anyway.

Quote:
These are very weird benchmark results, to say the least. We know that NGU doesn't run as well as it should on AMD Polaris GPUs. But FSRCNN(X) running (ever so slighty) faster on my AMD 560 than on my Nvidia 1070 is just plain weird.
Do they use OpenCL?
I mean, even with the handbrake copyback issues on the AMD side, these cards were usually still a lot faster than Nvidia cards back in the NNEDI3 days.
Old 26th November 2017, 13:45   #47334  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
Quote:
Originally Posted by madshi View Post
Can you point me to where in the image exactly you see FSRCNN having lower aliasing than NGU Sharp?
The big baby's head, for example.

The red hat, top center (where FSRCNN has the huge chroma issue).

BTW, the chroma issue is visible in the JPG screenshot.
Old 26th November 2017, 13:45   #47335  |  Link
Neo-XP
Registered User
 
Join Date: Mar 2016
Location: Switzerland
Posts: 140
Quote:
Originally Posted by madshi View Post
Well, LL on/off sharpen different kinds of edges with different strengths, so in some parts of the image LL off will sharpen more than LL on, and vice versa. I'm surprised you got halos, though. Do you happen to have a couple good test images where I can see why you prefer LL off? That might help with development.
A quick comparison with The Neon Demon ( Bluray / UHD Bluray ):



Let's try to upscale the Bluray ( NGU very high / NGU very high + AS 0.5 / NGU very high + AS 0.5 LL ):



Zoomed (AS with LL off and on) : http://screenshotcomparison.com/comparison/124629

I prefer LL off, because it is sharper for the same value. Higher values with LL only accentuate artifacts around the edges, so it is not fair to compare with different values.

The old NGU is also closer to the UHD Bluray :



Zoomed : http://screenshotcomparison.com/comparison/124639

The old NGU also looks a lot better to me for this movie, but I do not know whether it was downscaled with LL or not.
Old 26th November 2017, 13:45   #47336  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by huhn View Post
Are you sure everything is correct? I mean, FSRCNN has chroma bleeding...
I've updated the mpv screenshots; they have less chroma bleeding now.

Quote:
Originally Posted by huhn View Post
Do they use OpenCL?
No, OpenGL/GLSL.
Old 26th November 2017, 14:02   #47337  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by huhn View Post
The big baby's head, for example.

The red hat, top center (where FSRCNN has the huge chroma issue).
I'm sorry, I'm having trouble seeing it. Where do you see aliasing there? Do you mean the top line of the big baby's head? If so, there's some minor aliasing there, but it's also in the Hi-Res image! Shouldn't you give NGU bonus points for reproducing the Hi-Res image more faithfully, warts and all?

Talking about the big baby's head: how about the more obvious differences, like the big baby's mouth, the eyebrows, the nose?

Quote:
Originally Posted by Neo-XP View Post
A quick comparison with The Neon Demon ( Bluray / UHD Bluray )

Old NGU looks a lot better also for me for this movie, but I do not know if it was downscaled with LL or not.
Thanks!
Old 26th November 2017, 14:20   #47338  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,463
Quote:
Originally Posted by madshi View Post
Is that 4x NGU Sharp in one step, followed by SR? Or is it 2X NGU Sharp + SR + 2x NGU Sharp + SR?
Direct 4x NGU Sharp > SR@2 LL for both luma & chroma > SSIM 2D 100% AB25%. Hard to believe how good VCD looks now with all the bells and whistles enabled, huh!

I still wish you would properly align chroma for MPEG-1, especially as nevcariel said that he was willing to cooperate, but it still looks great when outputting properly aligned RGB32 from ffdshow, so no biggy I guess.

Quote:
Originally Posted by madshi View Post
I wonder if I should maybe remove the "quality" option and auto pick, based on the strength?
Please don't; I can afford "high" on some videos but not on others, and I have some blocky ones that actually appear to look better with "medium". I see "high" as the icing on the cake if I have GPU cycles to spare and PQ improves.

For the record, I only use RCA@1; anything higher is too strong for my taste.

And somewhat unrelated to mVR: as much as we care about shielding of A/V cables, it seems that power cables are vastly overlooked, and a triple-shielded garden-hose one has tremendously improved sharpness on my Sammy TV. An identical cable actually did the same on my G3-550 powered PC, but then again I pick up 40+ Wi-Fi networks here, so not having them interfere with my A/V gear does appear to do the magic.

Last edited by leeperry; 26th November 2017 at 14:43.
Old 26th November 2017, 15:12   #47339  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
Quote:
Originally Posted by madshi View Post
I'm sorry, I'm having trouble seeing it. Where do you see aliasing there? Do you mean the top line of the big baby's head?
That line.
Quote:
If so, there's some minor aliasing there, but it's also in the Hi-Res image! Shouldn't you give NGU bonus points for reproducing the Hi-Res image more faithfully, warts and all?
Well, that topic again.
I guess we want the image close to the original, but one bonus of upscaling was anti-aliasing. NGU is just so sharp that it is hard to totally avoid aliasing.
Quote:
Talking about the big baby's head: How about the more obvious differences, like the big baby's mouth, the eye brows, the nose?
NGU is generally sharper and the line thickness is closer to the original.
But I just gave FSRCNN (and I thought waifu2x was a bad name...) more points for anti-aliasing and nothing else. The new image isn't that much better now; the chroma bleeding was hiding the aliasing quite a bit. The thin lines from NGU have a downside too: at the bottom line of the red hat, top center, the line is nearly completely gone, while FSRCNN makes it relatively big; but the source has this line.

Is there a reason you have to use a JPG with mpv? Don't tell me it doesn't support PNG... but why else would you add a JPG and a PNG as source...
Old 26th November 2017, 16:10   #47340  |  Link
Soulnight
Registered User
 
Join Date: Jan 2017
Location: Germany
Posts: 21
Quote:
Originally Posted by Polopretress View Post
I agree. It would be great to make the option "command line to execute when this profile is activated/deactivated" work.
madshi: any idea when you could make this function work?

http://bugs.madshi.net/view.php?id=210