Old 21st March 2018, 01:49   #49661  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 208
Thank you huhn. Direct link working. Can you look it over and tell me if I'm out of my mind?
This is the Sony 'Camp' video. You can d/l it here: http://4kmedia.org/sony-camping-in-nature-4k-demo/

https://i.imgur.com/QuV5CVT.jpg
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10v1809 X5690 9604GB RGB 4:4:4 8bit Desktop @60Hz 8,10,12bit @Matched Refresh Rates
KODI MPC-HC/BE PDVD DVDFab
65JS8500 UHD HDR 3D

Last edited by brazen1; 21st March 2018 at 01:56.
Old 21st March 2018, 02:24   #49662  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,352
You are sending 10 bit to the GPU driver.
It's using NV HDR.
The decoder should be LAV Filters.
The file may be faulty: no HDR metadata.

What am I supposed to see here?
Sending 10 bit to the GPU driver at UHD 60 Hz is something I would never do with HDMI 2.0, but do as you please...
Old 21st March 2018, 02:43   #49663  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 208
I use LAV Filters; otherwise MPC-HC wouldn't decode HDR without them, AFAIK.
Maybe the file is faulty. I just played a different 60fps UHD HDR file.
This one shows metadata: BT.2020 upstream and downstream, HDR 1000 nits, BT.2020 -> DCI-P3, which tells me what it was mastered in.
What is wrong with sending 10bit UHD 60Hz with HDMI 2.0? That is the native rate, and I always match them. My display is native 120Hz, FWIW. I have others that I send at their native rates, at 23Hz too.

We were having a discussion about how it's impossible to play 60Hz titles at 10bit due to the HDMI 2.0 spec limitation. Does that OSD reading confirm or deny that I'm indeed playing 60Hz at 10bit, or is it not relevant at all?
Old 21st March 2018, 03:03   #49664  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,352
Like I said before, you can't send UHD 60 Hz 10 bit RGB with HDMI 2.0, and the madVR OSD has nothing to do with that.


You are just sending stuff to the GPU driver. If you like, MPDN can do 16 bit, and if madshi wants he could add that too (the last time I heard anything about this, it was pretty broken), but that has little to nothing to do with what the GPU sends to the display.

So I'm not sure why you compare the OSD to what is sent to the display device.
Old 21st March 2018, 03:33   #49665  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 208
Thank you. Now I understand. The madVR OSD only shows what madVR sends to the GPU driver, and nothing about what is output to the display. I don't know everything about deciphering the madVR OSD like most of you; I'm learning. Thanks again.
Old 21st March 2018, 05:35   #49666  |  Link
takenori
Registered User
 
Join Date: Nov 2010
Posts: 41
Quote:
Originally Posted by jkauff View Post
I just installed the new Nvidia drivers for my GTX 1080 without using DDU or Clean Install. I created a restore point first, just in case.

Once again, the madVR custom refresh rate entries were still there, but not working. For each one, I clicked Optimize, ran the test and saved the change (no change, actually), then rebooted. Played a test movie, and the refresh rate changed as it should have.

I did this for each entry that I use (only three), and all is working normally now. So basically a new Nvidia driver includes a 3-reboot penalty for my setup, but I can install over the old version and keep my Nvidia and OS settings.
This happened to me too after the latest driver.
But simply re-checking all the custom resolutions within the Nvidia CP did the trick.
Old 21st March 2018, 08:26   #49667  |  Link
FDisk80
Registered User
 
Join Date: Mar 2005
Location: Israel
Posts: 158
Quote:
Originally Posted by Manni View Post
It's not converted to DCI-P3. It just lets you know that the content was mastered to DCI-P3 using a BT-2020 container. So you are correct to use a BT2020 calibration.

I suggested to Madshi to change the sign he uses to convey this information, as I think it is confusing to many users, but for some reason he didn't think it was confusing, so it's stayed like that because he is the boss.



It would be nice if people could refrain from making blanket statements such as this. Yes, new drivers often break things for some. Yes, sometimes they break things for the majority.

But as far as I'm concerned, the latest 390.x drivers work fine, at least in 4K. I'm set to 4K23p with a custom refresh mode created by MadVR giving me a frame drop every 1-2 hours, and I use RGB 12bits 4:4:4. When I play 4K60p, the driver swaps automatically to 8bits 4:4:4. That survives a reboot and works fine here, as it has always done.

There are two limitations that I am aware of:

1) Like all the drivers after 285.28, you have to select 12bits from a non custom refresh mode. Once a custom refresh mode is selected, the resolution is greyed out and can't be changed, but it's still 12bits in 23p and 8bits in 60p if 12bits was selected from a non-custom refresh mode (at 30p max).

2) Compatibility is broken with Asio4all and most Asio drivers. Thanks to a kind member, I was able to find Flexasio, which works fine with some limitations.

So instead of making blanket statements simply because it doesn't work for you, please post details about your rig (OS version and build, GPU model, driver version) so that we can see if there is a common link between those for whom 390.x works fine, and those for whom 390.x doesn't work.

My rig is detailed in my sig.

By the way, 391.24 was just released today, I'm about to try it.
I posted details of the issue a bunch of times. Anyway, it's a GTX 970 connected to a 1080p HDTV via HDMI, running on the latest Windows 10 with the latest Nvidia drivers; same issue with 391.24.

Tried DDU in safe mode and clean-installed the Nvidia drivers; same issue.
The last 3 driver releases cannot save a tweaked custom resolution when set to 1080p23. Same for madVR: it throws a driver error in your face if you try to apply a tweaked refresh rate setup.

And it's obviously a driver issue, since 390.77 works fine while 391.01, 391.05 and 391.24 do not.

Last edited by FDisk80; 21st March 2018 at 08:37.
Old 21st March 2018, 10:35   #49668  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 647
Quote:
Originally Posted by brazen1 View Post
Thank you huhn. Direct link working. Can you look it over and tell me if I'm out of my mind?
This is the Sony 'Camp' video. You can d/l it here: http://4kmedia.org/sony-camping-in-nature-4k-demo/

https://i.imgur.com/QuV5CVT.jpg
This screenshot shows that content is 4K60 4:2:0 10bits and you’re asking MadVR to dither to 10bits, which is wrong because the GPU can’t output 4K60p in 10bits over HDMI 2.0 in RGB 4:4:4. This doesn’t tell you anything about what is actually sent by the GPU to the display. Most likely, if using RGB 4:4:4 as you should be, the GPU is sending 4K60p 8bits, dithering behind MadVR’s back.

You should ask MadVR to dither to 8bits with 4K60p (using profiles to do this automatically) so that there is no dithering from the GPU driver done behind MadVR’s back. That or use 8bits dithering in MadVR all the time.

If you use YCbCr 4:2:2 then you can send 10 or 12 bits at 4K60p, but it's not recommended, as the worse chroma upscaling probably wipes out any (very marginal) benefit of dithering to 10 bits vs 8 bits.
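For what it's worth, madVR's profiles are driven by short rule expressions over source properties, so the 8-bit/10-bit switch can be automated. A sketch of such a rule (written from memory, so the variable names and profile labels here are assumptions; check them against the examples in madVR's own profile editor):

```
if (srcWidth > 1920) and (deintFps > 30) "4K60 - dither 8bit"
else "default - dither 10bit"
```

The two profile names would correspond to dithering groups configured for 8-bit and 10-bit output respectively.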
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000
Old 21st March 2018, 10:58   #49669  |  Link
jkauff
Registered User
 
Join Date: Oct 2012
Location: Akron, OH in the U.S.A.
Posts: 422
Quote:
Originally Posted by takenori View Post
This happened to me too after the latest driver.
But simply re-checking all the custom resolutions within the Nvidia CP did the trick.
I only use madVR custom refresh rates so I can use madshi's optimization data. I don't have any custom resolutions in Nvidia CP.
Old 21st March 2018, 11:12   #49670  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 647
Quote:
Originally Posted by brazen1 View Post
I use LAV Filters; otherwise MPC-HC wouldn't decode HDR without them, AFAIK.
Maybe the file is faulty. I just played a different 60fps UHD HDR file.
This one shows metadata: BT.2020 upstream and downstream, HDR 1000 nits, BT.2020 -> DCI-P3, which tells me what it was mastered in.
What is wrong with sending 10bit UHD 60Hz with HDMI 2.0? That is the native rate, and I always match them. My display is native 120Hz, FWIW. I have others that I send at their native rates, at 23Hz too.

We were having a discussion about how it's impossible to play 60Hz titles at 10bit due to the HDMI 2.0 spec limitation. Does that OSD reading confirm or deny that I'm indeed playing 60Hz at 10bit, or is it not relevant at all?
Not relevant at all, as explained earlier on a few occasions. 18 Gb/s is the maximum bandwidth for HDMI 2.0. That's a hardware limitation, and a best-case scenario (some older devices can only do 10 Gb/s). 4K60p RGB 4:4:4 8 bits requires 17.62 Gb/s. So as you can see, it's simply impossible to send anything higher than 8 bits unless you lower the chroma or the frame rate.
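The bandwidth arithmetic can be sanity-checked with a few lines. A back-of-the-envelope sketch: it assumes the standard CTA-861 4K timing of 4400x2250 total pixels per frame and the TMDS 10/8 coding overhead, so the exact figure differs slightly depending on which blanking numbers you plug in.

```python
def tmds_gbps(fps, bits_per_channel, subsampling="4:4:4"):
    """Approximate HDMI TMDS link rate in Gbit/s for a UHD signal.

    Uses the full transmitted frame (3840x2160 active plus blanking =
    4400x2250) and the 10/8 TMDS encoding overhead.
    """
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    data_bits_per_pixel = channels * bits_per_channel
    return 4400 * 2250 * fps * data_bits_per_pixel * 10 / 8 / 1e9

HDMI_20_LIMIT = 18.0  # Gbit/s

print(tmds_gbps(60, 8))             # ~17.8 -> just fits HDMI 2.0
print(tmds_gbps(60, 10))            # ~22.3 -> impossible over HDMI 2.0
print(tmds_gbps(60, 10, "4:2:0"))   # ~11.1 -> fits, which is how 4K60 HDR discs are carried
```

Either way, 4K60p 10-bit RGB 4:4:4 lands well above the 18 Gb/s ceiling, which is the whole point.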

Quote:
Originally Posted by FDisk80 View Post
I posted details of the issue a bunch of times. Anyway, it's a GTX 970 connected to a 1080p HDTV via HDMI, running on the latest Windows 10 with the latest Nvidia drivers; same issue with 391.24.

Tried DDU in safe mode and clean-installed the Nvidia drivers; same issue.
The last 3 driver releases cannot save a tweaked custom resolution when set to 1080p23. Same for madVR: it throws a driver error in your face if you try to apply a tweaked refresh rate setup.

And it's obviously a driver issue, since 390.77 works fine while 391.01, 391.05 and 391.24 do not.
I am not doubting you have an issue, I am only asking you to stop saying it is an issue for everyone, as you keep posting, because it's misleading and not true.

Saying "390.x drivers are broken" doesn't help anyone.

Saying that in 1080p23 you can't save tweaked custom resolution with a GTX970 on Windows 10 since 390.77 becomes more useful, because from there we can see who else has the same issue and whether they use the same GPU model or default resolution.

So thanks for posting the additional details, I hope that others with the issue will post detailed info as well, and others for whom it works fine (like myself) will do too, so we can gather more data and see if there are common points between those for whom it still works exactly the same and those for whom it's broken.

Last edited by Manni; 21st March 2018 at 11:17.
Old 21st March 2018, 13:49   #49671  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 45
Hi all,

Since I am using a JVC 7900, I am interested in the dynamic HDR of madVR.
My question is how much GPU power a system for madVR would need if I just go for dynamic HDR.

I am struggling to decide whether I should wait for the upcoming NUC Hades Canyon or build an HTPC from scratch.
I know that with an HTPC from scratch I could go for all the optimizations in madVR, but the system would be louder and more expensive.
The new NUC is reported to have performance similar to an Nvidia 1060, and if it could handle dynamic HDR my requirement would be met.

NoTechi
Old 21st March 2018, 15:19   #49672  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,005
Quote:
Originally Posted by NoTechi View Post
Hi all,

Since I am using a JVC 7900, I am interested in the dynamic HDR of madVR.
My question is how much GPU power a system for madVR would need if I just go for dynamic HDR.

I am struggling to decide whether I should wait for the upcoming NUC Hades Canyon or build an HTPC from scratch.
I know that with an HTPC from scratch I could go for all the optimizations in madVR, but the system would be louder and more expensive.
The new NUC is reported to have performance similar to an Nvidia 1060, and if it could handle dynamic HDR my requirement would be met.

NoTechi
Intel doesn't support dynamic HDR switching (yet) in madVR. You need a Nvidia or AMD GPU. Otherwise, you will have to manually toggle the HDR switch when watching HDR files.
Old 21st March 2018, 16:42   #49673  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 45
Quote:
Originally Posted by Warner306 View Post
Intel doesn't support dynamic HDR switching (yet) in madVR. You need a Nvidia or AMD GPU. Otherwise, you will have to manually toggle the HDR switch when watching HDR files.
Warner,

the upcoming NUC Hades Canyon has an AMD GPU (Radeon RX Vega M GH) paired with an Intel CPU.
The 7900 projector switches to the HDR preset as long as the HDR flag is set.
However, my question was in regard to the GPU power required for dynamic HDR while watching a movie. My understanding was that madVR analyses the currently playing video and adjusts e.g. gamma to get the best HDR settings depending on the movie scene.

NoTechi
Old 21st March 2018, 16:50   #49674  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 208
Hi Manni. Thank you. TBH, this whole 4K60p thing was just a test bed, and I've learned plenty from you guys as always. I've yet to run across a real-world example of 4K60p other than those test files, as my rips are 4K23p, but in the event they manifest this knowledge will help me deal with them, and if it's OK with you I'll hit you up for profile codes when/if applicable. Actually, I think there was that Billy Lynn title, but I don't own it. I assume these profiles for dithering are based on resolution and would be created in the 'display connected to AVR' tab?

In the meantime, I assume leaving madVR set to dither at 10bit or higher for 4K23p, and for everything from 540p through 1080p 8bit at 23Hz to 60Hz, remains the correct config when using RGB 4:4:4, or should I be using additional profiles since 1080p etc. are 8bit? In short: leave madVR at 10bit or higher for everything except 4K above 30Hz? Furthermore, I'm thinking the GPU is not dithering ahead of madVR, and madVR will dither them down in these examples? Still foggy in this area. Btw, did the new driver present any problems? Considering what I'm learning lately, there's no reason I shouldn't be using one of the newer drivers, if not the latest.
Old 21st March 2018, 17:16   #49675  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 647
Quote:
Originally Posted by brazen1 View Post
In short, leave madVR at 10bit or higher for everything except 4k above 30Hz?
That.

I haven't spent enough time with the drivers to note any new issues. I only was able to confirm that they were working fine here after checking the usual possible issues.
Old 21st March 2018, 18:24   #49676  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,005
Quote:
Originally Posted by brazen1 View Post
Hi Manni. Thank you. TBH, this whole 4K60p thing was just a test bed, and I've learned plenty from you guys as always. I've yet to run across a real-world example of 4K60p other than those test files, as my rips are 4K23p, but in the event they manifest this knowledge will help me deal with them, and if it's OK with you I'll hit you up for profile codes when/if applicable. Actually, I think there was that Billy Lynn title, but I don't own it. I assume these profiles for dithering are based on resolution and would be created in the 'display connected to AVR' tab?

In the meantime, I assume leaving madVR set to dither at 10bit or higher for 4K23p, and for everything from 540p through 1080p 8bit at 23Hz to 60Hz, remains the correct config when using RGB 4:4:4, or should I be using additional profiles since 1080p etc. are 8bit? In short: leave madVR at 10bit or higher for everything except 4K above 30Hz? Furthermore, I'm thinking the GPU is not dithering ahead of madVR, and madVR will dither them down in these examples? Still foggy in this area. Btw, did the new driver present any problems? Considering what I'm learning lately, there's no reason I shouldn't be using one of the newer drivers, if not the latest.
Remember, madVR processing starts at a bit depth higher than 10 bits to avoid color conversion errors; it simply dithers the result down to the output bit depth you set. So this choice does not change the color space, just the number of steps between each color. Outputting an 8-bit source at 10 bits means less dithering is added, creating less noise in the image. The gradient gets smoother as the bit depth is increased. So think of the image as a gradient with fixed top and bottom values: the choice of bit depth determines what is in between them, and more steps = a smoother, less noisy image.
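The gradient argument can be illustrated with a toy quantizer (plain random dither here, not madVR's error diffusion, and the ramp is a made-up test signal): at 10 bits the steps are four times finer than at 8 bits, so less noise is needed to hide the banding.

```python
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 100_000)  # ideal high-precision gradient

def quantize(signal, bits, dither=False):
    """Quantize a 0..1 signal to the given bit depth, optionally dithered."""
    levels = 2 ** bits - 1
    s = signal * levels
    if dither:
        s = s + rng.uniform(-0.5, 0.5, size=s.shape)  # trade banding steps for noise
    return np.round(s) / levels

for bits in (8, 10):
    banding = np.abs(quantize(ramp, bits) - ramp).mean()
    noisy = np.abs(quantize(ramp, bits, dither=True) - ramp).mean()
    print(bits, banding, noisy)  # mean error shrinks ~4x going from 8 to 10 bits
```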
Old 21st March 2018, 18:32   #49677  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,005
Quote:
Originally Posted by NoTechi View Post
Warner,

the upcoming NUC Hades Canyon has an AMD GPU (Radeon RX Vega M GH) paired with an Intel CPU.
The 7900 projector switches to the HDR preset as long as the HDR flag is set.
However, my question was in regard to the GPU power required for dynamic HDR while watching a movie. My understanding was that madVR analyses the currently playing video and adjusts e.g. gamma to get the best HDR settings depending on the movie scene.

NoTechi
HDR -> SDR conversion requires GPU power. HDR passthrough does not (or maybe it takes a little; I don't know, but not as much). A setting of passthrough lets the display decide how the content is mapped rather than madVR. So you can't use madVR to improve HDR presentation on an HDR-compatible display; that is up to the format used and the quality of the display.

So any GPU with at least 4GB of VRAM and HEVC decoding will do. GPUs with greater power will be more capable of using madVR processing features such as artifact removal and image upscaling. madVR is very good at upscaling 1080p Blu-rays to 4K, so consider this feature when buying a GPU for madVR. A GTX 1060, at minimum, is required to push madVR to higher settings, but a 1050 Ti will allow for basic madVR settings with no limitations on features. It all depends on how much money you want to spend. GPU prices are terrible right now, so there is no hurry to upgrade to 4K.
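To give a feel for where those GPU cycles go: HDR material is encoded with the SMPTE ST 2084 (PQ) curve, and an HDR -> SDR pass decodes it to absolute luminance and then compresses that into the display's SDR range, per pixel, per frame. A minimal sketch (the PQ constants are from the ST 2084 spec; the tone curve is a crude Reinhard-style rolloff, definitely not madVR's actual algorithm):

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode PQ-encoded code values (0..1) to absolute luminance in nits."""
    e = np.asarray(e, dtype=float)
    p = np.power(e, 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

def tonemap_to_sdr(nits, sdr_peak=100.0):
    """Crude Reinhard-style rolloff compressing HDR luminance into an SDR peak."""
    x = nits / sdr_peak
    return sdr_peak * x / (1.0 + x)

print(pq_to_nits(1.0))         # 10000.0 (PQ code value 1.0 is the format ceiling)
print(tonemap_to_sdr(1000.0))  # ~90.9: a 1000-nit highlight lands near the SDR peak
```

Add the gamut mapping and dithering around this and it's clear why the conversion costs more than passthrough.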

Last edited by Warner306; 21st March 2018 at 18:36.
Old 21st March 2018, 19:07   #49678  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 208
Thanks Warner for the further details...
I'm sorry guys. I just can't get my head wrapped around all of this. Here's what I'm struggling to understand:

Installed the new driver. RGB 4:4:4, set to my native 2160p 8bit 60Hz. Then I switched to 2160p 12bit at 23Hz and 24Hz, then set it back to 2160p 8bit 60Hz. Next I played a 2160p 23Hz HDR 10bit title, no FSE. Looking in NCP during playback, it shows 8bit at 23Hz, as if it ignored my previous command to play 23Hz at 12bit. My display does not show detailed info, so I check the info from my Denon AVR. It shows RGB 4:4:4 8bit. To me this doesn't look correct, which is why I'm asking you guys. So, during playback, I select 12bit in the NCP. I go back to the info from the AVR and now it shows RGB 4:4:4 12bit. I know the title is 10bit, so the AVR info means nothing, I guess? True? Neither does the bit depth setting in NCP? True? And madVR does not report anything beyond what the GPU is sending it? True? So how do I know if my display is outputting 8bit or taking advantage of the higher 10bit depth of an HDR title? Sorry I am so naïve!

To make understanding more difficult, after a reboot that 12bit setting no longer appears in NCP or on my AVR, even though I manually changed it during playback before rebooting. It's back to 8bit as if I never set it.

Last edited by brazen1; 21st March 2018 at 19:20.
Old 21st March 2018, 19:09   #49679  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 45
Quote:
Originally Posted by Warner306 View Post
HDR -> SDR conversion requires GPU power. HDR passthrough does not (or maybe it takes a little; I don't know. But not as much). A setting of passthrough lets the display decide how the content is mapped rather than madVR. So you can't use madVR to improve HDR presentation on a HDR-compatible display. That is up to the format used and the quality of the display.

So any GPU with at least 4GB of VRAM and HEVC decoding will do. GPUs with greater power will be more capable of using madVR processing features such as artifact removal and image upscaling. madVR is very good at the image upscaling of 1080p Blu-rays to 4K. So consider this feature when buying a GPU for madVR. A GTX 1060, at minimum, is required to push madVR to higher settings. But a 1050 Ti will allow for basic madVR settings and no limitations on features. It all depends on how much money you want to spend. GPU prices are terrible right now. So there is no hurry to upgrade to 4K.
Warner, many thanks, I'm starting to understand now!

So it looks like those who are using a projector and playing HDR content convert it to SDR BUT let madVR improve the picture, including dynamic improvements depending on the movie scene. Most likely they remove the HDR flag so the projector stays in some non-HDR but BT.2020 setting instead of auto-switching to HDR. Especially on projectors with limited lumen output compared to TVs, this "fake HDR/pimped SDR" might bring better HDR-like results than HDR passthrough or not using madVR at all.
If I get you right, this conversion to SDR plus the madVR improvements will need lots of power, and this new NUC might reach its limits there.

Thanks for the clarification, Warner, and yes, GPU prices are insane atm :/

NoTechi
Old 21st March 2018, 19:17   #49680  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,005
Quote:
Originally Posted by NoTechi View Post
Warner, many thanks, I'm starting to understand now!

So it looks like those who are using a projector and playing HDR content convert it to SDR BUT let madVR improve the picture, including dynamic improvements depending on the movie scene. Most likely they remove the HDR flag so the projector stays in some non-HDR but BT.2020 setting instead of auto-switching to HDR. Especially on projectors with limited lumen output compared to TVs, this "fake HDR/pimped SDR" might bring better HDR-like results than HDR passthrough or not using madVR at all.
If I get you right, this conversion to SDR plus the madVR improvements will need lots of power, and this new NUC might reach its limits there.

Thanks for the clarification, Warner, and yes, GPU prices are insane atm :/

NoTechi
I think passthrough is the higher-quality method. Your display knows itself best, so it should be calibrated to maximize HDR content. Every display is designed to map HDR using its own methods; there is no universal algorithm.

Last edited by Warner306; 21st March 2018 at 19:35.