15th December 2013, 03:13 | #21081 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
16th December 2013, 16:16 | #21083 | Link |
Registered User
Join Date: Aug 2008
Location: the Netherlands
Posts: 851
|
Is it possible to output x.v.Color from my HTPC when madVR is used with MPC-BE and an ATI HD7950 GPU? I ask because my display device supports x.v.Color and I want to check out some MI4K (Mastered in 4K) Blu-ray content. But to actually see the larger x.v.Color color space, the whole chain must support it:
MI4K discs (which can be played with MPC-BE) => HTPC outputting x.v.Color => display device that supports x.v.Color. An example of a current setup that can do this: MI4K Blu-ray discs => PS4 => Sony VW500 projector. |
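For context on what the "larger color space" means at the signal level: x.v.Color (xvYCC) reuses the ordinary BT.709 Y'CbCr encoding but permits code values outside the normal limited range, and decoding such values with the standard matrix yields RGB outside [0, 1], i.e. colors BT.709/sRGB cannot represent. A minimal sketch of that idea (plain Python; the BT.709 matrix coefficients are the standard ones, the sample values are made up for illustration):

```python
# Sketch: decoding an out-of-range (xvYCC-style) YCbCr sample with the
# standard BT.709 matrix yields RGB outside [0, 1], i.e. a color that a
# plain BT.709/sRGB pipeline cannot represent.

def ycbcr_to_rgb_bt709(y_code, cb_code, cr_code):
    """8-bit limited-range Y'CbCr -> normalized R'G'B' (BT.709)."""
    y = (y_code - 16) / 219.0        # codes 16..235 map to 0..1
    cb = (cb_code - 128) / 224.0     # codes 16..240 map to -0.5..+0.5
    cr = (cr_code - 128) / 224.0
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

# A legal BT.709 sample stays inside [0, 1]:
print(ycbcr_to_rgb_bt709(126, 128, 128))   # mid gray, all ~0.502

# xvYCC allows chroma codes beyond 16..240; the decoded red exceeds 1.0,
# i.e. it is more saturated than anything BT.709 RGB can encode:
r, g, b = ycbcr_to_rgb_bt709(126, 128, 250)
print(r)   # ~1.36
```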
16th December 2013, 17:17 | #21084 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
First of all, you should check whether your projector can actually display colors outside of sRGB; you can see that after calibration. And in the end it "doesn't matter" whether you output it as x.v.Color or RGB.
The created 3DLUT should use as much information from the x.v.Color space as possible, but I'm missing a lot of knowledge in this area, so... first of all, make sure your device can show colors that exceed sRGB. Last edited by huhn; 16th December 2013 at 17:49. |
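Whether a measured color exceeds sRGB can be checked geometrically: the sRGB/BT.709 gamut is a triangle in CIE xy chromaticity space with primaries R(0.64, 0.33), G(0.30, 0.60), B(0.15, 0.06). A small sketch of such a point-in-triangle test (the primaries are the standard sRGB values; the test points are just examples):

```python
# Sketch: test whether a CIE xy chromaticity lies inside the sRGB/BT.709
# gamut triangle. The primaries below are the standard sRGB xy values.

SRGB_PRIMARIES = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B

def _cross(o, a, b):
    """z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside_srgb_gamut(xy, tri=SRGB_PRIMARIES):
    """True if the chromaticity is on or inside the gamut triangle."""
    r, g, b = tri
    signs = [_cross(r, g, xy), _cross(g, b, xy), _cross(b, r, xy)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(inside_srgb_gamut((0.3127, 0.3290)))  # D65 white point -> True
print(inside_srgb_gamut((0.708, 0.292)))    # BT.2020 red -> False
```

A calibration report gives you exactly such xy coordinates for the measured primaries, so this is the check "can my device show colors outside sRGB" reduced to arithmetic.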
16th December 2013, 22:45 | #21085 | Link |
Registered User
Join Date: Feb 2013
Posts: 137
|
I installed Windows 8.1 yesterday and I have a problem with the refresh rate. When the video goes into fullscreen exclusive mode, the refresh rate always switches from 24Hz to 23Hz.
I have read that this issue was fixed in madVR 86.3, so I don't understand why it still happens. I have tried madVR's display mode switcher but it doesn't fix the issue. It works well in fullscreen overlay mode but never in exclusive mode. Last edited by Werewolfy; 16th December 2013 at 23:01. |
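For context on why a "23Hz" mode exists at all: film content usually runs at 24000/1001 ≈ 23.976fps, so an exact 24.000Hz refresh rate slowly drifts against it and forces a repeated frame at regular intervals. A back-of-the-envelope calculation:

```python
# Sketch: how often a frame must be repeated when 23.976fps content is
# shown on an exact 24.000Hz display. The rates drift apart by their
# frequency difference; one extra refresh accumulates every 1/diff seconds.

from fractions import Fraction

content_fps = Fraction(24000, 1001)   # NTSC film rate, ~23.976fps
display_hz = Fraction(24, 1)          # an exact 24.000Hz mode

drift = display_hz - content_fps      # surplus refreshes per second
seconds_per_repeat = 1 / drift        # = 1001/24

print(float(seconds_per_repeat))      # ~41.7 s between repeated frames
```

So the "23Hz" mode (really 23.976Hz) is the one that matches typical film content exactly; the annoyance here is the switch happening when it is not wanted, not the mode itself.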
17th December 2013, 19:27 | #21088 | Link | |
Registered User
Join Date: Apr 2009
Posts: 1,019
|
Quote:
The only real difference is that power consumption is a bit higher, and seeking performance is not as smooth. In Windowed or Windowed Overlay mode, seeking freezes up every so often if I'm just holding down a key to seek through a video. FSE mode does not have this problem for me. Last edited by 6233638; 17th December 2013 at 19:31. |
|
18th December 2013, 00:41 | #21089 | Link |
Registered User
Join Date: Feb 2013
Posts: 137
|
Thanks for the information.
I use overlay mode because I get better rendering times, but it doesn't make a huge difference. Too bad we have these nasty issues with Windows 8.1; I really don't know why Microsoft changed something that worked perfectly before. |
18th December 2013, 03:41 | #21090 | Link | |
Registered User
Join Date: Apr 2009
Posts: 1,019
|
Quote:
Supported hardware (tablet devices right now) is able to switch from 60Hz to 48Hz when playing films without any indication that the refresh rate has changed. This gives you judder-free playback of films, and reduces power consumption. |
|
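The judder-free claim comes down to cadence: at 48Hz each 24fps frame is shown exactly twice, while at 60Hz the frames alternate between 2 and 3 refreshes (the classic 3:2 pulldown). A quick sketch of the two cadences:

```python
# Sketch: display-refresh cadence for 24fps content. At 48Hz every source
# frame occupies the same number of refreshes (judder-free); at 60Hz the
# counts alternate 2, 3, 2, 3 (3:2 pulldown judder).

def cadence(content_fps, display_hz, n_frames=6):
    """How many display refreshes each source frame occupies on screen."""
    counts = []
    for i in range(n_frames):
        start = int(i * display_hz / content_fps)
        end = int((i + 1) * display_hz / content_fps)
        counts.append(end - start)
    return counts

print(cadence(24, 48))  # [2, 2, 2, 2, 2, 2] -> even, judder-free
print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven, pulldown judder
```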
19th December 2013, 02:14 | #21091 | Link | |
Registered User
Join Date: Nov 2011
Location: Denmark
Posts: 137
|
19th December 2013, 08:28 | #21092 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
I think this might be due to the interaction of smooth motion and fade detection: frames that smooth motion generated twice (and which therefore should be dropped) are being dropped and showing up in the OSD. If you set the fade debanding level to the same as the normal level, you should stop seeing dropped frames in the OSD.
madVR also needs to re-render the last 5 frames when it detects a fade, so fade detection while debanding might cause real frame drops when using small buffer sizes. Last edited by Asmodian; 19th December 2013 at 08:33. |
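To see why smooth motion can generate a frame "twice" in the first place, here is a simplified, hypothetical model of the blending (this is not madVR's actual code, just the commonly described idea: each refresh shows the source frames overlapping its interval, weighted by the fraction of overlap):

```python
# Simplified model (NOT madVR's actual implementation) of smooth-motion
# style frame blending: each display refresh shows the source frames that
# overlap its time interval, weighted by the fraction of overlap.

from fractions import Fraction

def blend_weights(refresh_idx, content_fps=24, display_hz=60):
    """Return {source_frame_index: weight} for one display refresh."""
    t0 = Fraction(refresh_idx, display_hz)
    t1 = Fraction(refresh_idx + 1, display_hz)
    weights = {}
    frame = int(t0 * content_fps)
    while Fraction(frame, content_fps) < t1:
        f0 = Fraction(frame, content_fps)
        f1 = Fraction(frame + 1, content_fps)
        overlap = min(t1, f1) - max(t0, f0)
        if overlap > 0:
            weights[frame] = overlap / (t1 - t0)
        frame += 1
    return weights

# For 24fps on 60Hz the pattern repeats every 5 refreshes: two refreshes
# show frame N unblended, one is a 50/50 blend, two show frame N+1.
for k in range(5):
    print(k, blend_weights(k))
```

In this model, each source frame really does appear in several output refreshes, which is where OSD bookkeeping can get confusing when fade detection suppresses the blending.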
19th December 2013, 18:26 | #21093 | Link |
Kid for Today
Join Date: Aug 2004
Posts: 3,477
|
BTW, we've often discussed what a good GPU for mVR would be; the answer used to be the GTX 660, and IME a factory-overclocked HD7850 does the trick very nicely.
I constantly see gamers complaining that their CPU acts as a bottleneck in games, showing all kinds of fps comparisons to make their point, but I'm very surprised to see that the Haswell Pentiums are essentially just as fast as their old S775 quads in many benchmarks: http://www.cpubenchmark.net/high_end_cpus.html I've also seen mVR users trying to keep their GPU in a low power state to save a few watts, but the 55€ Pentium G3420, for instance, comes with a 54W TDP and would appear to be an ideal choice for foobar+mVR users. When using DXVA2 CP decoding, the only things running off the CPU would basically be audio decoding/post-processing, ReClock, LAV Splitter and mVR's deinterlacing... 54W does sound very yummy |
19th December 2013, 19:05 | #21094 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
|
Faster CPUs don't consume more power unless you actually give them more work to do.
In fact, high-end CPUs are usually made from better silicon, which offers less leakage and thus lower power use. Buying a slower CPU to save power is just wrong; it doesn't work. At best you use the same power as with a faster CPU, at worst even more. That is, of course, assuming the same workload: you can't compare 100% load on both CPUs, you have to compare the same workload, i.e. video playback, not benchmarks.
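This is essentially the "race to idle" argument: for a fixed amount of work, a faster chip spends less time at load and more time at its low idle power, so total energy can come out lower. A toy calculation (all wattages and durations are illustrative assumptions, not measurements of any real CPU):

```python
# Toy "race to idle" energy comparison. All wattages and durations are
# illustrative assumptions, not measured values for any real CPU.

def energy_joules(load_w, idle_w, busy_s, window_s):
    """Energy over a fixed time window: busy at load_w, then idling."""
    return load_w * busy_s + idle_w * (window_s - busy_s)

# Same workload over a 10-second window: the fast CPU draws more while
# busy but finishes in 2s; the slow one needs 6s at lower draw.
fast = energy_joules(load_w=40, idle_w=5, busy_s=2, window_s=10)
slow = energy_joules(load_w=25, idle_w=5, busy_s=6, window_s=10)

print(fast, slow)  # 120 J vs 170 J: the "slower, lower-power" CPU loses
```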
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
19th December 2013, 19:25 | #21095 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
The TDP is misleading: it only shows the theoretical maximum power usage, while typical usage and even Prime95-style benchmarking are normally lower, and the integrated GPU is part of the TDP too.
Two cores should save a bit of power at idle, but that's about it when the same architecture is used. |
19th December 2013, 19:32 | #21096 | Link |
Kid for Today
Join Date: Aug 2004
Posts: 3,477
|
Indeed, TDP is highly theoretical, and I realize that a 2-3X pricier CPU could yield a (slightly?) lower power consumption than a G3420 for the same job, but it might very well be the same story as with the most efficient 80+ Platinum/Titanium PSUs, which cost a lot more than a regular 80+ Bronze and come with a very low ROI in comparison. This would require a lot of real-world power measurements, I guess.
All I'm seeing is that a 55€ 54W Pentium G3420 appears to be almost as fast as my old 95W Q9450 in many benchmarks, and I barely see the CPU load of the latter going over 15% in the Windows Task Manager while watching movies. |
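The ROI comparison is easy to put numbers on: payback time is the price premium divided by the yearly savings from the wattage difference. With illustrative (assumed) numbers for an HTPC used a few hours a day:

```python
# Toy ROI calculation: how long a pricier but more efficient part takes
# to pay for itself. All numbers below are illustrative assumptions.

def payback_years(price_premium_eur, watts_saved, hours_per_day,
                  eur_per_kwh=0.25):
    """Years until the electricity savings cover the price premium."""
    kwh_saved_per_year = watts_saved / 1000.0 * hours_per_day * 365
    savings_per_year = kwh_saved_per_year * eur_per_kwh
    return price_premium_eur / savings_per_year

# e.g. a part costing 110 EUR more that saves 10W during 4h/day playback:
print(round(payback_years(110, 10, 4), 1))  # ~30 years
```

At that timescale the premium never pays for itself within the hardware's useful life, which is the same conclusion as with the Platinum/Titanium PSUs.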
20th December 2013, 01:31 | #21097 | Link | |
Audiophile
Join Date: Oct 2006
Posts: 353
|
20th December 2013, 11:48 | #21099 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Quote:
But changing back to "present several frames in advance" causes trouble in exclusive mode until I restart the player. |
|
20th December 2013, 11:58 | #21100 | Link |
_
Join Date: May 2008
Location: France
Posts: 692
|
...an excellent value for the money, considering that a lot of applications still can't use more than one thread even today (Task Manager still shows 50% usage on my 2-core CPU when some applications are working hard).
|