Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
15th March 2011, 20:01 | #16621 | Link |
Registered User
Join Date: May 2007
Posts: 54
|
Found the solution
Copying the files below to C:\Windows fixed it: cook.dll, drv33260.dll, drv43260.dll, pncrt.dll, sipr3260.dll. Thank you! ==== I wonder if I can use MPC as a UDP streaming client like VLC; is that possible? Also, I have a TS35 RC with a 2.5mm jack (it came with the SkyStar HD2); can I use my RC with MPC? |
16th March 2011, 02:53 | #16624 | Link |
Registered User
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
|
mr.duck, make sure to test with: http://forum.doom9.org/showthread.php?t=156191
A support ticket would be good. |
16th March 2011, 03:26 | #16625 | Link |
Registered User
Join Date: Jan 2010
Posts: 479
|
I think it will never work with the current GeForce cards (any on the market right now), only with the Quadro cards that have native DisplayPort outputs (so soft-modded GeForce cards won't work either, as I don't know of any GeForce card with a real DisplayPort output) and official 10-bit support (which only works with real 10-bit capable displays, where the EDID from the display input reports the support, like professional displays and a few "lucky" mainstream models).
The only reason it works with many Radeon VGAs is that they always output either real 10-bit or dithered 8 bit/color from their LUT (no matter what is in the frame buffer), so they can support 10-bit output on capable displays (just like the Quadros) or emulate 10-bit precision on 8-bit displays (which is most likely what happens with most displays without a DisplayPort input; even though HDMI 1.3+ supports deep color up to 12 bit/color as well, I think the Radeons always output dithered 8-bit on their HDMI outputs, just as they do on single-link DVI, etc. -> but this HDMI part is only my speculation...) It's not necessarily a bad thing at all, as you probably end up using the built-in dithering of the Radeons anyway. However, you can also apply dithering at the software level, where you can control it (turn it off completely, choose static or dynamic methods, and control the amount of the dynamic noise...), while the spatial/temporal dithering in the hardware is fixed (it cannot be turned off or adjusted in any way). Moreover, software dithering works from the surface bit depth (up to 32-bit floating point with the full floating point presentation mode), while hardware dithering starts from the frame-buffer bit depth, which is limited to 10-bit integer (at least in DX9, and I wouldn't expect a DX11 renderer any time soon...). So I think I would prefer to eliminate this hardware (VGA) level dithering if the display doesn't really support 10-bit, and use software-level dithering instead. On the other hand, the hardware (VGA) level dithering does a really good job if you calibrate your display through your VGA LUT; in that case it also makes sense to use 10-bit frame buffers with 8-bit displays. -> This is why I tried to improve the CMS implementation in MPC-HC.
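To illustrate the software-level dithering described above, here is a minimal sketch (not MPC-HC's or madVR's actual code) of quantizing a high-precision surface down to 8-bit, with optional dynamic (random) dither noise. The function name and the triangular noise shape are my own choices for the example, assuming NumPy:

```python
import numpy as np

def dither_to_8bit(surface, dynamic=True, amount=1.0, seed=None):
    """Quantize a float surface (values in 0..1) to 8-bit integers.

    dynamic=True adds random noise before rounding (the "dynamic"
    dithering the post mentions); dynamic=False just rounds, which
    leaves visible banding on smooth gradients.
    """
    scaled = surface * 255.0
    if dynamic:
        rng = np.random.default_rng(seed)
        # Triangular noise of +/- `amount` LSB breaks up banding
        # while keeping the average output level correct.
        noise = rng.triangular(-amount, 0.0, amount, size=surface.shape)
        scaled = scaled + noise
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

# A sub-LSB gradient that 8-bit cannot represent exactly:
grad = np.linspace(0.0, 1.0 / 255.0, 1000)
plain = dither_to_8bit(grad, dynamic=False)       # collapses to two flat bands
dithered = dither_to_8bit(grad, dynamic=True, seed=0)  # average tracks the signal
```

The point of the sketch: without noise the ramp quantizes to just two output codes, while the dithered version preserves the gradient in the spatial average, which is exactly the trade-off (noise vs. banding) being discussed.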
In theory, I think it would be best to leave the 1D VGA LUT alone (which has 10-bit effective precision with VGA-level dithering and 8-bit otherwise) and do all corrections at the software level (with up to 32-bit floating point precision). I can outline three main cases:
A) An 8-bit display (let's forget about the 6-bit TNs for now, they will never be accurate) with 8-bit effective VGA LUT precision (most non-TN [or stupid c-PVA] displays with old [pre-HDxxx] Radeons or any GeForce card). -> In this case, I believe it's best to skip any kind of hardware-level (display OSD or VGA LUT) calibration, let the software do the CMS, and apply high-quality dithering for the 8-bit output.
B) A real 10-bit display (or, as a borderline case, a native 8-bit display with internal dithering and a 10+ bit capable controller) with a VGA card that can really output native 10 bit/color. (As far as I know this is only supported through DisplayPort; HDMI and even dual-link DVI should theoretically work too, but it's not guaranteed...) This is a very rare case today (only some professional and very few mainstream displays on the market). -> In this case, I still believe it's better to avoid the hardware-level corrections and do high-quality software corrections with fine dithering for the 10-bit output.
C) Your professional display has a 12+ bit programmable internal 3DLUT and >12-bit internal processing (very rare and expensive). -> Congratulations: carry out the hardware calibration and enjoy your perfect display. (You may still need to apply some dithering for the 10-bit output due to the limited-range YCC -> full-range RGB conversion, but the dithering noise is almost unnoticeable on real 10-bit displays; and you don't need ICC-profile-based color corrections...)
And of course there are some unique "mixed" situations (like a professional display with only 1D internal LUTs). Those require individual testing to judge what the best solution is.
(I think hardware calibration should aim for panel-friendly targets and let the software carry out most of the 3D color corrections.) The "small" problem with my ideas is that they go straight against the good old professional routines and standards (like the ICC standard), which work well enough in practice. -> But note that the kind of software-level corrections I am talking about were not possible (at least not in real time) until we started to use fast VGA cards to carry them out. (You can see that full floating point processing and big 3DLUTs are still heavy for today's hardware...) /// And note that I said they work "well enough". It's obviously not perfect, because otherwise nobody would pay for the expensive professional displays with programmable high-precision internal 3DLUTs... ------------- A practical problem I face right now is that today we mostly use LCD displays (or LCD projectors) with relatively low contrast ratios, which would require smart black-point compensation in the CMS engine, but that doesn't seem to exist (or work well) in current ICC-based CMS engines. (NeoPDP panels offer high contrast, but they aren't issue-free either: they apply heavy hardware-level dithering, for example, which makes profiling and software-level dithering harder, if not effectively impossible [in good quality].) -> At this point I should mention that yCMS is going in a very nice direction (in my opinion), but development seems slow. (It still doesn't work with separate per-channel TRC data to offer proper white balance correction... and some "broken" digital displays would require even more, like the XYZ cLUT based profiles do in the ICC world -> but those have their own "issues" as well, so...) ---- I think I went much further than answering your question, so a little summary: -> Don't worry about 10-bit output; it's not the holy grail (only a very little piece of the apple).
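For readers unfamiliar with the black-point compensation mentioned above: the simplest form is a linear rescale that maps content black onto the panel's actual (non-zero) black instead of crushing shadows. This is only a sketch of that idea; the function name and numbers are illustrative, not taken from any particular CMS engine:

```python
def black_point_compensate(y, src_black=0.0, dst_black=0.001):
    """Remap relative luminance y (0..1) so that source black lands on
    the display's real black level instead of being clipped; white (1.0)
    is preserved and everything in between is compressed slightly.
    """
    return dst_black + (1.0 - dst_black) * (y - src_black) / (1.0 - src_black)

# Example: an LCD with ~1000:1 contrast has a black level of roughly
# 0.001 relative to white, so content black (0.0) is lifted to 0.001
# while white stays at 1.0.
```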
(Or you can buy a real 10-bit display and a Quadro with a DP output. -> You will still need high-quality software CMS unless that high-end display offers high-bit internal 3DLUTs; and even then you still need high-quality software processing because of the YCC->RGB conversion, since it's not possible to feed the hardware with the source format directly. So you need software processing, or hardware processing, but that's a whole different story without PCs and player software...) --- Otherwise, I am still looking forward to seeing madVR support 10-bit output. (And I hope this statement doesn't disturb you. ) Last edited by janos666; 16th March 2011 at 03:31. |
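To make the YCC->RGB precision argument above concrete: expanding limited-range (16-235) luma to full range (0-255) uses a scale factor of 255/219, so nearly every output value is fractional and an 8-bit integer output cannot hold the result exactly. A tiny sketch (the helper name is mine; the scale factor is the standard limited-range expansion):

```python
def expand_limited_to_full(y8):
    """Expand an 8-bit limited-range luma code (16..235) to full range.

    The 255/219 factor makes almost every result fractional, which is
    why this step wants either a 10-bit (or deeper) path or dithering
    before an 8-bit output.
    """
    return (y8 - 16) * 255.0 / 219.0

print(expand_limited_to_full(16))   # 0.0   (black)
print(expand_limited_to_full(235))  # 255.0 (white)
print(expand_limited_to_full(17))   # ~1.164, not representable in 8-bit
```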
16th March 2011, 03:40 | #16626 | Link |
Registered User
Join Date: Jan 2010
Posts: 479
|
Another thing -> bugreport (?)
I can't play DivX SD material with EVR-CP (using JanWillem's latest x86 test build, even if I disable all the fancy settings) on the PC I am currently using (not my own home PC, but a very similar one in both software and hardware). It doesn't matter which source filter or decoder I use. (None of them are DXVA decoders.) HD plays fine with every fancy setting enabled, too... I see some artifacts on the first few frames and then I need to use the RESET button (on the PC chassis). The standard EVR works fine. ------------- Another possible bug report: it seems this build has some sync problems. (Maybe inaccurate refresh rate or source fps detection?) I also noticed it on my home PC, but I wasn't sure about it because that's a fixed-60Hz panel. Here I use a display with an exact 24/1.001*3 Hz timing and it still tries to play with V-sync (and causes some judder). I had to disable V-sync. (The older build worked fine. -> I don't know which one that was before I updated to the latest...) ---- Sorry that I can't say more. There is no MSVC on this PC and I can't gather more information after a complete system breakdown...) Last edited by janos666; 16th March 2011 at 03:44. |
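A side note on the refresh-rate timing mentioned in the post above: 24/1.001*3 Hz (about 71.928 Hz) is an exact 3x multiple of 23.976 fps content, so each frame can be shown for exactly three refreshes with no judder. A hypothetical check of this kind (not how MPC-HC actually detects it) could look like:

```python
def repeat_cadence(fps, refresh_hz, tolerance=0.0005):
    """Return the integer number of refreshes per frame if the display
    refresh rate is (within tolerance) an exact multiple of the source
    frame rate; otherwise None, meaning V-sync will cause periodic judder.
    """
    ratio = refresh_hz / fps
    nearest = round(ratio)
    if nearest >= 1 and abs(ratio - nearest) / nearest <= tolerance:
        return nearest
    return None

fps = 24 / 1.001                             # ~23.976 fps film content
print(repeat_cadence(fps, 24 / 1.001 * 3))   # 3 -> each frame shown 3 times
print(repeat_cadence(fps, 60.0))             # None -> mismatch, judder
```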
16th March 2011, 03:48 | #16627 | Link |
Registered User
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
|
janos666, which build number of MPC-HC are you using now? Also, SD divx material is approximately what resolution in your case? Please also tell me what is listed in the menu: Play-> Filters and post a Ctrl+J graph which should cover most other settings. Post a video sample if possible.
|
16th March 2011, 04:36 | #16628 | Link |
Registered User
Join Date: Jan 2010
Posts: 30
|
New OSD
Hi guys!
How about displaying OSD messages like these? [screenshot] >>> [result screenshot] By analogy with the volume-control OSD. Or after pressing these keys: [screenshot] >>> [result screenshot] Last edited by Silent Rain; 16th March 2011 at 04:40. |
16th March 2011, 07:59 | #16629 | Link |
Registered User
Join Date: Nov 2008
Posts: 454
|
Hi.
I have a question about .bdmv files. Rev. 2980: I have the internal MPEG splitter disabled and am using the standalone version instead. When I open index.bdmv, MPC-HC pops up "cannot play". Is that OK? BTW, is there a way to open index.bdmv automatically in GraphStudio?
__________________
Working machine: Win10x64 + Intel Skull Canyon My HTPC. How to start with Bitcoin |
16th March 2011, 08:21 | #16630 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
|
The .bdmv handling relies on the internal MPEG splitter; if you disable it, there is no reason it should still be used for that. This also allows other DirectShow filters to be used for handling .bdmv, and doesn't force the MPEG splitter's parsing onto it.
Any reason you want the .bdmv parsing to still work, but not with the internal MPEG splitter? I could always separate the options if there is some useful use case.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
16th March 2011, 08:36 | #16631 | Link | |
Registered User
Join Date: Nov 2008
Posts: 454
|
Quote:
GB-PVR/nPVR doesn't have any Blu-ray handler; it only asks DirectShow to render the file, like GraphStudio does. So if there were a way to associate .bdmv so that it auto-opens the MPC MPEG splitter, it would be possible to play Blu-rays easily (without menus, of course). But I am not sure if that is possible. Maybe something like a ?.bdmv source filter could do it? No idea.
|
16th March 2011, 08:54 | #16632 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
|
That's not how it works: the standalone MPC-HC MPEG Splitter does not work with .bdmv files. It only does that when integrated in MPC-HC (there is some trickery going on).
Wait for the 0.20+ series of LAV Splitter; it'll add Blu-ray support, which should work for most players out of the box (just open the .bdmv and the main movie starts playing, among other features). A first version should be around next week, I hope.
16th March 2011, 11:11 | #16633 | Link |
Registered User
Join Date: Mar 2002
Posts: 2,323
|
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config |
16th March 2011, 20:28 | #16634 | Link |
Registered User
Join Date: Jan 2010
Posts: 479
|
Ignore the DivX problem. It seems the VGA was not stable at its factory clocks (an OC edition with reference hardware design and higher BIOS clocks; it's not very unusual for this kind of factory OC to fail on some cards...).
It runs fine at reference memory clocks. Interestingly, it played HD movies with DXVA just fine and also passed the FurMark stability test (I ran it for 20 minutes), yet simple DivX playback (with software decoders) could crash the whole system after a few frames. The sync problem is still there, however... |
17th March 2011, 14:16 | #16635 | Link | |
Registered User
Join Date: Mar 2011
Posts: 380
|
Quote:
Thanks for the help. Actually it's this: http://sourceforge.net/apps/trac/mpc-hc/ticket/915 but without DXVA being necessary. The Color Space Converter filter is my best friend for now(?). I can't understand exactly what triggers it. Is RGB conversion necessary on a monitor? If yes, what happens with DXVA, or with the recent changes where I get NV12 mixer output when decoding with ffdshow? Is Catalyst the problem?
|
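On the "is RGB conversion necessary" question above: yes, somewhere in the chain (mixer, renderer, or GPU) the YCbCr video has to be converted to RGB before it reaches the monitor, since desktop displays are driven with RGB. A per-pixel sketch of the limited-range BT.601 conversion (appropriate for SD material; the function is illustrative, not the actual mixer code):

```python
def yuv_to_rgb_bt601(y, cb, cr):
    """Convert one limited-range 8-bit BT.601 YCbCr pixel to full-range
    8-bit RGB using the standard BT.601 coefficients.
    """
    yf = (y - 16) * 255.0 / 219.0       # expand luma 16..235 -> 0..255
    cbf = (cb - 128) * 255.0 / 224.0    # expand chroma 16..240 -> -127.5..127.5
    crf = (cr - 128) * 255.0 / 224.0
    r = yf + 1.402 * crf
    g = yf - 0.344136 * cbf - 0.714136 * crf
    b = yf + 1.772 * cbf

    def clamp(v):
        return max(0, min(255, round(v)))

    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb_bt601(16, 128, 128))   # (0, 0, 0)       black
print(yuv_to_rgb_bt601(235, 128, 128))  # (255, 255, 255) white
```

Note that SD material like DivX normally assumes BT.601 coefficients while HD assumes BT.709, which is one reason the exact point in the chain where the conversion happens matters.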
17th March 2011, 16:57 | #16636 | Link |
Registered User
Join Date: Jan 2005
Posts: 99
|
I've made the patches for the MPC-HC filters used in the Stereoscopic Player available here:
http://www.3dtv.at/OpenSource |
17th March 2011, 20:28 | #16638 | Link | |
Registered User
Join Date: Oct 2009
Location: France
Posts: 616
|
Quote:
__________________
HTPC : i7 920 6Go Win10(x64) / Nvidia 1050Ti / P6T Deluxe / Harman-Kardon AVR-355. |
|
Tags |
dxva, h264, home cinema, media player classic, mpc-hc |