Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.
Old 30th October 2017, 14:32   #46941  |  Link
nsnhd
Registered User
 
Join Date: Jul 2016
Posts: 130
Quote:
Originally Posted by madshi View Post
So when showing a 1080p frame on a 1080p display, actually image quality gets lost due to the gaps between the pixels and the placement of the RGB subpixels. These problems of imperfect display technology are nicely reduced by upscaling the 1080p image to a higher resolution.
Madshi, how can I upscale a 1080p image on a 1080p display? I can only do the chroma upscaling.
Old 30th October 2017, 14:45   #46942  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
you can supersample, but that doesn't fix the issue madshi is talking about. the gaps and the subpixels will not change even when you supersample.

you need a screen with a higher resolution.

this should be the same issue: https://en.wikipedia.org/wiki/Screen-door_effect
Old 30th October 2017, 15:47   #46943  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
I have a new machine that only has Intel HD 530 graphics (Core i5-6500) and I can't seem to get MadVR working well with MPC-HC 1.7.13 at all. I'm not looking for great upscaling or anything, in fact all I really want is smooth motion; bicubic scaling is fine for me. I can't avoid tonnes of dropped frames and presentation glitches even with very low settings though.

Here's a screenshot playing a 1080p50 file (well, 1080i/25 deinterlaced by LAV) at 1080p60. Rendering times look fine for smooth playback, yet none of the queues is filling. Turning off smooth motion doesn't improve matters, and I've tried D3D9, D3D9 overlay, and D3D11. Enabling the "separate devices" options causes out-of-order frames so I've avoided those. Even fullscreen exclusive mode doesn't help.

Is there something obvious that could cause this?
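The relationship between render times and refresh rate underlying this report can be sanity-checked with a quick calculation (a sketch with the numbers mentioned in this thread; madVR's actual queueing is more involved):

```python
# Sanity check of the frame-time budget: at 60 Hz a renderer has ~16.7 ms per
# frame, so the ~8 ms render times reported here should leave plenty of
# headroom -- pointing at the decoder rather than the renderer.

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to present one frame at the given refresh rate."""
    return 1000.0 / refresh_hz

def renders_in_time(render_ms: float, refresh_hz: float) -> bool:
    """True if the measured render time fits inside the refresh interval."""
    return render_ms < frame_budget_ms(refresh_hz)

print(round(frame_budget_ms(60), 2))  # 16.67
print(renders_in_time(8.0, 60))       # True
```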
Attached Images
 
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
Old 30th October 2017, 15:59   #46944  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
could you upload the screen somewhere else?
Old 30th October 2017, 16:16   #46945  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by huhn View Post
could you upload the screen somewhere else?
Old 30th October 2017, 16:23   #46946  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
the screen is really clear about the problem.

you are not decoding fast enough. this is 1080p50, which is not part of any broadcast standard i know of.
is this software deinterlaced, and what codec is that file? madVR usually gets this information if it is VC-1, mpeg2 or h264 (and maybe others), but this information is missing.
Old 30th October 2017, 16:43   #46947  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by huhn View Post
the screen is really clear about the problem.

you are not decoding fast enough. this is 1080p50, which is not part of any broadcast standard i know of.
is this software deinterlaced, and what codec is that file? madVR usually gets this information if it is VC-1, mpeg2 or h264 (and maybe others), but this information is missing.
As I said above:

Quote:
Originally Posted by DragonQ View Post
Here's a screenshot playing a 1080p50 file (well, 1080i/25 deinterlaced by LAV) at 1080p60.
The codec is AVC. You are right that CPU usage maxes out when using avcodec, which is why the decoder queue isn't filling up. I tried DXVA2 native: CPU usage is much lower and the decoding queue is at maximum, but it still has the same problem - rendering time is way too high. I don't really understand this; surely this CPU and IGP are perfectly capable of decoding such video files?

Using EVR things are way better, aside from the judder caused by the 50/60 fps mismatch. 20% CPU usage using DXVA2 native, 50% using avcodec. Why would the CPU usage be so much higher with madVR?

Last edited by DragonQ; 30th October 2017 at 16:53.
Old 30th October 2017, 16:45   #46948  |  Link
leandronb
Guest
 
Posts: n/a
Quote:
Originally Posted by khanmein View Post
https://blogs.windows.com/windowsexp...uild-17025-pc/

We fixed the issue where if you RDP into a PC running this build with certain GPU configurations, when you go to sign in to the PC locally it will appear stuck at a black screen with only the cursor available

We fixed an issue where toggling some DX9/DX10/DX11 games between windowed and fullscreen (for example using Alt + Tab) could result in the game window becoming black on certain PCs.
Quote:
Originally Posted by madshi View Post
See khanmein's reply.

Also, which GPU and driver version is this? If Nvidia, DON'T use 387.xx or 388.xx, use 385.xx, please.

Quote:
Originally Posted by xabregas View Post
Same problem here. I reverted to Windows 8.1. Guess what? Problem solved, and it didn't hurt. Windows 10 is a bug, and when we think the bug is solved, like in the 1703 version, they invent an evil Fall Creators update to show us the bugs they came up with in 6 months of work, and introduce amazing new options like the sleeping light and the GPU tab in task manager. In 4 months I believe everything will be fine, and you will have 1-2 months of no bugs until the spring/summer Creators update.
Seems like I will either downgrade to Windows 8.1 or wait for an update. I think I will go for the first option since I just installed and set up everything as I like; it would be too time consuming going back.
I am on Nvidia driver 388.00; I will downgrade to 385. What does it change?
I will either have to use 60Hz to play movies or just deal with the current issues. I think I can live with the small stuttering when using the seekbar and the frozen image after leaving fullscreen.
Old 30th October 2017, 16:52   #46949  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
lavfilters can hardware deinterlace, but i take your answer as software deinterlacing. try it without, check the CPU load, and see if hardware decoding is used.

is ffdshow used?

whatever it is, the issue is that madVR doesn't get the frames in time. there could be a lot of reasons for that and i'm not sure what it is.
an i5 6500 should be able to decode a 1080p AVC stream without a problem.

you edited your post. can you make a screen with DXVA native? your old screen had 8 ms rendertimes.
Old 30th October 2017, 17:00   #46950  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by DragonQ View Post
I have a new machine that only has Intel HD 530 graphics (Core i5-6500) and I can't seem to get MadVR working well with MPC-HC 1.7.13 at all. I'm not looking for great upscaling or anything, in fact all I really want is smooth motion; bicubic scaling is fine for me. I can't avoid tonnes of dropped frames and presentation glitches even with very low settings though.

Here's a screenshot playing a 1080p50 file (well, 1080i/25 deinterlaced by LAV) at 1080p60. Rendering times look fine for smooth playback, yet none of the queues is filling. Turning off smooth motion doesn't improve matters, and I've tried D3D9, D3D9 overlay, and D3D11. Enabling the "separate devices" options causes out-of-order frames so I've avoided those. Even fullscreen exclusive mode doesn't help.

Is there something obvious that could cause this?
The first step, if I were you, is to play 50fps content at 50Hz, not 60Hz. This step alone will give you more processing cycles per frame. Set madVR up to change the refresh rate as needed.
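This advice can be put in numbers (a simple calculation, not anything madVR-specific):

```python
# Matching the refresh rate to the content: at 50 Hz each frame gets 20 ms of
# processing time, at 60 Hz only ~16.67 ms, so playing 50 fps content at 50 Hz
# buys roughly 3.3 ms per frame and also removes the 50-to-60 cadence judder.
budget_at_50hz = 1000.0 / 50  # 20.0 ms per frame
budget_at_60hz = 1000.0 / 60  # ~16.67 ms per frame

extra_ms = budget_at_50hz - budget_at_60hz
print(round(extra_ms, 2))  # 3.33
```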
Old 30th October 2017, 17:02   #46951  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by huhn View Post
the digicam has other issues: one of the masts of the ship nearly disappears on the NGU sharp digicam image (i guess the shutter time was too long; the image is pretty bright in general)
True.

Quote:
Originally Posted by huhn View Post
we could argue about that forever, because it's never really fair to compare two different image sizes.
IMHO my digicam shots are not perfect, but pretty fair.

Quote:
Originally Posted by huhn View Post
is a RGB pixel type correction still on your to-do list? (it has a special name i can't remember, c... correction)
Convergence correction? Yes, it's still on my to-do list, but it's only meant for front/back projection. Those devices can have different RGB convergence on the left side of the screen compared to the right, because projector lenses are not perfect, so the RGB channels can be stretched differently onto the screen. This algo will not help to fix RGB subpixel position problems in LCD monitors. I don't think that subpixel position problem can be properly fixed. Ok, I think ClearType tries to improve it. *Maybe* I could try, too, but I don't think I can do miracles there. For front projection the potential benefits are bigger.
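The basic idea behind convergence correction can be sketched in a few lines (a hypothetical illustration, NOT madVR's actual algorithm: if the lens stretches the red channel slightly wider than green, pre-scaling red by the inverse factor realigns the channels on screen):

```python
# Illustrative sketch of convergence correction for one image row: each
# channel is resampled around the row centre by the inverse of the measured
# lens stretch, so the projected channels land on top of each other.
# The stretch factors below are made-up example values.

def scale_row(row, factor):
    """Resample a row of pixel values around its centre by `factor`,
    keeping the same width (nearest-neighbour sampling)."""
    w = len(row)
    out = []
    for x in range(w):
        src = round((x - w / 2) / factor + w / 2)
        out.append(row[min(max(int(src), 0), w - 1)])
    return out

def correct_convergence_row(r, g, b, stretch=(1.01, 1.0, 0.99)):
    """Pre-distort each channel with the inverse of its lens stretch."""
    return (scale_row(r, 1 / stretch[0]),
            scale_row(g, 1 / stretch[1]),
            scale_row(b, 1 / stretch[2]))
```

A real implementation would measure the per-channel distortion across the whole screen and interpolate between neighbouring pixels instead of using nearest neighbour; this only shows why the channels need independent, slightly different scaling.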

Quote:
Originally Posted by huhn View Post
fine by me. i guess we are done with the topic now?
Yes, please!

Quote:
Originally Posted by huhn View Post
Yes, but I think the SDE article mainly talks about just the pixel gap. It's a problem that's best known in the front projection world, simply because the pixels are so large there that the pixel gap also becomes very large. It was a much bigger problem with SD or even 720p front projectors than it is now with 1080p front projectors. Also, everyone has been working on improving the fill rate to reduce this problem.

No front projector (that I know of) has RGB subpixels; they all try to project the RGB components to the same place on screen. So front projection does not have the RGB subpixel situation that LCD monitors have. They "just" have the pixel gap problem.

Quote:
Originally Posted by DragonQ View Post
I have a new machine that only has Intel HD 530 graphics (Core i5-6500) and I can't seem to get MadVR working well with MPC-HC 1.7.13 at all.
Quote:
Originally Posted by huhn View Post
you are not decoding fast enough.
^ huhn has said it all.

This does not seem to be a madVR specific problem. Your decoder is simply not fast enough. I don't know the reason.

Quote:
Originally Posted by leandronb View Post
I am on Nvidia driver 388.00; I will downgrade to 385. What does it change?
Maybe it will fix the problem? Or maybe not. We won't know until you try.
Old 30th October 2017, 17:10   #46952  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by huhn View Post
lavfilters can hardware deinterlace, but i take your answer as software deinterlacing. try it without, check the CPU load, and see if hardware decoding is used.
Good point, I've turned off YADIF for now and the CPU usage seems more normal. Still loads of frame drops though.

Quote:
Originally Posted by huhn View Post
is ffdshow used?
Not as far as I can tell. Literally MPC-HC out of the box with only changes to LAV and setting the output renderer to madVR.

Quote:
Originally Posted by huhn View Post
you edited your post. can you make a screen with DXVA native? your old screen had 8 ms rendertimes.
Even worse when using DXVA decoding and deinterlacing:



Quote:
Originally Posted by madshi View Post
This does not seem to be a madVR specific problem. Your decoder is simply not fast enough. I don't know the reason.
The decoding looks fine if I use avcodec and DXVA deinterlacing, but rendering times are through the roof:



The dropped frames disappear entirely if I deactivate deinterlacing. Could it be that, for some reason, DXVA deinterlacing is horrifically slow with Intel IGPs?

Last edited by DragonQ; 30th October 2017 at 17:16.
Old 30th October 2017, 17:29   #46953  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
can you check the temp of your CPU? without very heavy thermal throttling i have no real idea what is happening there.

your GPU takes 16 ms to deinterlace a frame...

the decoding queue still looks odd, or the upload is not fast enough, which makes no sense.

edit: try disabling deinterlacing as a test (ctrl+d); maybe the rendertimes get normal again.
and can you post the filter list from mpc-hc?

Last edited by huhn; 30th October 2017 at 17:33.
Old 30th October 2017, 17:43   #46954  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
I managed to get it playing perfectly by setting "use DXVA chroma upscaling when doing DXVA deinterlacing", setting image downscaling to DXVA2, and setting the chroma and image buffers to 10bit. Pretty sure my ancient Arrandale chip didn't need such drastic measures to play these kinds of files! I have to use D3D9 because D3D11 is a complete mess and unusable.

Not sure about throttling - without admin rights I am not sure how to view CPU temperature or even current clock speed. Everything is ticked in the MPC-HC filter list; LAV Filters are definitely being used.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7

Last edited by DragonQ; 30th October 2017 at 17:47.
Old 30th October 2017, 17:52   #46955  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
https://www.cpuid.com/softwares/hwmonitor.html

decoding a normal bitrate h264 1080p stream with software deinterlacing is no real problem for such a CPU.

i can't follow your admin rights problem.

your iGPU should be able to do more than this...
Old 30th October 2017, 18:14   #46956  |  Link
khanmein
Registered User
 
Join Date: Oct 2012
Posts: 118
@leandronb Did you try out 388.13 with the new branch r388_10-4? I suggest you either use another driver or install the preview build which fixed the issues.

Be patient to wait for the next cumulative updates. Cheers.

Last edited by khanmein; 31st October 2017 at 01:38.
Old 30th October 2017, 18:28   #46957  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by huhn View Post
https://www.cpuid.com/softwares/hwmonitor.html

decoding a normal bitrate h264 1080p stream with software deinterlacing is no real problem for such a CPU.

i can't follow your admin rights problem.

your iGPU should be able to do more than this...
I would agree, it is rather odd. My Xeon X5650 shows ~5% CPU usage whilst decoding the same file. It's clocked 1 GHz higher but is several generations behind, so I would expect the two CPUs to perform similarly per core (5% of 12 threads should equate to 15% of 4 threads). I don't see what else the CPU would be doing as long as deinterlacing is done by the GPU. I checked the drivers; they're not the latest but they're not crappy generic ones either.
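The per-core comparison above, spelled out (a simple model: total CPU % is averaged over hardware threads, so the same absolute work shows up as a smaller percentage on a machine with more threads):

```python
# Convert a total-CPU percentage between machines with different hardware
# thread counts, assuming the same absolute amount of busy thread-time.
def equivalent_total_usage(usage_pct, threads_from, threads_to):
    """usage_pct on a threads_from-thread machine expressed as the
    equivalent total percentage on a threads_to-thread machine."""
    return usage_pct * threads_from / threads_to

# ~5% on the 12-thread Xeon X5650 corresponds to ~15% on the 4-thread i5-6500
print(equivalent_total_usage(5.0, 12, 4))  # 15.0
```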
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7

Last edited by DragonQ; 30th October 2017 at 20:10.
Old 30th October 2017, 20:33   #46958  |  Link
YxP
Registered User
 
Join Date: Oct 2012
Posts: 99
Enabling "reduce compression artifacts" and/or "reduce random noise" with NNEDI3 chroma upscaling (image upscaling is fine) crashes madVR. Win10 x64, GTX 1060, 388.00. This doesn't really affect me but I thought I'd mention it. Maybe someone else can reproduce this?
Old 30th October 2017, 21:49   #46959  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by DragonQ View Post
Good point, I've turned off YADIF for now and the CPU usage seems more normal. Still loads of frame drops though.

Not as far as I can tell. Literally MPC-HC out of the box with only changes to LAV and setting the output renderer to madVR.

Even worse when using DXVA decoding and deinterlacing:

The decoding looks fine if I use avcodec and DXVA deinterlacing, but rendering times are through the roof:

The dropped frames disappear entirely if I deactivate deinterlacing. Could it be that, for some reason, DXVA deinterlacing is horrifically slow with Intel IGPs?
Why don't you let madVR do deinterlacing on its own, without DXVA enabled? You say you are using DXVA deinterlacing where? On the decoder?

Because I see madVR doing the default deinterlacing anyway...
Old 30th October 2017, 22:24   #46960  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
madVR deinterlacing is done by DXVA.
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling

