Old 26th November 2022, 20:10   #761  |  Link
StrifeLeonhart
Registered User
 
Join Date: Apr 2011
Posts: 8
Quote:
Originally Posted by JNW View Post
Not exactly what you wanted, but the best you could do would be to use software decoding. That will use your CPU instead of the GPU.
My friend, that is exactly what I was trying to explain to Asmodian and exactly what I wanted to do. Thank you for putting into words what I couldn't.

I don't mind adding more work for the CPU; I'm just trying to make the CPU take the workload instead of the GPU.

Now I would like to ask Asmodian: which options in the "MaxQuality" settings would I need to change to use software decoding?
StrifeLeonhart is offline   Reply With Quote
Old 27th November 2022, 05:23   #762  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
you should stop ignoring him and understand that this will cost you GPU power...
huhn is offline   Reply With Quote
Old 27th November 2022, 23:18   #763  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by StrifeLeonhart View Post
I don't mind adding more work for the CPU; I'm just trying to make the CPU take the workload instead of the GPU.
Using software decoding does not make the CPU take the workload instead of the GPU. It makes the CPU do a lot more work, while also making the GPU do more work.

I know it is counterintuitive; it seems like the CPU doing a lot more work would mean the GPU has less to do, but the way hardware decoding works on modern GPUs means that madVR has the most GPU power available when using native hardware decoding. The GPU has extra fixed-function hardware that decodes directly into GPU memory, which takes less GPU power than copying the data from system memory.

There are reasons to use software decoding but moving workload from the GPU to the CPU is not one of them.
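
To put a rough number on that copy cost (an editor's back-of-the-envelope sketch, not part of the original post), here is roughly how much decoded frame data has to be uploaded to the GPU when frames are decoded outside GPU memory; the 4K, 8-bit NV12, 23.976 fps figures are assumptions, and 10-bit content or higher frame rates scale the number up accordingly:

Code:
# Back-of-the-envelope sketch (illustrative): data rate of decoded frames that
# must cross the PCIe bus when they are not decoded natively into GPU memory.
# Assumes 4K, 8-bit 4:2:0 (NV12), 23.976 fps; real numbers vary with format.
width, height, fps = 3840, 2160, 24000 / 1001
bytes_per_pixel = 1.5                      # NV12: 1 byte luma + 0.5 byte chroma
frame_bytes = width * height * bytes_per_pixel
mib_per_second = frame_bytes * fps / (1024 ** 2)
print(f"~{mib_per_second:.0f} MiB/s uploaded to the GPU")   # roughly 284 MiB/s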

Quote:
Originally Posted by StrifeLeonhart View Post
Now I would like to ask Asmodian: which options in the "MaxQuality" settings would I need to change to use software decoding?
You do not change anything in madVR; hardware vs. software decoding is an option in LAV Video.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 9th December 2022, 15:32   #764  |  Link
alexnt
Registered User
 
Join Date: Jan 2017
Posts: 27
Is it correct to use
1080p23, 1080p24, 1080p60, 1080p120, 2560x1440p23, 2560x1440p24, 2560x1440p60, 2560x1440p120
under "list all display modes madvr may switch to" ?

My monitor is 1440p@144 but I use it @120Hz
The content I watch on this monitor is mostly 23.976 & 24 fps.
alexnt is offline   Reply With Quote
Old 9th December 2022, 22:18   #765  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
You sadly have to check the refresh rates manually; 144 and 240 are usually really 143.x and 239.x, i.e. a multiple of the 23.976 refresh rate.

Looking at all the modes, providing these may be enough:
2560x1440p120 and 2560x1440p144

Since you run the monitor at 120 Hz, you can skip 144.

Why would you ever want to switch to 1080p?
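
As an illustration of the "multiple of 23.976" point (an editor's sketch, not part of huhn's post; the exact 143.856 and 239.76 Hz values are assumptions based on his note that 144 and 240 are usually 143.x and 239.x), a quick check of which nominal refresh rates sit near an integer multiple of 23.976 Hz:

Code:
# Illustrative check: which nominal refresh rates are close to an integer
# multiple of 23.976 Hz, so 23.976 fps content repeats each frame evenly.
content_fps = 24000 / 1001                     # 23.976...
for nominal_hz in (60, 120, 143.856, 144, 239.76, 240):
    multiple = nominal_hz / content_fps
    nearest = round(multiple)
    error_pct = abs(multiple - nearest) / nearest * 100
    print(f"{nominal_hz:>8} Hz = {multiple:6.3f} x 23.976 "
          f"(off by {error_pct:.3f}% from {nearest}x)")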
huhn is offline   Reply With Quote
Old 9th December 2022, 22:43   #766  |  Link
alexnt
Registered User
 
Join Date: Jan 2017
Posts: 27
Quote:
Originally Posted by huhn View Post
You sadly have to check the refresh rates manually; 144 and 240 are usually really 143.x and 239.x, i.e. a multiple of the 23.976 refresh rate.

Looking at all the modes, providing these may be enough:
2560x1440p120 and 2560x1440p144

Since you run the monitor at 120 Hz, you can skip 144.

Why would you ever want to switch to 1080p?
Thanks for the reply!
Because movies are 1080p, but as I understand from what you said, that is not a correct assumption.

I'll stick with 2560x1440p120

Last edited by alexnt; 9th December 2022 at 22:49.
alexnt is offline   Reply With Quote
Old 24th December 2022, 08:01   #767  |  Link
JNW
Registered User
 
Join Date: Sep 2017
Posts: 51
Quote:
Originally Posted by Asmodian View Post
Using software decoding does not make the CPU take the workload instead of the GPU. It makes the CPU do a lot more work, while also making the GPU do more work. Verified with task manager.

I know it is counterintuitive; it seems like the CPU doing a lot more work would mean the GPU has less to do, but the way hardware decoding works on modern GPUs means that madVR has the most GPU power available when using native hardware decoding. The GPU has extra fixed-function hardware that decodes directly into GPU memory, which takes less GPU power than copying the data from system memory.

There are reasons to use software decoding but moving workload from the GPU to the CPU is not one of them.



You do not change anything in madVR, hardware v.s. software decoding is an option in LAV Video.
I'd never argue with you, but that is not how it works on an Intel 11th Gen Xe. I get better performance dropping to software decoding; it most likely depends on the GPU. Verified with Task Manager. Same behaviour with madVR and mpv.


EDIT: Merry Christmas Doom9

Last edited by JNW; 24th December 2022 at 08:11.
JNW is offline   Reply With Quote
Old 24th December 2022, 08:19   #768  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,667
So what GPU do you use?
Klaus1189 is online now   Reply With Quote
Old 24th December 2022, 22:27   #769  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by JNW View Post
I get better performance dropping to software decoding.
I am curious how you measured "better performance" and which GPU you are using. Are you comparing to DXVA2 or D3D11, copy-back or native?

Do you get lower GPU power draw when running software decoding?

The point I was arguing against was the idea that you can free up GPU resources by using software decoding. Unless your GPU is pretty old, e.g. using a hybrid decoder, software decoding does not free up GPU resources. It actually uses more GPU resources due to the increased data transfer over the PCIe bus. This is usually a very minor effect with higher-end GPUs; however, it is counterintuitive if you expect the CPU to have taken a lot of work from the GPU.

Sometimes you do get better performance overall with software decoding. This is because the GPU switches to a higher power (higher clock speed) mode due to the increased workload, which actually runs madVR better. The drivers don't seem to be that good at picking power levels for madVR's workload. I suggest using a mode other than "Optimal power" for Nvidia GPUs, unless power usage is very important to you.
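
One way to answer the power-draw question on an Nvidia GPU (an editor's sketch, not from the thread; it assumes the nvidia-ml-py package, imported as pynvml, is installed) is to log power draw and graphics clock while the video plays, once with hardware decoding and once with software decoding:

Code:
# Illustrative sketch, assuming the nvidia-ml-py (pynvml) package is installed.
# Logs GPU power draw, graphics clock and utilization once per second so that
# hardware- and software-decoding runs can be compared.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU
try:
    for _ in range(30):                              # ~30 seconds of samples
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # mW -> W
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        util_pct = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"{power_w:6.1f} W  {clock_mhz:5d} MHz  {util_pct:3d}% GPU")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()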
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 4th September 2023, 02:49   #770  |  Link
zamadatix
Registered User
 
Join Date: Jan 2011
Posts: 6
Thanks for this thread; it helped me get IVTC working, because I did not realize it would be broken with DXVA or D3D11 decoding (which is the default in a lot of cases).

For those on MPC-BE you can configure this under View -> Options -> Internal Filters -> Video Decoders -> Video decoder configuration. In that window you'll find a "Hardware Acceleration" area. In my case, simply switching it to NVDEC works since I have an Nvidia GPU.
zamadatix is offline   Reply With Quote
Old 22nd October 2023, 04:14   #771  |  Link
TheGameMaster
Registered User
 
Join Date: Sep 2017
Posts: 7
Maybe someone here can help. My madVR is stuck on super-xbr for doubling. I want to continue to use NGU Anti-Alias, but no matter what I set madVR to, it still runs super-xbr. I am using the 169 beta release on an HDR system. I did a complete re-install with no change. Any ideas?
TheGameMaster is offline   Reply With Quote
Old 22nd October 2023, 12:06   #772  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
you are mixing up chroma with luma?
huhn is offline   Reply With Quote
Old 22nd October 2023, 23:45   #773  |  Link
TheGameMaster
Registered User
 
Join Date: Sep 2017
Posts: 7
Scaling Algorithms - Image Upscaling

I set it to NGU Anti-Alias and it won't switch to it. Chroma upscaling is set to NGU Anti-Alias Low. I have had it set to very high luma doubling, medium luma quadrupling, and high chroma for years. But after this update it switched, on its own, to super-xbr when you press Ctrl+J to view it. Unless you do not double at all, regardless of the setting in madVR, it only uses super-xbr for doubling. Very annoying; I want it to work like it did. I did a complete re-install and went back to an older version of madVR with no change, and that included deleting the setting.bin and the madVR registry settings in Windows 11. It's really strange.
TheGameMaster is offline   Reply With Quote
Old 24th October 2023, 18:30   #774  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Can you post a screenshot of your settings pages?
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 4th November 2023, 06:13   #775  |  Link
TheGameMaster
Registered User
 
Join Date: Sep 2017
Posts: 7
Screenshots

Here are the two screenshots: the Ctrl+J overlay and the upscaling page. As you can see, it is set to NGU Anti-Alias in the settings, but in Ctrl+J it shows super-xbr. I have uninstalled, cleared the registry, and reinstalled; same issue. I even tried going back to an earlier version of MPC-BE and madVR to see if that would fix it, but no luck.
Attached Images
TheGameMaster is offline   Reply With Quote
Old 9th November 2023, 02:53   #776  |  Link
TheGameMaster
Registered User
 
Join Date: Sep 2017
Posts: 7
The OS I was on was Windows 11 Pro 22H2, Beta channel. I upgraded to 23H2 on the Canary channel (in order to force the upgrade I had to switch channels). Post-upgrade, madVR is behaving normally again; it is no longer stuck on super-xbr. That said, being forced to use super-xbr did open my eyes to how useful that algorithm can be. It runs a lot cooler with fewer dropped frames than NGU Anti-Alias on my setup. I'm running two GPUs - a Radeon RX 5700 XT and a still-supported Nvidia GTX 750 Ti - on an Aorus 570 Master board and a Ryzen 3900 12-core CPU with 64 GB of RAM. I use the 750 Ti as my audio card and hardware video decoder (CUVID in LAV Video with hardware deinterlacing), which frees up the 5700 XT to do the rendering and presentation. Anyway, it works now.
TheGameMaster is offline   Reply With Quote
Old 9th November 2023, 14:14   #777  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
hardware decoding is actually "free".
the real cost is the upload of the decoded data, which is gigantic with a copy-back decoder like CUVID; a native decoder could be cheaper.

in any case, a 5700 XT doesn't really care, I speak from experience with it.
huhn is offline   Reply With Quote
Old 9th November 2023, 14:19   #778  |  Link
Charky
Registered User
 
Join Date: Apr 2018
Location: Paris, France
Posts: 92
Quote:
Originally Posted by TheGameMaster View Post
I'm running two GPUs - a Radeon RX 5700 XT and a still-supported Nvidia GTX 750 Ti - on an Aorus 570 Master board and a Ryzen 3900 12-core CPU with 64 GB of RAM. I use the 750 Ti as my audio card and hardware video decoder (CUVID in LAV Video with hardware deinterlacing), which frees up the 5700 XT to do the rendering and presentation. Anyway, it works now.
Seems like an unnecessarily complicated setup.

EDIT : well, huhn beat me to it
__________________
Charky

"Rule #1 : If it works, don't change anything."
Charky is offline   Reply With Quote
Old 30th January 2024, 22:07   #779  |  Link
mark0077
Registered User
 
Join Date: Apr 2008
Posts: 1,106
Just regarding this option

"calibration [disable calibration controls for this display]"

The guide mentions that if it's set to disable calibration, madVR defaults to assuming the TV is using Rec. 709. Is this still true if the content is HDR and madVR is outputting in one of its HDR modes (like passthrough or pixel shader mode), or does madVR instead assume BT.2020 in such cases?
mark0077 is offline   Reply With Quote
Old 30th January 2024, 22:16   #780  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
in HDR output it will assume BT.2020 and PQ; this also applies when "output in HDR format" is checked in pixel shader mode. otherwise, with pixel shader it will assume BT.709 and gamma 2.2.
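
For reference (an editor's sketch, not part of huhn's post), the PQ transfer function mentioned above is the SMPTE ST 2084 EOTF; a minimal Python version, using the constants from the ST 2084 specification, looks like this:

Code:
# Illustrative sketch of the PQ (SMPTE ST 2084) EOTF assumed for HDR output:
# maps a non-linear PQ-encoded value in [0, 1] to absolute luminance in nits.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded value (0..1) to luminance in cd/m^2 (nits)."""
    e = signal ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

print(pq_eotf(0.508))   # ~100 nits (video reference white)
print(pq_eotf(1.0))     # 10000 nits (PQ peak)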
huhn is offline   Reply With Quote