18th December 2018, 08:50   #53921
Betroz
Like someone else suggested here earlier: for those who run games and use madVR on the same PC, just run a dual-boot setup with Win 8.1 for madVR and Win 10 for games. Or two installs of Win 10, if possible.

18th December 2018, 10:57   #53922
chros
Quote:
Originally Posted by nghiabeo20
Hi, can someone help me with the calibration settings?
This is the thread for your question: https://forum.doom9.org/showthread.php?t=172783&page=14

18th December 2018, 12:06   #53923
huhn
Quote:
Originally Posted by kitame
Is TFLOPs still a reliable indicator of which card would be better for madVR? Or has that changed?

e.g.
RX Vega 56 = 10.5 TFLOPs
GTX 1070 = 7.2 TFLOPs

In reviewers' game benches both perform roughly the same.
I really wish reviewers added madVR to their lists.

It was sadly never a reliable indicator of madVR performance across different architectures.

TFLOPs figures are now fundamentally flawed too: the 1070's rated number assumes stock clocks, but all properly cooled cards boost to around 1900 MHz, not the ~1500 MHz they are sold at.

18th December 2018, 12:54   #53924
kitame
Quote:
Originally Posted by huhn
It was sadly never a reliable indicator of madVR performance across different architectures.

TFLOPs figures are now fundamentally flawed too: the 1070's rated number assumes stock clocks, but all properly cooled cards boost to around 1900 MHz, not the ~1500 MHz they are sold at.

7.2 TFLOPs is actually at 1900 MHz; the max boost is rated at 1900 MHz on Wikipedia too.
From what is described on Wikipedia, TFLOPs is calculated purely from the shader/CUDA core count and its clock speed.
By that measure the GTX 1070, which only has 1920 CUDA cores, would be way slower than the RX Vega 56 with its 3584 shaders.
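As a rough sanity check of where those numbers come from (a sketch assuming the usual peak-FP32 convention of 2 FLOPs per core per clock; the clock figures are the rated specs as I recall them, so treat them as approximate):

Code:
# Peak FP32 throughput: cores x 2 FLOPs/clock x clock (GHz) -> TFLOPs
def peak_tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000.0

print(peak_tflops(1920, 1.506))  # GTX 1070 at base clock      -> ~5.8
print(peak_tflops(1920, 1.683))  # GTX 1070 at rated boost     -> ~6.5
print(peak_tflops(1920, 1.900))  # GTX 1070 at real-world boost-> ~7.3
print(peak_tflops(3584, 1.471))  # RX Vega 56 at rated boost   -> ~10.5

Which clock you plug in is exactly why the quoted TFLOPs figures disagree.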


In any case, is there any way to get a rough estimate of their actual performance in madVR?

Edit: by the way, in certain compute benchmarks the Vega series was shown outperforming even the GTX 1080, so its TFLOPs rating isn't entirely baseless.
https://www.anandtech.com/show/11717...d-56-review/17


18th December 2018, 12:58   #53925
nevcairiel
Quote:
Originally Posted by kitame
In any case, is there any way to get a rough estimate of their actual performance in madVR?

Not from paper stats like that. You can roughly use gaming performance as an indicator, but that's not going to be fully accurate either, since game rendering uses far more parts of a GPU than just image shading.

18th December 2018, 13:04   #53926
kitame
Quote:
Originally Posted by nevcairiel
Not from paper stats like that. You can roughly use gaming performance as an indicator, but that's not going to be fully accurate either, since game rendering uses far more parts of a GPU than just image shading.

Is there a particular benchmark that would reflect madVR's workload, or at least come close to it?

18th December 2018, 13:11   #53927
huhn
Sadly, no.

For example, Polaris is fast in games on paper but slow when madVR is used with NGU; only with "any" other scaler/feature does it perform as expected.

It was much the same with NNEDI3, so who knows with the next algorithm; maybe you'll need an RTX card with tensor cores. No one knows.

18th December 2018, 13:36   #53928
kitame
I see. However, RTX cards are currently way overpriced, and there were rumors that this was intentional, to clear out Pascal cards.

Edit: speaking of benchmarks, I wonder if madshi is interested in making a benchmark for madVR.


18th December 2018, 13:38   #53929
j82k
OK, so with newer NVIDIA drivers (tested with 417.35), when starting an HDR movie:

madVR native display bitdepth set to 8-bit --> nothing happens, TV stays in SDR mode
madVR native display bitdepth set to 10-bit --> HDR triggers correctly

After starting a movie with 10-bit I can change it back to 8-bit and HDR stays active, but when closing MPC-HC the TV doesn't return to SDR mode. Only closing it with 10-bit selected returns the TV to SDR mode.

Doesn't this point to it being a madVR problem then?
Why would the bitdepth I have selected in madVR even make a difference to whether HDR triggers or not?

18th December 2018, 13:56   #53930
chros
Quote:
Originally Posted by j82k
Why would the bitdepth I have selected in madVR even make a difference to whether HDR triggers or not?

Maybe it's a hardcoded limitation, so as not to screw up other stuff?

18th December 2018, 14:17   #53931
huhn
Quote:
Originally Posted by kitame
I see. However, RTX cards are currently way overpriced, and there were rumors that this was intentional, to clear out Pascal cards.

Edit: speaking of benchmarks, I wonder if madshi is interested in making a benchmark for madVR.

This has come up a couple of times and can't be done reliably.

Turing cards are priced similarly to Pascal cards in the same performance range, at least before Pascal was sold off. Not a nice situation, to put it kindly, but it's time to stop crying about it; that's just how a monopoly works.

I'm personally not a fan of the current situation, but AMD has to take some of the blame for this too.

18th December 2018, 15:18   #53932
SweetLow
Quote:
Originally Posted by madshi
However, during playback/rendering, the whole chain is fully lossless. There's no lossy compression going on anywhere in between madVR and TV. So every dithered bit should reach the display untouched. As a result, bitdepth is much less important, because madVR's dithering is of very high quality.
We have VESA DSC on the horizon.

18th December 2018, 15:21   #53933
x7007
Quote:
Originally Posted by j82k
OK, so with newer NVIDIA drivers (tested with 417.35), when starting an HDR movie:

madVR native display bitdepth set to 8-bit --> nothing happens, TV stays in SDR mode
madVR native display bitdepth set to 10-bit --> HDR triggers correctly

After starting a movie with 10-bit I can change it back to 8-bit and HDR stays active, but when closing MPC-HC the TV doesn't return to SDR mode. Only closing it with 10-bit selected returns the TV to SDR mode.

Doesn't this point to it being a madVR problem then?
Why would the bitdepth I have selected in madVR even make a difference to whether HDR triggers or not?

I just use AUTO... it detects properly every time: with NV mode if on the proper drivers, and with OS HDR when I manually change to 10-bit in the NVIDIA Control Panel.

HDR is supposed to always be 10-bit in games and movies, right? But how do we know what bitdepth it is using when it auto-detects in a game? Because when you enable it from OS HDR it's only 8-bit + dithering, which messes up my DreamScreen color signal.

Is there any app that can tell us what bitdepth it is using?


18th December 2018, 15:42   #53934
nevcairiel
Quote:
Originally Posted by SweetLow
We have VESA DSC on the horizon.

HDMI 2.1 still has enough bandwidth for the foreseeable future without needing compression, unless you want to get into 8K, which is probably beyond the point of diminishing returns in quality.
HDMI 2.1 should be able to transport 4K, 12-bit, 120Hz at full bandwidth; that's plenty for the next couple of years.
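A back-of-the-envelope check of that claim (a sketch assuming RGB/4:4:4 and counting only active pixel data; blanking intervals push the required link rate somewhat higher):

Code:
# Uncompressed video bandwidth: width x height x refresh x 3 channels x bits/channel
def gbit_per_s(width, height, hz, bits_per_channel):
    return width * height * hz * 3 * bits_per_channel / 1e9

print(gbit_per_s(3840, 2160, 120, 12))  # ~35.8 Gbit/s of active pixel data
# HDMI 2.1's FRL signaling tops out at 48 Gbit/s, so 4K 12-bit 120Hz
# fits without needing DSC.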

18th December 2018, 15:57   #53935
SweetLow
Quote:
Originally Posted by nevcairiel
HDMI 2.1 still has enough bandwidth for the foreseeable future without needing compression, unless you want to get into 8K

I know. It's just a little clarification that some things can change.
P.S. I remember you were talking about the "noise floor" before madshi's post.

18th December 2018, 16:23   #53936
kitame
Quote:
Originally Posted by nevcairiel
unless you want to get into 8K, which is probably beyond the point of diminishing returns in quality.

Actually, that would depend on its purpose.
Like those examples used to showcase how much better 4K is than 1080p: you can fit a wider scene in without compromising quality and clarity.

To that point, this would be far better with a large-format display, completely replacing a triple-monitor setup with one gigantic screen.
And on that note, 8K is as wide as three 2560x1600 monitors side by side, with 2720 extra vertical pixels.
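(A quick arithmetic check of that, for anyone counting pixels:)

Code:
w8k, h8k = 7680, 4320   # 8K UHD resolution
print(w8k == 3 * 2560)  # True: exactly three 2560-wide panels across
print(h8k - 1600)       # 2720 extra vertical pixels vs a 1600-tall panel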


18th December 2018, 17:41   #53937
Mykola
Quote:
Originally Posted by madshi
B) FreeSync / G-SYNC

Games create a virtual world in which the player moves around, and for the best playing experience we want a very high frame rate and the lowest possible latency, without any tearing. As a result, with FreeSync/G-SYNC the game simply renders as fast as it can and then throws each rendered frame at the display immediately. This results in very smooth motion, low latency and very good playability.

Video rendering has completely different requirements. Video was recorded at a very specific frame interval, e.g. 23.976 frames per second. When doing video playback, unlike games, we don't actually render a virtual 3D world. Instead we just send the recorded video frames to the display. Because we cannot re-render the video frames from a different 3D world view position, it doesn't make sense to send frames to the display as fast as we can render them; the movie would play as if in fast forward if we did that! For perfect motion smoothness, we want the display to show each video frame for *EXACTLY* the right amount of time, which is usually 1000 / 24.000 * 1.001 = 41.708333... milliseconds.

FreeSync/G-SYNC would help with video rendering only if they had an API which allowed madVR to specify which video frame should be displayed for how long. But this is not what FreeSync/G-SYNC were made for, so such an API probably doesn't exist (I'm not 100% sure about that, though). Video renderers do not want a rendered frame to be displayed immediately. Instead they want the frames to be displayed at a specific point in time in the future, which is the opposite of what FreeSync/G-SYNC were made for.

If you believe that using FreeSync/G-SYNC would be beneficial for video playback, you might be able to convince me to implement support for that by fulfilling the following 2 requirements:

1) Show me an API which allows me to define at which time in the future a specific video frame gets displayed, and for how long exactly.
2) Donate a FreeSync/G-SYNC monitor to me, so that I can actually test a possible implementation. Developing blindly without test hardware doesn't make sense.
Any news on that? How about creating your own timer, something like a spin wait? Modern systems can easily sacrifice one CPU core. And what about modern DirectCompute? Does it have enough features to trigger the frame swap without switching back to userland code?
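(For illustration, a minimal sketch of that sleep-then-spin idea — the names and the 2 ms margin are my own choices, and whether this is accurate enough for frame presentation on Windows is exactly the open question:)

Code:
import time

def wait_until(deadline, spin_margin=0.002):
    """Coarse OS sleep, then busy-wait (spin) for the last ~2 ms."""
    while True:
        remaining = deadline - time.perf_counter()
        if remaining <= 0:
            return
        if remaining > spin_margin:
            time.sleep(remaining - spin_margin)  # cheap but imprecise
        # otherwise fall through and keep spinning on perf_counter()

frame_duration = 1.001 / 24.0  # = 41.7083... ms per frame at 23.976 fps
deadline = time.perf_counter() + frame_duration
wait_until(deadline)           # ...then present the next frame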

19th December 2018, 00:19   #53938
Olivier C.
madVR OSD Font Size

madshi,

May I ask if you could at least add a font-size parameter to IMadVRTextOsd::OsdDisplayMessage (not needed in the madVR GUI or settings atm), in order to display easily readable messages on UHD displays (especially where screen size / viewing distance is critical)?

Thanks a lot

Quote:
Originally Posted by madshi
madVR currently does not allow you to specify the OSD size or font. I might add such features in a future version. But for now my priority is making madVR feature-complete first. Cosmetic things like changing the OSD's looks are pretty low on my priority list right now. You'll have to wait a while until I get to these things, I'm sorry.

19th December 2018, 01:55   #53939
madFloyd
x264

Quote:
Originally Posted by Manni
As Asmodian said, madVR doesn't use any of the new features yet, but when/if it does, it should make a significant difference.

FYI, madshi himself doesn't advise buying GTX GPUs anymore at this stage but recommends RTX instead. He has made no promises or committed to anything, but it's most likely his intention to take advantage of the new architecture in madVR in the future.

So no difference now, but down the line possibly a significant one.

If you want the best possible PQ in HDR using madVR's excellent HDR tone mapping with all options/features available, I wouldn't buy anything below a 2080 (roughly equal to a 1080 Ti), and I'd recommend a 2080 Ti if you can afford it, to have some headroom. My 1080 Ti is struggling with the latest HDR test algos and I have to use it in D3D11 native, as copy-back isn't up to the task without degrading quality. That means I lose black bar detection with D3D11 native, or have to lower the quality of chroma upscaling (I'm using NGU AA High at the moment, so it's possible to downgrade without too much of a visual downside).

My plan is to upgrade the 1080 Ti as soon as nVidia releases a model supporting HDMI 2.1 and 7nm, at some point in 2019.
Manni, where does one choose between native and copyback? Is this a madVR setting or a LAV setting?

19th December 2018, 04:02   #53940
Asmodian
It is a LAV setting.