Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 19th May 2016, 11:27   #581  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by BluesFanUK View Post
Is there even much of a difference between 8 bit and 10 bit? I've just bought the Dell Ultrasharp U2515H (True 8 bit) and have a 980ti. MadVR for some reason allows you to select 10 bit in the control panel then it shows as 10 bit on screen. Surely it should be detected from the GPU direct?
How? There's no reliable way to do that. That's why you have to experiment yourself.
Quote:
Originally Posted by BluesFanUK View Post
Is there a difference between 8 bit and 10 bit (8 bit + dithering)? Sorry if i'm being dense, but I couldn't notice a difference on my old Acer 4K S277HK (8 bit + dithering).
Visual quality wise: good question. If you don't see it, then it doesn't matter (I can't really see it either, but I'm using it anyway).
Performance wise: definitely! 10 bit needs more bandwidth than 8 bit (especially as the frame rate rises). So this can be another reason to use 8 bit if your display doesn't support 10 bit!
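The bandwidth point can be illustrated with a back-of-the-envelope calculation. This is a sketch only: it ignores blanking intervals and link encoding overhead, so real HDMI/DisplayPort figures are higher, but the relative cost of 10 bit is the same.

```python
# Raw pixel bandwidth for an uncompressed RGB signal (no blanking overhead).
def link_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

for bits in (8, 10):
    print(f"3840x2160@60, {bits} bit: {link_gbps(3840, 2160, 60, bits):.2f} Gbit/s")
# 8 bit comes out around 11.94 Gbit/s, 10 bit around 14.93 Gbit/s: a flat +25%
```

The 25% overhead is constant, but at UHD frame rates it is what pushes a link (or a render queue) over its limit first.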
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 19th May 2016, 12:11   #582  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
The difference is quite small. If you use good dithering, it's hard to see.
I've done tests on pro equipment (Sony OLED reference monitors) and you really need to be looking for it (and have real 10 bit next to it) to see the difference. It all depends on the footage, but with real-world samples under home viewing conditions it's not so obvious.
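Why good dithering hides the difference can be sketched in a few lines: dithering trades the missing in-between levels for noise that averages out over neighboring pixels. This is illustrative Python with plain random noise, not how any particular renderer implements it.

```python
import random

random.seed(0)  # deterministic for the averaging demo below

def quantize_10_to_8(v10, dither=False):
    # Map a 10-bit code (0..1023) onto 8 bits (0..255).
    x = v10 / 1023 * 255
    if dither:
        x += random.uniform(-0.5, 0.5)  # simple random dither before rounding
    return min(255, max(0, round(x)))

# 10-bit code 511 sits between the 8-bit codes 127 and 128 (ideal: ~127.375).
# Without dithering it always collapses to 127; with dithering the average
# over many pixels recovers the in-between level.
avg = sum(quantize_10_to_8(511, dither=True) for _ in range(20000)) / 20000
print(quantize_10_to_8(511), round(avg, 2))
```

The eye does a similar spatial averaging, which is why a well-dithered 8-bit gradient can look essentially identical to true 10 bit at normal viewing distances.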
Old 5th September 2016, 03:11   #583  |  Link
rivera
Registered User
 
Join Date: Apr 2016
Posts: 25
Regarding the RGB48 checkbox in MPC-HC's settings:
Should I leave only this checkbox checked and all the other ones unchecked?
Old 5th September 2016, 16:00   #584  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
You should enable all checkboxes to let madVR do the RGB conversion instead of LAV Filters.
Old 5th September 2016, 16:26   #585  |  Link
rivera
Registered User
 
Join Date: Apr 2016
Posts: 25
Quote:
Originally Posted by aufkrawall View Post
You should enable all checkboxes to let madVR do the RGB conversion instead of LAV Filters.
Sorry, I don't see the logic.
How are these checkboxes in LAV Filters' settings connected to madVR?
Old 5th September 2016, 17:01   #586  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by rivera View Post
Sorry, I don't see the logic.
How are these checkboxes in LAV Filters' settings connected to madVR?
Those control what LAV is allowed to send to madVR.
__________________
madVR options explained
Old 5th September 2016, 17:26   #587  |  Link
rivera
Registered User
 
Join Date: Apr 2016
Posts: 25
Quote:
Originally Posted by Asmodian View Post
Those control what LAV is allowed to send to madVR.
So, if I select only RGB48, doesn't that mean that only 16-bit color will be used?

I have Nvidia GTX960 card.
In Nvidia Control Panel "12bit" color output is selected.
TV is Panasonic PR65VT60 plasma (10bit panel).

So, just in case, I decided to permit only RGB48 output in MPC-HC.
With 1080p videos everything runs smoothly.
With 2160p videos (some test clips) there are no drops either, but rendering runs very slow (0-2 frames in the queue).
If RGB48 is not the only format selected (i.e. RGB24 and RGB32 are enabled too), then 2160p videos run smoothly.
So I wonder why this stuttering occurs.
Old 11th September 2016, 21:27   #588  |  Link
CruNcher
Registered User
 
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
Has anyone here looked deeper into the G-Sync module, beyond the variable V-blank: specifically at the scaling unit and FRC, and at the possibility that it might temporally interpolate frames on the fly, derived from the sync data exchanged with a G-Sync panel?

And maybe even interpolate a 120 or 144 Hz signal result from a cheap 60 FPS panel, which their customers then sell as "real" 120 and 144 Hz gamer displays?
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 11th September 2016 at 21:52.
Old 12th September 2016, 00:35   #589  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by CruNcher View Post
Has anyone here looked deeper into the G-Sync module, beyond the variable V-blank: specifically at the scaling unit and FRC, and at the possibility that it might temporally interpolate frames on the fly, derived from the sync data exchanged with a G-Sync panel?

And maybe even interpolate a 120 or 144 Hz signal result from a cheap 60 FPS panel, which their customers then sell as "real" 120 and 144 Hz gamer displays?
FRC? The G-Sync module doesn't do frame rate conversions, except for doubling frames when needed. You think it might be blending 144 Hz signals into a 60 Hz one to drive the panel? That seems far-fetched. It also does not have a scaling unit.

It really does not seem like that is happening, and TFTCentral's testing with high-speed pursuit cameras would surely show it.

There is a problem with frame "blending", but it is due to slow response times. On VA panels real response times can easily be 10 ms or even higher, so the panel never fully catches up; the new frame arrives before the pixels have fully settled on the last one.

Why in the 10-bit thread? Even the newest G-sync modules will only accept 8-bit input, won't they?
__________________
madVR options explained
Old 12th September 2016, 20:54   #590  |  Link
CruNcher
Registered User
 
 
Join Date: Apr 2002
Location: Germany
Posts: 4,926
I doubt they will stay at 8 bit with HDR incoming, but yeah, it's most probably too far fetched. Though I wondered whether the size of the FPGA framebuffer could be enough if Nvidia's DCC were additionally running on it.
Also, Sony is doing something similar in the driver box for the PSVR: taking the 60 Hz input and producing a 120 Hz result via FRC. And for VR it needs to be absolutely stable; there you would notice motion mispredictions almost instantly, far faster than you ever would on a desktop display.
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 12th September 2016 at 21:06.
Old 14th October 2016, 14:14   #591  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
Hi,

For some reason I can't see 10 bit in the Nvidia driver control panel (GTX 950), only 8 bit and 12 bit. What happens if I set the driver to 12 bit and have madVR output 10 bit? Does the driver just pad the signal with zeros to 12 bit, or does it do something undesirable? Also, would anyone have an idea why I can't see 10 bit in the driver? I could on my previous AMD card, with the same TV.
Old 14th October 2016, 16:49   #592  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Nvidia doesn't expose a 10-bit option if 12 bit is possible.

Just hope it is padding; nobody knows for sure what Nvidia is doing.
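If it is padding, one plausible scheme is bit replication into the 12-bit container. To be clear, this is only a guess at what the driver might do (as said above, nobody knows for sure); the sketch just shows why replication is the lossless way to widen a code:

```python
def pad_10_to_12(v10):
    # Bit replication: shift the 10-bit code up and copy its top 2 bits into
    # the new LSBs, so 0 stays 0 and full white 1023 maps to full white 4095.
    # A bare left shift would map 1023 to 4092, slightly darkening peak white.
    return (v10 << 2) | (v10 >> 8)

print(pad_10_to_12(0), pad_10_to_12(512), pad_10_to_12(1023))
```

Either way, padding of this kind is harmless: no levels are invented or lost, the 10-bit signal just rides inside the 12-bit link format.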
Old 8th November 2016, 15:43   #593  |  Link
Spc.
Registered User
 
Join Date: Nov 2016
Posts: 2
Hi everyone.
I have a Dell U2410 monitor which supports 10 bpc input; it has an 8 bit + FRC panel, and I wanted to test whether 10-bit input makes a difference in picture quality, using the method this thread mentions.

My Hardware:
Dell U2410
AMD RADEON PowerColor TurboDuo R9 280X 3GB GDDR5 OC (Tahiti)
16GB RAM
Core i7-3930k @ 4.4GHz
Windows Server 2016 Datacenter (MSDN)

Here's a test I made:

R9 280X (Tahiti) 6bpc > DisplayPort > U2410 (262144 Colors) (18 bits):
http://www.netsky.org/10bpc/6bpcamdeng.png

R9 280X (Tahiti) 8bpc > DisplayPort > U2410 (16777216 Colors) (24bits):
http://www.netsky.org/10bpc/8bpcamdeng.png

R9 280X (Tahiti) 10bpc > DisplayPort > U2410 (1073741824 Colors) (30bits):
http://www.netsky.org/10bpc/10bpcamdeng.png

I used a Canon EOS 7D Mark II with an EF 24-105mm L IS USM lens (48-bit RAW files) to photograph the image quality tests on my monitor.
As you can see, the difference in image quality between plain 8 bit and 8 bit + FRC driven with 10-bit input is huge.

** If you're viewing the pictures in Firefox, note that Firefox has a problem rendering 48-bit PNG files: because it assumes all pictures on the web are sRGB, it does not read the embedded ICC color profile, so you will see the upper blue line as purple. Use Google Chrome to get the exact colors; I hope Firefox fixes this bug soon.


I also tested the HDMI port on my Radeon: it does the same thing; 10 bpc output over HDMI gives me the same 10 bpc quality.

While AMD cards can output 10 bpc over both HDMI and DisplayPort, Nvidia cards only work above 8 bpc over DisplayPort. If you connect the Dell U2410 over HDMI, Nvidia does not let you choose higher than 8 bpc.

Here's a screenshot of the R9 280X using HDMI or DisplayPort:
http://www.netsky.org/10bpc/DELL+Tahiti.png

Here's a screenshot of the GTX 960 using DisplayPort:
http://www.netsky.org/10bpc/MaxwellG...yPortU2410.png

If I choose the HDMI connection, the 10 bpc option disappears and I only get the 8 bpc option (Nvidia).


My conclusions:
1. 8 bit + FRC looks very good when driven with 10-bit input, and the difference between 8-bit and 10-bit input is very big.
2. AMD cards can use HDMI (EDID 1.3) or DisplayPort (EDID 1.4) to output 10 bits per channel to the monitor.
3. Nvidia cards only output 10 bpc over DisplayPort (EDID 1.4); HDMI (EDID 1.3) does not work above 8 bpc on the Dell U2410.
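The color counts shown in the monitor's OSD above (262144, 16777216, 1073741824) follow directly from the per-channel bit depth; nothing vendor-specific is involved, as a quick sanity check confirms:

```python
def total_colors(bits_per_channel, channels=3):
    # Each channel has 2**bits levels; the total palette is their product.
    return (2 ** bits_per_channel) ** channels

for b in (6, 8, 10):
    print(f"{b} bpc -> {total_colors(b):,} colors ({b * 3} bits)")
# 6 bpc -> 262,144; 8 bpc -> 16,777,216; 10 bpc -> 1,073,741,824
```

So the U2410 reporting "1073741824 Colors (30 bits)" is exactly what a correctly negotiated 10 bpc link should show.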

Last edited by Spc.; 8th November 2016 at 15:50.
Old 8th November 2016, 15:55   #594  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Did you disable dithering in the GPU, or why is the 6-bit output not dithered?
Old 8th November 2016, 16:12   #595  |  Link
Spc.
Registered User
 
Join Date: Nov 2016
Posts: 2
Quote:
Originally Posted by huhn View Post
did you disable the dithering in the GPU or why is 6 bit not dithered?
Yes, I disabled dithering in the registry (AMD).
Old 8th November 2016, 16:14   #596  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
For a fairer comparison, set the madVR bit depth to the same value as the output bit depth in the CCC (RIP, BTW).
Old 25th August 2017, 19:57   #597  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Do you guys already know this test video?
https://github.com/jursonovicst/gradient
Very helpful. The Windows 10 video app clearly shows banding here for the 10-bit content, which is not the case with MPC-HC's EVR.

Handbrake seems to apply dithering automatically when converting to 8 bit, so the 10-bit gradients are also banding-free then, at least when rendered by madVR or mpv (though mpv doesn't activate dithering by default); EVR struggles with the dithered 8 bit. I expect MPDN to be banding-free as well.
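What such a gradient test exercises can be put in numbers: across a 3840-pixel ramp from black to white, 8 bit has only 256 codes, so each band is roughly 15 pixels wide and visible as banding, while 10 bit has 1024 codes with bands about four times narrower. A minimal sketch of that counting (illustrative only, not how the linked test file is generated):

```python
WIDTH = 3840  # horizontal resolution of the ramp, as in a UHD test pattern

def ramp_codes(bits):
    # Count the distinct quantized codes a full-range linear ramp hits.
    levels = 2 ** bits
    return len({round(x / (WIDTH - 1) * (levels - 1)) for x in range(WIDTH)})

for bits in (8, 10):
    print(bits, "bit:", ramp_codes(bits), "codes,",
          "band width ~", WIDTH // 2 ** bits, "px")
```

A renderer that dithers well breaks those 15-pixel bands up into noise, which is why the dithered 8-bit encode of this clip can look as smooth as the 10-bit original.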
Old 25th August 2017, 21:53   #598  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by aufkrawall View Post
Do you guys already know this test video?
https://github.com/jursonovicst/gradient
Thanks, I didn't have this file. I had the greyscale gradient one, but this one is even better.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Old 10th January 2020, 11:03   #599  |  Link
copenhagenstreaming
Registered User
 
Join Date: Dec 2019
Location: Copenhagen, Denmark
Posts: 3
Quote:
Now switch FSE On and go fullscreen again, now the GPU actually sends 10bit image to the display, and if your display supports 10bit input, you should now see 1024 gradients from black to white.
These gradients are 4 times narrower compared to 8bit and are practically indiscernible (to me).
In other words, you should NOT see any gradients.
After following your guide, I see no difference between 8 and 10 bit; the gradients are exactly the same. The lower bit modes are clearly distinguishable. In the Nvidia control panel I have selected 12 bit, as 8 and 12 bit are the only options.

What can be the issue?

Quote:
Now switch FSE
Can you outline exactly how this is done in MPC-HC?

My setup is a laptop with Geforce 1060 GTX > HDMI 2m > Denon AVR-X2600H > HDMI 7m > Epson Home Cinema 5050UB (4:4:4 up to 8-bit, 4:2:2 up to 12-bit)

Thanks a bunch!
__________________
Win10 Pro x64
MPC-HC/LAV/MadVR
Laptop w. Geforce GTX1060 > Denon AVR-X2600H > Epson Home Cinema 5050UB
Old 10th January 2020, 19:42   #600  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
You don't need FSE anymore on Windows 10.
It's not an option in MPC-HC.