Welcome to Doom9's Forum, THE in place to be for everyone interested in DVD conversion.


Go Back   Doom9's Forum > Hardware & Software > Software players

Old 22nd December 2017, 11:22   #47901  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 698
Quote:
Originally Posted by x7007 View Post
Pixel 4:4:4, Game Console (or HDMI 1 input) TV setting, Full RGB set in TV settings (Black Level Low), NVidia set to Limited
TV: LG OLED 4K 3D 55C6
HDMI cable: tried two Monoprice Premium HDMI cables (1.5 m) and also another brand that cost $38, the highest-rated possible; the same.
madVR 0-255, which has the issues but HDR looks amazing; with 16-235 the white dots disappear but the quality drops by 50%.
INNO3D 1080GTX, default color settings
No receiver, straight to the TV's HDMI 1; I also tried HDMI 2, it's the same.
Don't have a Blu-ray player; downloading movies from the internet.

This is what's happening for me in movies, but this guy posted it for his PS4; it's the same issue.

https://youtu.be/IeFN4XT1pJ8



EDIT: found the issue and the fix.

I think I've found the issue.

I need to change the NVidia settings from Default Color to NVidia Color Settings, then choose YCbCr422 12-bit Limited. Then HDR movies work perfectly with 0-255 on the TV. When using the default, it automatically uses 12-bit RGB, so it does some weird conversion in that mode which causes the issue with white pixels. I have to select 12-bit YCbCr 4:2:2 every time I watch HDR so it won't try to use RGB. Is there any other way to do that? So many things to change manually...

With the PS4 it's the same thing. I don't have a PS4, but from the YouTube video I can tell it's the same issue, so it's not something specific to my setup: the TV doesn't go into the right mode properly. That case is a game, and you can still play games with 12-bit RGB, but HDR needs to be YCbCr 4:2:2. So I think that's the issue.
Hi, I did say a while back that I also got white dots when there were bandwidth issues; this is one of the reasons I run my HTPC at 4:2:0. I don't play any games on it, so I find this the most convenient option: I can just leave it set like this and it works well for everything, without having to change settings depending on what I'm doing.

Glad you got it sorted.
Old 22nd December 2017, 11:26   #47902  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 698
I don't think the white dots are the cables; I'm using these decent cables end to end - https://www.ebay.co.uk/itm/JuicEBitz...53.m2749.l2649.

I get these sometimes when playing HDR movies at 4:4:4 4K@24Hz; if I drop to 4:2:0 it's fine.

I guess it could be an issue with LG 4K TVs maybe, or my amp, but they both should support the current HDMI 2.0 specs.

Last edited by mclingo; 22nd December 2017 at 11:42.
Old 22nd December 2017, 11:46   #47903  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,029
Quote:
Originally Posted by mclingo View Post
I don't think the white dots are the cables
Don't guess, try a few cables.
Old 22nd December 2017, 12:23   #47904  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 698
Quote:
Originally Posted by ryrynz View Post
Don't guess, try a few cables.
He has, I think.
Old 22nd December 2017, 14:13   #47905  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 395
The thing is, if it were a problem with the TV's processing, it would manifest in a more consistent way than random white dots. If it were a problem processing brightness values, for instance, you wouldn't get random white dots: whole zones of the picture would be affected, and the artefact would be stable rather than flickering, even on a static image. Truly random white dots have been a symptom of bandwidth or signal-integrity issues since HDMI 1.0.
Even if it persists after trying many cables, it could also be an issue with the TV's inputs or the graphics card's output. Did you also try other HDMI inputs and outputs (and multiple cables on each)?
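As a rough back-of-the-envelope check (a sketch, not from this thread: active pixels only, ignoring blanking, with ~14.4 Gbit/s assumed as HDMI 2.0's 18 Gbit/s TMDS rate minus 8b/10b encoding overhead), the formats discussed here compare like this:

```python
# Rough HDMI payload estimate from active pixels only; real links also
# carry blanking intervals, so actual requirements are somewhat higher.

def signal_gbps(width, height, hz, bits, subsampling="4:4:4"):
    # Samples per pixel: 4:4:4/RGB carries 3 full-res components,
    # 4:2:2 averages 2, 4:2:0 averages 1.5.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return width * height * hz * bits * samples / 1e9

HDMI_2_0_DATA_GBPS = 14.4  # 18 Gbit/s TMDS minus 8b/10b overhead

for desc, args in [
    ("4K24 4:4:4 12-bit", (3840, 2160, 24, 12, "4:4:4")),
    ("4K24 4:2:0 12-bit", (3840, 2160, 24, 12, "4:2:0")),
    ("4K60 RGB 10-bit  ", (3840, 2160, 60, 10, "4:4:4")),
]:
    print(f"{desc}: {signal_gbps(*args):5.1f} Gbit/s "
          f"(payload limit ~{HDMI_2_0_DATA_GBPS})")
```

4K24 4:4:4 12-bit comes out around 7.2 Gbit/s, nowhere near the limit, which fits the signal-integrity (cable/connector) explanation better than a pure bandwidth one; 4K60 10-bit RGB, by contrast, genuinely doesn't fit.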
__________________
HTPC: W10 1803, E7400, 1050 Ti, DVB-C, Denon 2310, Panasonic GT60 | Desktop: W10 1809, 4690K, HD 7870, Dell U2713HM | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR
Old 22nd December 2017, 19:33   #47906  |  Link
d3rd3vil
Registered User
 
Join Date: Jun 2016
Posts: 79
It is time. We need Dolby Vision support more than ever!
Old 22nd December 2017, 20:09   #47907  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by d3rd3vil View Post
It is time. We need Dolby Vision support more than ever!
Do we really? I would argue that madVR, processing non-Dolby Vision video, can already produce a better image than some of these pre-processed Dolby Vision TV shows.
Old 22nd December 2017, 20:18   #47908  |  Link
Siso
Registered User
 
Siso's Avatar
 
Join Date: Sep 2013
Location: Bulgaria
Posts: 367
Quote:
Originally Posted by d3rd3vil View Post
It is time. We need Dolby Vision support more than ever!
Keep dreaming
Old 22nd December 2017, 21:38   #47909  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,367
Quote:
Originally Posted by Razoola View Post
Do we really? I would argue that madVR, processing non-Dolby Vision video, can already produce a better image than some of these pre-processed Dolby Vision TV shows.
The issue is Dolby Vision discs: I have a few Dolby Vision UHD Blu-rays and they don't look as good as they could.

There is a second video track included in the stream; it is only 1920x1080 and black and white. It looks like extra information for shadows and highlights, but nothing we have now can use it. They still look pretty good in HDR10 fallback mode, though.

I have been unable to get HDR content to look as good in SDR mode as it does in HDR mode, at least on my OLED TV. It can look pretty good, but putting the TV into HDR mode does improve highlights, and shadows too, though highlights are the biggest change.
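For what it's worth, some highlight loss is inherent to tone mapping itself, whatever curve madVR actually uses (its internal algorithm is not shown here). A generic extended-Reinhard operator, with luminance in units of SDR reference white and an assumed 1000-nit source peak, illustrates how the top of the range gets squeezed:

```python
# Generic extended-Reinhard tone mapping -- an illustration only, not
# madVR's actual HDR-to-SDR curve. Input 'l' is luminance in units of
# SDR reference white (1.0 = 100 nits); l_white is the source peak.

def tone_map(l, l_white=10.0):
    # Maps [0, l_white] onto [0, 1]; near-linear at the bottom,
    # heavily compressed near the peak.
    return l * (1.0 + l / (l_white * l_white)) / (1.0 + l)

# Shadows survive almost untouched...
print(tone_map(0.05))   # ~0.048
# ...but the top 20% of the source range lands in the top 4% of SDR:
print(tone_map(8.0))    # ~0.96
print(tone_map(10.0))   # 1.0
```

Displaying the same content in native HDR mode skips this compression, which is consistent with highlights being the biggest visible difference.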
__________________
madVR options explained
Old 22nd December 2017, 22:34   #47910  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,363
We don't know which is more accurate at the moment. It could be heavy dynamic contrast applied by the screen in HDR mode.
Old 22nd December 2017, 22:49   #47911  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 208
Quote:
Originally Posted by Manni View Post
No downside as long as you don't need a more recent driver for games or whatever reason. For 3D, you need the workaround I posted earlier in the thread. That's the only way to get 100% reliable 3D playback with nVidia as far as I can see. 3D also needs FSE, at least here.

You have to enable 3D manually before every playback to be sure it's on. I use a batch file launched from my iPad when I select the 3D button in iRule. It enables 3D on nVidia using MCE Controller and selects my 3D calibration on the projector. The register file added by Madshi after we had this discussion isn't enough for me, but it might work for you.
About a year ago, I noticed that playing back certain files turned off stereoscopic 3D system-wide, for some strange, bizarre reason unknown to me. HDR in particular, since this never happened before HDR. So I too wrote a script, launched via Kodi only when starting a 3D title, that enables 3D via nvstlink.exe. For some reason the 3D setup in madVR isn't 100% reliable, but the script together with madVR works perfectly. The registry file in madVR is only there to set stereoscopic settings, because the NVidia Control Panel had driver problems doing it at the time. I still use it.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10v1809 X5690 9604GB RGB 4:4:4 8bit Desktop @60Hz 8,10,12bit @Matched Refresh Rates
KODI MPC-HC/BE PDVD DVDFab
65JS8500 UHD HDR 3D
Old 23rd December 2017, 01:07   #47912  |  Link
AngelGraves13
Registered User
 
Join Date: Dec 2010
Posts: 239
Quote:
Originally Posted by Asmodian View Post
The issue is Dolby Vision discs: I have a few Dolby Vision UHD Blu-rays and they don't look as good as they could.

There is a second video track included in the stream; it is only 1920x1080 and black and white. It looks like extra information for shadows and highlights, but nothing we have now can use it. They still look pretty good in HDR10 fallback mode, though.

I have been unable to get HDR content to look as good in SDR mode as it does in HDR mode, at least on my OLED TV. It can look pretty good, but putting the TV into HDR mode does improve highlights, and shadows too, though highlights are the biggest change.
Chroma is 1/4 the size of the video under 4:2:0, so for UHD it would be 1080p. The separate video stream is the Dolby Vision enhancement layer, which cannot be decoded (yet).

I hope we'll be able to decode it sometime in the future, for better HDR.
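The 1/4 figure is per chroma plane: 4:2:0 halves chroma resolution both horizontally and vertically. A quick sketch of the plane-size arithmetic (nothing madVR-specific):

```python
# Chroma plane dimensions under common subsampling schemes.

def plane_sizes(width, height, subsampling="4:2:0"):
    sub_w, sub_h = {"4:4:4": (1, 1),
                    "4:2:2": (2, 1),
                    "4:2:0": (2, 2)}[subsampling]
    luma = (width, height)
    chroma = (width // sub_w, height // sub_h)  # per chroma plane (Cb or Cr)
    return luma, chroma

luma, chroma = plane_sizes(3840, 2160, "4:2:0")
print(luma, chroma)  # (3840, 2160) (1920, 1080)
```

So a UHD 4:2:0 stream's chroma planes are indeed 1920x1080 — the same resolution reported above for the Dolby Vision enhancement-layer track, though that track is a distinct video stream, not the chroma.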
Old 23rd December 2017, 05:35   #47913  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,063
Well, I tried my hand at downscaling UHD to 1080p today. It did not go as planned. I'm not sure what settings I need in order to make it work smoothly with my 6 GB 1060. I think I had chroma set to NGU AA low. I tried the HDR-to-SDR conversion. I did not try messing with dithering, so maybe I'll try that next. My rendering times were in the mid-40s (ms) and I was dropping frames quite a bit. The quality was quite exceptional, so if I can get it stable that will be a really nice thing. My ancient 3770 is probably getting a little tired, LOL.
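As a sanity check on those numbers (simple arithmetic, not a madVR measurement), the per-frame budget at common refresh rates is shorter than a mid-40s-millisecond render time:

```python
# Per-frame render budget at common refresh rates.

def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (23.976, 24.0, 59.94):
    print(f"{fps:>7} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

# Mid-40s render times exceed the ~41.7 ms budget at 23.976 fps,
# hence the dropped frames.
```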

__________________
HTPC: Windows 10, I7 3770k, GTX 1060, Pioneer Elite VSX-LX303, LG C8 65" OLED
Laptop: MSI GT70 Dominator (Optimus)
Old 23rd December 2017, 05:37   #47914  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,363
Did you change your power settings to something other than Optimal?
Old 23rd December 2017, 05:47   #47915  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,063
No that's set fine. I'll have to mess with it some more and see what I can come up with.

Old 23rd December 2017, 07:07   #47916  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 244
0.70.2-83 Nightly (December 13, 2017)
In case anyone missed that.
Old 23rd December 2017, 09:33   #47917  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 244
When I'm watching 1080p MKV movies, what's the best mode for my TV?

YCbCr 4:4:4 8 bit
YCbCr 4:4:4 10 bit
YCbCr 4:4:4 12 bit

RGB 8 bit
RGB 10 bit
RGB 12 bit

For 1080p I'm using the resolution changer in madVR, so it switches to 1920x1080 when it detects a 1080p movie. At that resolution it changes from 3840x2160 8-bit RGB to 1920x1080 12-bit RGB.

Windows automatically selects 12-bit RGB, for no reason I can imagine, with the NVidia Default Color setting.
As far as HDR mode goes, the best is YCbCr 4:2:2 12-bit, for obvious reasons.

I think I need to change it for 1080p movies too, and not only for HDR movies and games.

Last edited by x7007; 23rd December 2017 at 09:55.
Old 23rd December 2017, 09:50   #47918  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,363
Full range RGB is best if your end device supports it properly. This goes for HDR too.

10-bit can be better if your TV supports it properly, so just stick to RGB and go to 10-bit if possible.

Sending 12-bit doesn't gain you anything with the current madVR version; madVR itself is limited to 10-bit output to the GPU driver.
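A small sketch of the arithmetic behind that (taking the stated 10-bit madVR ceiling as given): padding 10-bit data onto a 12-bit link creates no new intermediate levels, so the extra two bits carry no picture information.

```python
# Distinct code values per channel at each bit depth.
def levels(bits):
    return 2 ** bits

for bits in (8, 10, 12):
    print(f"{bits}-bit: {levels(bits)} steps per channel")

# A 10-bit value placed on a 12-bit link is just shifted left two bits;
# only 1024 of the 4096 possible codes can ever occur.
ten_bit_max = 1023
print(ten_bit_max << 2)  # 4092
```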
Old 23rd December 2017, 09:57   #47919  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,367
I use:
SDR: RGB 8 bit
HDR: RGB 12 bit (my drivers do not offer a 10 bit option)

RGB is ideal for madVR. Technically, the GPU using 10- or 12-bit RGB with madVR set to 10-bit is the highest-quality mode with madVR. For the few HDR demos that run over 30 fps I used 8-bit RGB (due to the limits of HDMI 2.0), but it required tricking the drivers into enabling HDR with 8-bit output.
Old 23rd December 2017, 09:57   #47920  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 244
Quote:
Originally Posted by huhn View Post
Full range RGB is best if your end device supports it properly. This goes for HDR too.

10-bit can be better if your TV supports it properly, so just stick to RGB and go to 10-bit if possible.

Sending 12-bit doesn't gain you anything with the current madVR version; madVR itself is limited to 10-bit output to the GPU driver.
Yes, that's what the TV or the NVidia drivers decide to choose when you change resolution.

For HDR, NVidia decides something wrong, as I found when I tested last time. That's what I'm trying to understand: whether it's the same for SDR, because I'm also changing resolution there and it's not 4K. It's a total mess, because they are two different things and neither works out of the box.

Quote:
Originally Posted by Asmodian View Post
I use:
SDR: RGB 8 bit
HDR: RGB 12 bit (my drivers do not offer a 10 bit option)

RGB is ideal for madVR. Technically the GPU using RGB 10 or 12 bit with madVR set to 10 bit is the highest quality mode with madVR. For the few HDR demos that are over 30 fps I used RGB 8 bit (due to the limits of HDMI 2.0) but it required tricking the drivers to get HDR on with 8 bit output.
The HDMI input is set to PC INPUT at all times, so chroma 4:4:4 is enabled when NVidia is set to Auto (Default Color setting) for 4K or 1080p without HDR.
When I'm using my TV at 4K 3840x2160, it uses 8-bit RGB.
When I watch 1080p at 1920x1080, it changes to 1080p 12-bit RGB.
When I watch HDR, it is 4K 3840x2160 10-bit RGB, which the last time we checked caused an issue (white pixels on a black screen) that was fixed by setting NVidia specifically to YCbCr 4:2:2 10-bit.

So what you're saying is that it's OK to use RGB for SDR, whichever bit depth it wants to use; madVR uses 10-bit anyway, so it won't hurt. We're fine, then.

Last edited by x7007; 23rd December 2017 at 10:03.

Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
