21st January 2015, 08:27   #1
Remicade
HDMI-HDMI vs DVI-HDMI on Nvidia?

I have a GTX 750 Ti. When I use an HDMI-to-HDMI connection the graphics card outputs 16-235, and when I use DVI-to-HDMI it outputs 0-255. Why?
21st January 2015, 09:58   #2
Ghitulescu
It's very simple:
DVI is for computers.
HDMI is for video.
21st January 2015, 16:05   #3
hello_hello
Quote:
Originally Posted by Ghitulescu
It's very simple:
DVI is for computers.
HDMI is for video.
Was he referring to the video output levels or the "global" output levels?

"DVI for computers" and "HDMI for video" doesn't really work in that context as lots of video cards have HDMI out rather than DVI, while TVs have dedicated HDMI/PC inputs and computers are used to play video.

Remicade,
Is there a setting to change the video output levels in the Nvidia control panel under the Video section? Is it available for both HDMI and DVI? If so, do either work?
You weren't very specific about whether you need help changing the levels, whether either method produces the "correct" levels, or whether you were referring only to the video levels.

I've read lots of complaints in the past regarding HDMI always being limited range with Nvidia drivers, but I think it's only a problem if you're running a version of Windows newer than XP.
Apparently the Nvidia drivers automatically select a "global output range" based on whether the chosen resolution is from the TV or the PC list of resolutions in the Nvidia Control Panel. There's a thread here including a link to a utility for forcing full-range video output if you need it, and a post explaining how the "global output range" affects the video levels, although I'm not sure I've got my head around it.

It seems like it might be a good idea. Connect at PC resolutions and refresh rates and Windows runs in full range mode as it's always done, or connect at TV resolutions and refresh rates and everything is scaled back to limited range....
Although there's not much of a line between them any more.

This post says there's now a "global levels" option under Desktop Colour Settings in the control panel. That might affect the way the Video output range setting works, but I don't really know.

My old Nvidia card is dual VGA/DVI and by default they both output 16-235 for video (Windows XP). I've always thought it silly they don't expand video to full range by default, but at least it's pretty simple: Windows full range, video levels expanded to full range or not, some basic colour adjustments, and that's about it.
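
For what it's worth, the maths involved is simple enough. A rough sketch of the two 8-bit conversions (my own illustration in Python, not anything taken from the Nvidia drivers):

Code:
# Limited ("TV") range puts black at 16 and white at 235; full ("PC")
# range uses the whole 0-255 scale.

def expand_to_full(y):
    """Expand a limited-range (16-235) value to full range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def compress_to_limited(y):
    """Compress a full-range (0-255) value to limited range (16-235)."""
    return round(y * 219 / 255) + 16

print(expand_to_full(16), expand_to_full(235))           # 0 255
print(compress_to_limited(0), compress_to_limited(255))  # 16 235

Apply the wrong one (or neither when one was needed) and you get the familiar washed-out or crushed picture.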

21st January 2015, 17:46   #4
Ghitulescu
Quote:
Originally Posted by hello_hello
"DVI for computers" and "HDMI for video" doesn't really work in that context as lots of video cards have HDMI out rather than DVI, while TVs have dedicated HDMI/PC inputs and computers are used to play video.
It does work.
The fact that some TVs have DVI inputs and some monitors have HDMI inputs doesn't change it. I carried out a simple test: on Amazon there are currently 170 HDMI monitors for PCs against 1,427 DVI ones (I neglected the models that offer both; there must be fewer than 170 of those anyway).
Similarly, the fact that a particular DVD player is able to play DivX doesn't change the fact that DivX was created for computers and not for TVs.
21st January 2015, 19:34   #5
SeeMoreDigital
Quote:
Originally Posted by Remicade
I have a GTX 750 Ti. When I use an HDMI-to-HDMI connection the graphics card outputs 16-235, and when I use DVI-to-HDMI it outputs 0-255. Why?
The software for some graphics cards permits you to send 0-255 colour directly via the HDMI output:

[screenshots attached in the original post]

Cheers
21st January 2015, 19:35   #6
Asmodian
The new Nvidia drivers offer range options for each display. It is amazing; after only a decade or two it is possible to actually set the range correctly for each display simply by using the driver control panel!

You do not want to use the setting SeeMoreDigital posted; that only changes the video range. You can change the range for the entire display on the display properties page. Sorry, I'm not on an Nvidia computer at the moment.

21st January 2015, 20:51   #7
Remicade
Quote:
Originally Posted by hello_hello
Remicade,
Is there a setting to change the video output levels in the Nvidia control panel under the Video section? Is it available for both HDMI and DVI?
Well, right now I use an HDMI-to-HDMI connection. I have RGB and Dynamic Range Limited 16-235 in the Nvidia Control Panel, and I cannot switch to 0-255; I hit Apply and the settings revert to 16-235. The video card is connected to a Samsung 48H5030 LCD TV. I use madVR, and I adjusted brightness and contrast with AVS HD 709. If I switch to DVI-to-HDMI, will there be any benefit? I only watch movies on the TV; I don't play games or use it as a monitor.
21st January 2015, 21:25   #8
foxyshadis
DVI-HDMI takes away all of the YUV options (and audio) and leaves you with only RGB. The system trying to force you to use YUV (which also subsamples color to 4:2:0... check out black text on a red background) is Windows and the video driver being too clever for their own good, trying to force the display to be a TV instead of a monitor. There are tricks you can use to fix it, like hacking the monitor .inf file in the Windows drivers folder to supply your own EDID, but that's a huge pain and only works sometimes. Using DVI makes all the pain go away.
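
For the curious, the EDID is just a block of bytes Windows caches in the registry; the override trick works by shadowing that data. A quick Python sketch of where it lives (assuming the standard registry layout; this only reads it, it changes nothing):

Code:
import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield the names of a registry key's subkeys."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

# One key per monitor model, one per connection instance; the cached
# EDID (and any EDID_OVERRIDE) sits under Device Parameters.
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for inst in subkeys(model_key):
                try:
                    path = inst + r"\Device Parameters"
                    with winreg.OpenKey(model_key, path) as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        print(model, inst, len(edid), "bytes of EDID")
                except OSError:
                    pass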

Of course, if you're using it as a TV, and not as a monitor that you need very crisp text and the most accurate color on, then it probably won't matter to you either way. The fact that it is a TV might mean that overscan is applied on DVI, so you'll have to look into disabling or mitigating that, and it might also mean that the TV can only process YUV internally and will convert from RGB to YUV and back even on DVI. TVs are a pain.
21st January 2015, 22:06   #9
SeeMoreDigital
Quote:
Originally Posted by Remicade
If I switch to DVI-to-HDMI, will there be any benefit? I only watch movies on the TV; I don't play games or use it as a monitor.
In short... no! Given that all commercial video is currently 4:2:0, just use HDMI to HDMI.
21st January 2015, 23:33   #10
hello_hello
Quote:
Originally Posted by Ghitulescu
The fact that some TVs have DVI inputs and some monitors have HDMI inputs doesn't change it. I carried out a simple test: on Amazon there are currently 170 HDMI monitors for PCs against 1,427 DVI ones (I neglected the models that offer both; there must be fewer than 170 of those anyway).
PCs are often connected to TVs.
Video card manufacturers have blurred the line between DVI and HDMI for quite a while; DVI ports can often output HDMI signalling. The video card in this PC must be at least six years old, and it's capable of YCbCr 4:4:4 output, while DVI is officially RGB only.
TVs may be predominantly HDMI, but HDMI supports DVI signalling, making DVI inputs on TVs a bit redundant.

Nvidia have a system where the output is considered either HDMI or DVI, I think depending on the resolution. This video card has no HDMI out, only DVI, yet the Nvidia control panel shows it as being connected to my TV via HDMI.

DVI is going the way of the dodo. Chances are HDMI and/or DisplayPort will succeed it. Just ask Intel, four years ago.

Quote:
Originally Posted by Ghitulescu
Similarly, the fact that a particular DVD player is able to play DivX doesn't change the fact that DivX was created for computers and not for TVs.
It doesn't stop the majority of DVD/Blu-ray players on the planet from playing DivX video though, does it?
21st January 2015, 23:41   #11
hello_hello
Quote:
Originally Posted by foxyshadis
DVI-HDMI takes away all of the YUV options (and audio) and leaves you with only RGB.
Not for my 8600GT (re YUV options).
See my previous post.

You had to dare me.....

http://en.wikipedia.org/wiki/Radeon_...Other_features
Each DVI output includes dual-link HDCP encoder with on-chip decipher key. HDMI was introduced, supporting display resolutions up to 1,920×1,080, with integrated HD audio controller with 5.1-channel LPCM and AC3 encoding support. Audio is transmitted via DVI port, with specially designed DVI-to-HDMI dongle for HDMI output that carries both audio and video.
21st January 2015, 23:55   #12
foxyshadis
Quote:
Originally Posted by hello_hello
Not for my 8600GT (re YUV options).
See my previous post.

You had to dare me.....

http://en.wikipedia.org/wiki/Radeon_...Other_features
Each DVI output includes dual-link HDCP encoder with on-chip decipher key. HDMI was introduced, supporting display resolutions up to 1,920×1,080, with integrated HD audio controller with 5.1-channel LPCM and AC3 encoding support. Audio is transmitted via DVI port, with specially designed DVI-to-HDMI dongle for HDMI output that carries both audio and video.
Huh, I haven't seen that before. I haven't seen 4:4:4 YUV output either, only 4:2:0, and that's on two systems less than two years old. (One Intel, one nVidia.) I guess they just do whatever they feel like, instead of trying to stay consistent.
22nd January 2015, 00:03   #13
nevcairiel
Quote:
Originally Posted by foxyshadis
I haven't seen 4:4:4 YUV output either, only 4:2:0, and that's on two systems less than two years old.
That's unlikely, considering 4:2:0 YUV wasn't supported over HDMI at all until the recent HDMI 2.0 spec; before that it was just 4:4:4 or 4:2:2.
And you specifically need to force YUV output on those cards; they all default to RGB.
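
For a sense of scale, the raw size of a single 8-bit 1080p frame under each scheme (quick back-of-envelope arithmetic in Python, purely for illustration):

Code:
# Y (luma) is one sample per pixel; the scheme only changes how much
# Cb/Cr (chroma) is carried alongside it.
w, h = 1920, 1080
luma = w * h

formats = {
    "4:4:4": luma + 2 * luma,        # chroma at full resolution
    "4:2:2": luma + 2 * luma // 2,   # chroma halved horizontally
    "4:2:0": luma + 2 * luma // 4,   # chroma halved both ways
}
for name, size in formats.items():
    print(f"{name}: {size / 1e6:.2f} MB per frame")
# 4:4:4: 6.22 / 4:2:2: 4.15 / 4:2:0: 3.11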

Also, many NVIDIA cards can just use a passive generic DVI->HDMI adapter with no change in functionality otherwise; they automatically detect whether a TV or a PC display is connected.
AMD needs a proprietary adapter, but otherwise should offer similar capabilities.
22nd January 2015, 00:15   #14
hello_hello
Quote:
Originally Posted by Asmodian
The new Nvidia drivers offer range options for each display. It is amazing; after only a decade or two it is possible to actually set the range correctly for each display simply by using the driver control panel!

You do not want to use the setting SeeMoreDigital posted; that only changes the video range. You can change the range for the entire display on the display properties page. Sorry, I'm not on an Nvidia computer at the moment.
You might want to use the setting SeeMoreDigital posted. I do. It depends on the display and what it'll accept in respect of input levels.
I can tell my TV to expect PC levels, which of course means I've got to expand the video levels for it just like for a PC monitor, but what happens if the TV/monitor is set to expect TV levels?
I'm asking mainly in respect of image quality with everything reduced to limited range. Does it make a noticeable difference?

Quote:
Originally Posted by Remicade
Well, right now I use an HDMI-to-HDMI connection. I have RGB and Dynamic Range Limited 16-235 in the Nvidia Control Panel, and I cannot switch to 0-255; I hit Apply and the settings revert to 16-235. The video card is connected to a Samsung 48H5030 LCD TV. I use madVR, and I adjusted brightness and contrast with AVS HD 709. If I switch to DVI-to-HDMI, will there be any benefit? I only watch movies on the TV; I don't play games or use it as a monitor.
When you connect via DVI, are you still using the same input on the TV? Can you change the HDMI input level? HDMI black level, I think Samsung call it. "Normal" means PC levels. Go figure. It's a TV and the PC levels input setting is "Normal". For my TV, "Low" sets the TV to expect TV levels.

It may even be possible there's some sort of communication going on between the TV and PC when using HDMI where the TV says "hey, I want TV levels". I don't know if it can happen, but there are still a lot of variables.
When you compare DVI and HDMI in respect of the way video looks, does it look different? I know you said HDMI is limited levels and DVI is full range, but I'm just wondering what the TV's doing.

Apparently Nvidia drivers can send "content type information" over HDMI that says "be a desktop", and then when you run a video full screen, it says "be a TV", or something like that, or you can set it manually. Apparently it's under Display/Adjust Colour settings. If it can get the TV to switch modes, I could only imagine what else it might do.

As a side note:
Here's an example of what I meant in an earlier post when I said Nvidia seem to treat HDMI and DVI as interchangeable, not dependent on whether a DVI or HDMI out is being used.

http://nvidia.helpmax.net/en/display...splay-content/
This instruction applies to HDMI monitors (not treated as DVI) and GPUs that support AVI infoframes on Windows Vista and later.
It strongly implies you can connect via HDMI and the drivers will "treat a monitor as DVI" under some circumstances. And logically DVI can be treated as HDMI, as seems to be the case for my card and TV. If that's what happens, I don't know how it works though.

22nd January 2015, 00:23   #15
hello_hello
Quote:
Originally Posted by foxyshadis
Huh, I haven't seen that before. I haven't seen 4:4:4 YUV output either, only 4:2:0, and that's on two systems less than two years old. (One Intel, one nVidia.) I guess they just do whatever they feel like, instead of trying to stay consistent.
If you'd care to approve the pic attached.......
[attached image]
22nd January 2015, 03:06   #16
Asmodian
Quote:
Originally Posted by hello_hello
You might want to use the setting SeeMoreDigital posted. I do. It depends on the display and what it'll accept in respect of input levels.
I can tell my TV to expect PC levels, which of course means I've got to expand the video levels for it just like for a PC monitor, but what happens if the TV/monitor is set to expect TV levels?
I'm asking mainly in respect of image quality with everything reduced to limited range. Does it make a noticeable difference?
It isn't that you don't want limited range; it's that Nvidia recently added an option to set the range for all content, not only video. If you use the setting SeeMoreDigital posted, video will be correct on a limited-range display but your desktop will be clipped. Using the new limited option, everything looks correct.
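
To make the clipping concrete, a toy sketch (my own illustration, not anything from the driver) of what a display expecting limited range does to full-range desktop values:

Code:
def shown_by_limited_display(v):
    """How a display expecting 16-235 renders an incoming 8-bit value."""
    v = max(16, min(235, v))            # everything outside is clipped
    return round((v - 16) * 255 / 219)  # then remapped onto the panel

for v in (0, 8, 16, 128, 235, 245, 255):
    print(f"sent {v:3d} -> displayed as {shown_by_limited_display(v):3d}")
# 0, 8 and 16 all display as 0 (shadow crush); 235, 245 and 255 all
# display as 255 (blown highlights). Video already compressed to
# 16-235 survives intact, which is why only the desktop suffers.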

22nd January 2015, 06:15   #17
hello_hello
Quote:
Originally Posted by Asmodian
It isn't that you don't want limited range; it's that Nvidia recently added an option to set the range for all content, not only video. If you use the setting SeeMoreDigital posted, video will be correct on a limited-range display but your desktop will be clipped. Using the new limited option, everything looks correct.
Given everything from bitmaps to images in webpages would be scaled, I was just curious whether it'd be noticeable or whether it all looks the same in the end. I've not seen a computer running in "global limited range" mode, so I don't know.

Cheers.
22nd January 2015, 08:00   #18
Asmodian
The option in "Adjust video image settings" doesn't change bitmaps, web pages, the OS, thumbnails, etc. The new one does. To me it is instantly noticeable, though I notice shadow crush pretty quickly; it is a very obvious change when toggling between them. The option is screen-specific, so your normal monitor stays full range. Nvidia finally gave us an option to override the default global output levels for HDMI, DVI, etc. With DisplayPort becoming mainstream, I assume they couldn't decide what a good default was, so they added the option.

My desktop image looks so much better at the correct range.

As I never do much browsing on a TV, the biggest benefit for me is thumbnails and the OS in general. The faint shades of gray are visible again.
22nd January 2015, 21:33   #19
hello_hello
Quote:
Originally Posted by Asmodian
The option in "Adjust video image settings" doesn't change bitmaps, web pages, the OS, thumbnails, etc. The new one does. To me it is instantly noticeable, though I notice shadow crush pretty quickly; it is a very obvious change when toggling between them.
I'm still not sure we're on exactly the same wavelength. The question I've been trying to ask would be this:

Setup 1: TV set to expect PC levels, Nvidia global output range PC levels. Windows displays correctly.
Setup 2: TV set to expect TV levels, Nvidia global output range TV levels. Windows displays correctly.

Now comes the question...... do they look the same?
I'm wondering if the scaling has a negative effect on the way Windows and programs look. I'm pretty sure I've read posts in the past where someone claimed that expanding video levels to PC levels for a PC monitor increases the likelihood of banding, which seems plausible if there are rounding errors, but I'm not sure I've seen a difference between PC levels in and out and TV levels in and out in that respect.
It's almost impossible to make valid comparisons without two identical TVs and two identical video cards etc so you could compare them side by side.
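
As a rough sanity check on the banding theory (my own arithmetic, nothing measured on a real display):

Code:
# Expanding 16-235 to 0-255 in 8 bits maps 220 input codes onto a
# 256-code scale, so some output codes can never occur; a smooth
# gradient picks up small jumps wherever the gaps fall.
used = {round((y - 16) * 255 / 219) for y in range(16, 236)}
print(len(used), "of 256 output codes used;", 256 - len(used), "never occur")
# -> 220 of 256 output codes used; 36 never occur

Whether 36 missing codes out of 256 is actually visible is another matter, which I suppose is really what I'm asking.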

Quote:
Originally Posted by Asmodian
My desktop image looks so much better at the correct range.
It would.
I don't have the "global" levels option (still using XP) but I can set the output to YCbCr 4:4:4, and when I do the TV defaults to expecting TV levels and won't let me change it. Video looks fine without expanding the levels but Windows itself looks a bit off/crushed/dark. I assume it's still full range.
23rd January 2015, 02:46   #20
Asmodian
Quote:
Originally Posted by hello_hello
I'm still not sure we're on exactly the same wavelength. The question I've been trying to ask would be this:

Setup 1: TV set to expect PC levels, Nvidia global output range PC levels. Windows displays correctly.
Setup 2: TV set to expect TV levels, Nvidia global output range TV levels. Windows displays correctly.

Now comes the question...... do they look the same?
Ah, I see. Yes, they look the same.