11th September 2014, 14:19   #28
Zachs
Suptitle, MediaPlayer.NET
 
Join Date: Nov 2001
Posts: 1,721
Quote:
Originally Posted by huhn
All cards since the Nvidia 2xx and AMD 5xxx series should be able to output 10 bit on a DirectX "fullscreen" surface, and I don't think there is a way to do it without one. Of course, professional cards can do it with OpenGL.

Edit: the program crashes when a file with an ASS subtitle is loaded but no VSFilter is installed. I only tried the 64-bit version.

And about 10 bit: is there a way to see if the player tries to send a 10-bit (RGB, I think) picture to the GPU?

I highly doubt my TV is 10 bit, but I can send it 12-bit RGB and it accepts it.

Shouldn't 10-bit dithered RGB look like 8-bit rounding on an 8-bit screen? OK, there is a chance my screen can dither too... not easy to test without a 10-bit screen.
Can you get the stack trace for the crash?

Re: 10-bit support, my check for A2R10G10B10 always fails on my GTX560 (in a C++ test app):
PHP Code:
Direct3DCreate9Ex(D3D_SDK_VERSION, &d3d); // create the Direct3D interface

if (FAILED(d3d->CheckDeviceType(0, D3DDEVTYPE_HAL, D3DFMT_A2R10G10B10, D3DFMT_A2R10G10B10, FALSE)))
    return false;
If anyone could give me a hand on how to actually get 10-bit support on my GTX560, I'd appreciate it! (EDIT: Apparently, even MPC-HC says my GTX560 has no 10-bit output support.)
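
For reference, this is roughly how a 10-bit fullscreen exclusive device would be requested with D3D9Ex once CheckDeviceType succeeds. It's only a minimal sketch (not the player's actual renderer code); the window handle, resolution and refresh rate are assumed to be set up elsewhere:
Code:
#include <d3d9.h>

// Sketch: ask D3D9Ex for a fullscreen exclusive device with a 10-bit
// (A2R10G10B10) back buffer. hwnd/width/height/refreshRate are assumed
// to come from the caller; error handling is kept to the bare minimum.
IDirect3DDevice9Ex* CreateTenBitDevice(IDirect3D9Ex* d3d, HWND hwnd,
                                       UINT width, UINT height, UINT refreshRate)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth            = width;
    pp.BackBufferHeight           = height;
    pp.BackBufferFormat           = D3DFMT_A2R10G10B10; // 10 bits per channel
    pp.BackBufferCount            = 1;
    pp.SwapEffect                 = D3DSWAPEFFECT_FLIP;
    pp.hDeviceWindow              = hwnd;
    pp.Windowed                   = FALSE;              // FSE only
    pp.FullScreen_RefreshRateInHz = refreshRate;
    pp.PresentationInterval       = D3DPRESENT_INTERVAL_ONE;

    // In fullscreen, CreateDeviceEx also wants an explicit display mode
    // whose format matches the back buffer format.
    D3DDISPLAYMODEEX mode = {};
    mode.Size             = sizeof(mode);
    mode.Width            = width;
    mode.Height           = height;
    mode.RefreshRate      = refreshRate;
    mode.Format           = D3DFMT_A2R10G10B10;
    mode.ScanLineOrdering = D3DSCANLINEORDERING_PROGRESSIVE;

    IDirect3DDevice9Ex* device = nullptr;
    HRESULT hr = d3d->CreateDeviceEx(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                     D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                     &pp, &mode, &device);
    return SUCCEEDED(hr) ? device : nullptr;
}
The two things that matter are Windowed = FALSE and the matching A2R10G10B10 display mode format, which is why 10-bit output is only available in FSE mode.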

To find out whether 10-bit output is enabled, get the player to show its stats (Ctrl+J). If you see "High bit depth output enabled", then it is outputting 10 bits in FSE mode.
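
On huhn's question about seeing what actually gets sent to the GPU: apart from the stats overlay, the back buffer format can be queried at runtime. A small sketch (the function name is made up, and this is plain D3D9, not tied to MediaPlayer.NET):
Code:
#include <d3d9.h>

// Sketch: check whether the device's current back buffer really is the
// 10-bit A2R10G10B10 format. Works on any IDirect3DDevice9.
bool BackBufferIsTenBit(IDirect3DDevice9* device)
{
    IDirect3DSurface9* backBuffer = nullptr;
    if (FAILED(device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer)))
        return false;

    D3DSURFACE_DESC desc = {};
    HRESULT hr = backBuffer->GetDesc(&desc);
    backBuffer->Release();

    return SUCCEEDED(hr) && desc.Format == D3DFMT_A2R10G10B10;
}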

Last edited by Zachs; 11th September 2014 at 15:01.