19th November 2011, 12:11   #718
JanWillem32
The renderer itself only dithers when it's told to do so (setting dithering level 0 definitely disables it). As for the driver's ability to dither a 10-bit front buffer (in exclusive mode) down to 8-bit, I've never seen it happen. My main working display is analog (the analog connections on my HD4890 are fed by a 10-bit DAC) and my secondary device happily accepts 10-bit input through HDMI. Maybe someone can try it out again with an old 8-bit DVI-D monitor on a video card that is generally capable of 10-bit output, but of course can't achieve it with such a monitor. Will the monitor give a black screen, will the driver refuse to enable 10-bit output mode (which can be seen in the stats screen), will it fake the 10-bit output by rounding to 8-bit, or will it fake the 10-bit output by dithering to 8-bit?
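For the "driver refuses to enable 10-bit output" case, a quick probe of the D3D9 device already tells you whether the driver even claims a 10-bit fullscreen mode on a given adapter. This is only a minimal sketch under my own assumptions (default adapter, a bare CheckDeviceType call); the renderer's actual device-creation path does considerably more work.

[code]
// Minimal sketch: ask D3D9 whether the default adapter claims support for a
// 10-bit (A2R10G10B10) back buffer in exclusive (fullscreen) mode.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* pD3D = Direct3DCreate9(D3D_SDK_VERSION);
    if (!pD3D) { std::printf("Direct3DCreate9 failed\n"); return 1; }

    // Windowed == FALSE: only the exclusive-mode case matters here, because
    // that is the only place a 10-bit front buffer can be requested at all.
    HRESULT hr = pD3D->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                       D3DFMT_A2R10G10B10,  // display/adapter format
                                       D3DFMT_A2R10G10B10,  // back buffer format
                                       FALSE);              // fullscreen
    std::printf("10-bit fullscreen mode: %s\n",
                SUCCEEDED(hr) ? "claimed by the driver" : "refused");

    pD3D->Release();
    return 0;
}
[/code]

Even if this succeeds, it only tells you what the driver claims; whether the monitor actually receives 10 bits per channel is exactly the question above.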
The last option is actually unlikely. Dithering is fairly heavy, so it's a task for the shader core (the biggest block of transistors in a GPU). Once the data has been written to a back buffer (which later swaps places to become the front buffer), the shader core generally doesn't read from or write to it again. Reading from the front buffer is done by the micro-controllers for the hardware output ports. These do have some logic on board to convert the front-buffer format in memory into signals for TMDS (DVI/HDMI), micro-packet (DisplayPort) or analog (through a DAC) output, but I can't imagine any of those micro-controllers having a dithering unit on board.
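To give an idea of what that per-pixel dithering work looks like (and why it belongs on the shader core rather than on an output micro-controller), here is a minimal CPU-side sketch of ordered dithering from a higher-precision value down to 8-bit. The 4x4 Bayer matrix and the scaling are my own assumptions for illustration, not the renderer's actual dithering shader.

[code]
// Minimal sketch: ordered (Bayer) dithering of a normalized [0,1] sample down
// to 8-bit. This mimics the per-pixel work a dithering pixel shader does just
// before the value is written to an 8-bit back buffer.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// 4x4 Bayer threshold matrix, values 0..15.
static const int kBayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5}
};

// Quantize one normalized sample to 8-bit, using the pixel position to pick a
// threshold so the rounding error is spread out spatially instead of banding.
std::uint8_t DitherTo8Bit(float sample, int x, int y)
{
    // Threshold roughly in [-0.5, 0.5), offset so the average added noise is zero.
    const float threshold = (kBayer4[y & 3][x & 3] + 0.5f) / 16.0f - 0.5f;
    const int quantized = static_cast<int>(std::lround(sample * 255.0f + threshold));
    return static_cast<std::uint8_t>(std::clamp(quantized, 0, 255));
}

int main()
{
    // A flat gray that doesn't map exactly onto an 8-bit code: neighbouring
    // pixels come out as 94 or 95, approximating the in-between level.
    const float gray = 94.4f / 255.0f;
    for (int y = 0; y < 4; ++y) {
        for (int x = 0; x < 4; ++x)
            std::printf("%3u ", DitherTo8Bit(gray, x, y));
        std::printf("\n");
    }
    return 0;
}
[/code]

Whatever a given renderer actually uses for its threshold pattern, the point stands: dithering is per-pixel arithmetic done while the frame is being written to the back buffer, not something the output hardware reads back and reprocesses afterwards.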
The other three options (black screen, refused 10-bit mode, or rounding) can simply be observed.
Note that some digital displays will actually report the type and bit depth of the incoming signal.
__________________
development folder, containing MPC-HC experimental tester builds, pixel shaders and more: http://www.mediafire.com/?xwsoo403c53hv