11th April 2009, 16:31   #118
leeperry
Quote:
Originally Posted by yesgrey3
I always thought that, being analog, a CRT would be capable of more than 8-bit depth, but recently I read that the CRT phosphors' limit is 8 bits... and even if you feed it more you will not be able to see any difference...
It seems we can only achieve >8-bit color with digital displays...
ArgyllCMS has a tool to check the LUT accuracy; the video card LUT is of course 8-bit over DVI, but it's 10-bit over VGA on nvidia cards (and 9-bit on ATi).
But because of the D/A > A/D conversions, and the fact that CRT monitors have legacy onboard ICs, it probably doesn't really matter.
As for sending pure 10-bit, for the same reasons I'd be rather dubious... or maybe with 5-BNC connectors on professional broadcast equipment.
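for anyone curious how much the extra LUT bits actually buy you, here's a quick sketch (purely my own illustration, nothing to do with ArgyllCMS internals) comparing the worst-case rounding error of a gamma ramp stored in an 8-bit vs a 10-bit video card LUT:
Code:
# Minimal sketch (not ArgyllCMS code): compare the rounding error of a
# gamma-2.2 correction ramp stored in an 8-bit vs a 10-bit video card LUT.
def max_lut_error(bits, gamma=2.2, steps=256):
    levels = (1 << bits) - 1
    worst = 0.0
    for i in range(steps):
        x = i / (steps - 1)
        ideal = x ** (1.0 / gamma)               # exact correction value
        stored = round(ideal * levels) / levels  # value after LUT quantization
        worst = max(worst, abs(stored - ideal))
    return worst

print("8-bit LUT max error :", max_lut_error(8))   # ~1/510 of full scale
print("10-bit LUT max error:", max_lut_error(10))  # ~4x smaller
roughly 4x finer steps from the 10-bit LUT, assuming the analog path downstream doesn't eat the difference anyway.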
Quote:
Originally Posted by yesgrey3
But HR does not make such intensive use of the GPU as madVR, and the memory use in madVR is also very high due to the 3dlut (96MB) and all the intermediate buffers at 16 bits per component... That's why we need to redefine our working minimums.
With madVR doing everything in the GPU, we would have more CPU power for software AVC decoding...
oh sure, doing the RGB32 conversion in 32-bit float, plus applying your LUTs and doing spline scaling, is a HUGE plus (the YUY2 coefficients in HR are completely off)
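
to show what wrong matrix coefficients do, here's a tiny float YCbCr>RGB sketch (standard BT.601/BT.709 matrices, purely illustrative, not HR's actual code) decoding the same pixel with both sets of coefficients:
Code:
# Minimal sketch (assumed standard matrices, not HR's code): convert one
# limited-range YCbCr pixel to RGB in float with BT.601 vs BT.709 coefficients,
# to show how much the choice of matrix shifts the result.
def ycbcr_to_rgb(y, cb, cr, kr, kb):
    kg = 1.0 - kr - kb
    yf = (y - 16) / 219.0        # scale limited-range luma to 0..1
    cbf = (cb - 128) / 224.0     # scale chroma to -0.5..0.5
    crf = (cr - 128) / 224.0
    r = yf + 2.0 * (1.0 - kr) * crf
    b = yf + 2.0 * (1.0 - kb) * cbf
    g = (yf - kr * r - kb * b) / kg
    return tuple(round(c * 255.0, 1) for c in (r, g, b))

pixel = (81, 90, 240)  # a saturated red-ish test value
print("BT.601:", ycbcr_to_rgb(*pixel, kr=0.299,  kb=0.114))
print("BT.709:", ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722))
the same limited-range pixel comes out with visibly different red/green levels depending on the matrix, which is why getting the coefficients right (and doing the math in float) matters so much.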

my GF9600GSO is actually a rebadged 8800GS with 96 SPs (the regular 9600GT only has 64), so with an overclocked Q6600 it should hopefully handle the load
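
to put rough numbers on the memory figures quoted above (my own arithmetic, and the buffer count is just a guess for scale):
Code:
# Rough memory estimate for the figures quoted above (my own arithmetic, not
# madVR's internals): a 256^3 3dlut with 3 components at 16 bits each, plus a
# few 16-bit-per-component intermediate frames at 1080p.
lut_bytes = 256**3 * 3 * 2            # 100,663,296 bytes = 96 MiB
frame_bytes = 1920 * 1080 * 3 * 2     # one 16-bit RGB 1080p buffer, ~11.9 MiB

print("3dlut          :", lut_bytes / 2**20, "MiB")
print("1080p frame    :", frame_bytes / 2**20, "MiB")
print("lut + 4 buffers:", (lut_bytes + 4 * frame_bytes) / 2**20, "MiB")
so the 96MB figure matches a 256^3 16-bit 3dlut exactly, and every 16-bit 1080p intermediate buffer adds another ~12MB on top of it.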

Last edited by leeperry; 11th April 2009 at 16:45.