Quote:
Originally Posted by kolak
I deal with different types of data and have done many tests, which actually surprised me.
a = take a 10-bit HD file and scale it to SD at 8 bit.
b = take a 10-bit HD file, dither it, then scale it to SD (at 8 bit).
c = take the same file, scale it to SD at 10 bit, then dither.
There is actually quite a visible difference between all of them, but I was surprised that even b and c show a visible difference. I thought there would be no real difference, but there is.
Just a note - we're talking about a proper 10-bit source, e.g. shot on a RED, Alexa etc. camera.
|
Yes, there can be quite a big visual difference when going from a higher to a lower bit depth one way or another. It also depends on the actual content, though. I'll do "c" if it is possible. But in this particular case I was talking about an 8-bit source to begin with; sorry for not making that clear, this was kind of a continuation of my initial post. Also, what you say applies more to genuine 10-bit sources than to upconverted ones.
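To make the a/b/c comparison concrete, here is a minimal NumPy sketch of the three workflows on a synthetic 10-bit gradient. Everything here is an assumption for illustration: the gradient source, the simple 2x2 box downscale standing in for a real resizer, and the plain random dither standing in for whatever dither your tool actually uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 10-bit "HD" source: a smooth horizontal gradient,
# the kind of content where banding from bit-depth reduction shows up.
hd = np.tile(np.linspace(0, 1023, 1920), (1080, 1))

def truncate_to_8bit(img10):
    # Plain rounding 10 bit -> 8 bit, no dither (prone to banding).
    return np.clip(np.round(img10 / 4.0), 0, 255)

def dither_to_8bit(img10):
    # 10 bit -> 8 bit with simple uniform random dither
    # (a stand-in for the ordered/error-diffusion dither a real tool uses).
    noise = rng.uniform(-0.5, 0.5, img10.shape)
    return np.clip(np.round(img10 / 4.0 + noise), 0, 255)

def scale_to_sd(img):
    # Naive 2x2 box downscale HD -> "SD" (stand-in for a real resizer).
    return img.reshape(540, 2, 960, 2).mean(axis=(1, 3))

# a: drop to 8 bit without dither, then scale.
a = scale_to_sd(truncate_to_8bit(hd))
# b: dither to 8 bit first, then scale (the resize averages the dither noise).
b = scale_to_sd(dither_to_8bit(hd))
# c: scale at full 10-bit precision, dither to 8 bit once at the end.
c = dither_to_8bit(scale_to_sd(hd))

# b and c really do differ: in b the dither is applied before the resize
# and partially averaged away, while c keeps full precision through the
# resize and quantizes only once.
print(np.abs(b - c).mean())
```

This matches the observation above: b and c are not the same image, and c is the usual recommendation because quantization happens only once, at the very end of the chain.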