Quote:
Originally Posted by cyberbeing
I'll try out using BT.1886 (-f0 -G2.4) next time I calibrate, but unless it's much different, I've always found using a 2.4 power curve (-f1 -G2.4) to be horrible on my >10,000:1 GDM-F520 CRT. There gets to be a point where everything has unrealistic contrast, and while dark tones are defined and evenly spaced, they become difficult to be seen unless you are in a pitch black room.
That sounds like your display was calibrated to an average gamma of 2.4, not 2.40 gamma at every point on the curve. (10% stimulus and below is particularly important, and difficult to get right on a CRT.)
If your display has more than 10,000:1 contrast, the BT.1886 curve is effectively a pure 2.40 power curve anyway.
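For reference, BT.1886 defines its EOTF in terms of the display's measured white and black luminance; as the black level approaches zero, the black-lift term vanishes and the curve collapses to a pure 2.4 power function. A minimal Python sketch (the 100 cd/m² white and 0.01 cd/m² black figures are illustrative values for a roughly 10,000:1 display, not measurements):

```python
def bt1886(v, lw=100.0, lb=0.01):
    """Rec. ITU-R BT.1886 EOTF: map normalized signal v (0..1) to cd/m2.

    lw/lb are the display's white and black luminance (illustrative here).
    """
    gamma = 2.4
    # Screen gain and black-lift derived from the measured white/black levels.
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# As lb -> 0 (very high contrast), b -> 0 and the curve reduces to
# a pure power function: lw * v ** 2.4.
```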
Quote:
Originally Posted by cyberbeing
I've heard people say that sRGB shouldn't be calibrated to either, yet if you don't, color management software will change the curve to sRGB anyway on sRGB images.
I can't think of any professional calibration package that calibrates to the sRGB curve by default - few even
offer the sRGB curve as an option. A 2.2 power curve is typical for PC displays.
The sRGB specification also defines white at 80 cd/m² and black at 1 cd/m², giving you a display contrast of 80:1, which looks ridiculous today.
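For comparison, the sRGB transfer function itself is piecewise: a short linear segment near black and a power segment with a 2.4 exponent above it, which together track a roughly 2.2 power curve overall. A minimal sketch:

```python
def srgb_eotf(v):
    """sRGB electro-optical transfer function (IEC 61966-2-1), v in 0..1."""
    if v <= 0.04045:
        return v / 12.92                    # linear toe near black
    return ((v + 0.055) / 1.055) ** 2.4     # power segment

# Despite the 2.4 exponent, the 0.055 offset means the overall curve
# approximates a ~2.2 power function over most of its range.
```

This is why "sRGB gamma" and "2.2 power curve" get used interchangeably even though they are not identical, particularly near black.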
Quote:
Originally Posted by Stephen R. Savage
I've read in a few places that to properly view images the "correct way", one should reinterpret the Rec709 (approximately "2.2") encoded data as 2.4 without correction. Apparently this is intended to correct for the difference between studio (1000 lux) and regular (4-32 lux) viewing conditions. In this case, would I want gamma processing enabled or not?
The BT.709 curve is approximately 1.96 gamma, not 2.2, and it is "camera gamma" (an encoding curve applied in the camera); it was never intended to be used on displays. No display gamma was specified because CRTs were in use at the time, and a correctly set up CRT has a native gamma of approximately 2.4. (At least studio monitors do.)