[Tig] Apple Color strange behavior

Webster, Anne anne.webster at dolby.com
Thu Sep 2 21:23:26 BST 2010

Thanks to those who have pointed out the Mac gamma differences.

However, there is still a mathematical mystery from my point of view.
Bob, your reply below may resolve it, but I'm not sure I understand.  
This is what I expect to happen:  
1 - Encode in Matlab:  x^(1/1), i.e. no encoding; linear code values
saved directly to the file
2 - Decode in Apple Color:  [x^(1/1)]^(1.8) = x^1.8

However, I'm seeing a total curve of x^(1.8/2.2) in the RGB scopes.
Where did the 2.2 actually come from?
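Just to pin down the arithmetic (a sketch of one possible reading, not
a claim about Color's actual internals): the observed x^(1.8/2.2) is
exactly what you get if the expected x^1.8 decode is followed by a
gamma-2.2 re-encode, so the 2.2 may be an output encoding stage.

```python
import numpy as np

# Linear code values straight from the file (step 1: x^(1/1)).
x = np.linspace(0.0, 1.0, 257)

# Step 2 as I expected it: decode with the Mac 1.8 gamma only.
expected = x ** 1.8

# What the RGB scopes actually show.
observed = x ** (1.8 / 2.2)

# The observed curve equals the expected decode followed by a
# gamma-2.2 re-encode: (x^1.8)^(1/2.2) == x^(1.8/2.2).
assert np.allclose(expected ** (1.0 / 2.2), observed)
```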

I'm not sure what my tiff tags say; can anyone suggest a free program
that can read them for me?  And how does the tiff gamma tag get used
in Apple Color?
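On free tools: libtiff's tiffinfo, exiftool, or GraphicsMagick's
"gm identify -verbose" will all dump TIFF tags.  If you have Python
handy, the Pillow library can do it too -- a minimal sketch (the
filename is a placeholder; point it at your own file, and note Pillow
is an assumption about what you have installed):

```python
from PIL import Image
from PIL.TiffTags import TAGS

# Build a tiny grayscale TIFF so the example is self-contained
# (8-bit just to keep the demo small; the tag dump works the same
# on a 16-bit file).  Replace "ramp.tif" with your own file.
Image.new("L", (4, 1)).save("ramp.tif")

with Image.open("ramp.tif") as tif:
    for tag_id, value in sorted(tif.tag_v2.items()):
        # TAGS maps numeric tag ids to names, e.g. 256 -> ImageWidth.
        print(f"{tag_id:5d} {TAGS.get(tag_id, '<unknown>')}: {value}")
```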
Thanks again for all your help.

On Thu, 12 Aug 2010, Rob Lingelbach wrote:
> I created a 16 bit tiff file (using Matlab) containing a gray ramp, 
> i.e. code values from 0 to 65535 in each RGB channel in a linear 
> horizontal ramp (so no gamma encoding).  When I input this into 
> Apple Color, I therefore expect it to show me a linear relationship in 
> the RGB scopes.
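(For anyone reproducing the ramp above without Matlab, here is a
Python/numpy stand-in for that step -- writing it out would need a
TIFF writer such as tifffile or Pillow, which is an assumption about
available libraries:)

```python
import numpy as np

# 16-bit linear horizontal ramp: code values run 0..65535 left to
# right in each of R, G, B, with no gamma encoding applied.
width = 65536
ramp = np.linspace(0, 65535, width).round().astype(np.uint16)
rgb = np.stack([ramp, ramp, ramp], axis=-1)  # identical R, G, B

assert rgb[0].tolist() == [0, 0, 0]
assert rgb[-1].tolist() == [65535, 65535, 65535]
```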

Default behavior for an integer TIFF file is to assume that the 
values are gamma corrected so that they represent intensities (i.e. 
not "linear").  It is possible to add a gamma tag to a TIFF file to 
indicate the gamma that was actually used.
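To put rough numbers on that (an assumed decode model, not Color's
documented behavior): the same mid-ramp code value lands in very
different places depending on the gamma the reader assumes.

```python
# A mid-gray 16-bit code value from the linear ramp.
code = 32768 / 65535.0            # ~0.5

# If the reader honors a gamma tag of 1.0, the value is taken as-is.
linear_if_tagged = code ** 1.0    # ~0.5

# If the reader instead assumes a default gamma-2.2 encoding, it
# decodes the same code value to a much darker intensity.
linear_if_assumed = code ** 2.2   # ~0.218

print(linear_if_tagged, linear_if_assumed)
```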

Apple has always been weird about gamma, since it has historically 
used a different display gamma (1.8) than the rest of the world (2.2).

Bob Friesenhahn
bfriesen at simple.dallas.tx.us,
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/

