Re: Gray, color keywords

From: Chris Lilley <Chris.Lilley@sophia.inria.fr>
Date: Mon, 8 Dec 1997 18:08:46 +0100 (MET)
Message-Id: <9712081808.ZM21073@grommit.inria.fr>
To: Susan Lesch <lesch@macvirus.com>, www-style@w3.org, "gordon" <gordon@quartz.gly.fsu.edu>
On Dec 8,  2:57am, Susan Lesch wrote:

> > I know it must make a difference somewhere, but even with 32bit color
> > I can't 'see' the difference in these two bands of gray.  Do I need
> > to rustle up an old CGA card to see the difference?
> Maybe these will help. The html version uses BGCOLOR in a TABLE and
> has more information but I hope you can see the difference in both?

For what it's worth, I was unable to see a difference in color in the HTML
version, using a Silicon Graphics Indigo2 with a 24-bit Solid Impact display
and a high-quality SGI Trinitron monitor which is correctly set up as per
Charles Poynton's recommendations. This set-up has a gamma correction of
1.7, giving a system gamma of 1.47 (assuming the CRT gamma is 2.5).
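As an aside, the arithmetic here is just: system gamma = CRT gamma / gamma
correction. A minimal sketch (my own illustration; the 2.5 CRT figure is the
assumption stated above):

```python
# System gamma is the CRT's native gamma divided by the applied
# gamma correction; the 2.5 CRT gamma is an assumption, as above.
crt_gamma = 2.5
gamma_correction = 1.7
system_gamma = crt_gamma / gamma_correction
print(round(system_gamma, 2))  # → 1.47
```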

The difference in the GIF was clearly visible, the reason being that, as
you say on the HTML page, the screenshot was from "Mac OS 8 set to
16-bit color". So the actual values in your GIF are #777777 and #888888,
which are clearly distinct.

I checked this by displaying the HTML page on a PowerMac with a
Matrox Millennium card and a recently calibrated Barco Personal
Calibrator display, set to 24 bits. This set-up - in contrast to most
Macs - has no gamma correction, giving a measured system gamma of
2.35 (red), 2.23 (green) and 2.27 (blue).

The difference between the two grays was *not* visible.

After switching the display to thousands of colors using the Matrox
control panel, the difference was clearly visible.

Remember that using a 15-bit ("thousands") display depth is equivalent to
using a 32x32x32 color cube; while this is clearly much better than the
6x6x6 (216-color) cube used on 8-bit displays by the Mac and PC versions
of Netscape, or the 5x5x5 (125-color) cube used by Unix versions of
Netscape, it is still an approximation.
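To make the approximation concrete, here is a sketch (my own, with a
hypothetical helper name) of how an 8-bit channel collapses to 5 bits at
thousands-of-colors depth:

```python
# Hypothetical sketch of the 15-bit ("thousands") approximation:
# each 8-bit channel value keeps only its top 5 bits, one of 32 levels.
def quantize_to_5_bits(channel):
    """Reduce an 8-bit channel value (0..255) to its 5-bit bucket (0..31)."""
    return channel >> 3

# #777777 and #888888 land in different buckets and stay distinct...
print(quantize_to_5_bits(0x77), quantize_to_5_bits(0x88))  # → 14 17
# ...but nearby 24-bit grays such as #808080 and #848484 collapse
# into the same bucket, which is why 15 bits is only an approximation.
print(quantize_to_5_bits(0x80), quantize_to_5_bits(0x84))  # → 16 16
```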

Incidentally the equation on your page looks suspect.

> 2.01875 * R + 3.9663 * G + 0.7323 * B - 854.0313

Normalising the coefficients so that they sum to one (i.e. dividing by
2.01875 + 3.9663 + 0.7323 = 6.71735) gives

0.3005 * R + 0.5905 * G + 0.1090 * B - 127.14

which looks (modulo round off errors) to be the NTSC luminance equation
minus 127 (to change values from a 0..255 range to a -127..128 range).
It is thus only valid for the NTSC broadcast monitor, which is grossly
atypical of modern computer monitors. [The only reason this equation
is still doing the rounds is that you can't argue with or update 100
million TV sets with the corresponding decode coefficients hard-wired
into them].
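The normalisation above is easy to check numerically; a quick sketch,
assuming the coefficients exactly as quoted from the page:

```python
# Divide each coefficient by the sum so the weights total one; the
# result matches the NTSC luminance weights 0.299/0.587/0.114 to
# within round-off, and the offset works out to roughly 127.
coeffs = (2.01875, 3.9663, 0.7323)
offset = 854.0313
total = sum(coeffs)
weights = [round(c / total, 4) for c in coeffs]
print(round(total, 5), weights, round(offset / total, 2))
```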

Chris Lilley, W3C                          [ http://www.w3.org/ ]
Graphics and Fonts Guy            The World Wide Web Consortium
http://www.w3.org/people/chris/              INRIA,  Projet W3C
chris@w3.org                       2004 Rt des Lucioles / BP 93
+33 (0)4 93 65 79 87       06902 Sophia Antipolis Cedex, France
Received on Monday, 8 December 1997 12:10:55 UTC