- From: Clive Bruton <clive@typonaut.demon.co.uk>
- Date: Tue, 17 Mar 1998 02:39:54 -0000
- To: <www-style@w3.org>
George Olsen wrote at 16/03/98 11:31 pm:

>Unfortunately, typographic sizes *are* still dependent on the OS (i.e.,
>type on Windows will be about a third larger than on Mac -- don't know what
>Unix does) unless you've specified the type size in pixels.

I keep seeing this kind of stuff written here, and elsewhere, and I keep seeing people contradicting it, but still it gets written!?

If you specify type in pixels you are guessing at what dpi someone's screen *might* be running at; this is a bad move. The only sensible way to spec type for the web is in multiples of an "em", that way it scales.

Win and Mac have notionally different pixels-per-em values, and notionally different dpi values. If you take those notional defaults and apply them to type you get the following results:

12pt type on Win = 16ppem = 16px high = 12pt measured with a depth scale held up to the screen
12pt type on Mac = 12ppem = 12px high = 12pt measured with a depth scale held up to the screen

So, at these notional defaults the type appears to be exactly the same size, because in fact it is exactly the same (real-world) size.

The same experiment, this time with px as our starting point:

12px type on Win = 12ppem = 12px high = 9pt measured with a depth scale held up to the screen
12px type on Mac = 12ppem = 12px high = 12pt measured with a depth scale held up to the screen

So, measure your type in pt and you get the same notional real-world size on Mac and Win, with Win at a higher ppem value. Measure your type in px and the Mac type looks bigger by about a third in a real-world application, though the ppem values are identical.

On the other hand, knock the screen res up to 120dpi on either system and 12px still equals 12ppem, but it's a little over 7pt high, not something that would read too well on screen.

-- Clive
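The arithmetic behind those figures can be sketched in a few lines (the helper names here are illustrative, not from any real API). It rests on two facts from the post: a point is 1/72 inch, and the notional defaults are 96 dpi on Windows and 72 dpi on Mac, so pixels-per-em = pt × dpi / 72:

```python
# Sketch of the pt/px/dpi arithmetic from the post above.
# Assumption: a point is 1/72 inch; ppem = pt * dpi / 72.

def pt_to_px(pt, dpi):
    """Pixel (ppem) size of type specced in points at a given screen dpi."""
    return pt * dpi / 72

def px_to_pt(px, dpi):
    """Real-world point size of type specced in pixels at a given screen dpi."""
    return px * 72 / dpi

# Notional defaults: Windows 96 dpi, Mac 72 dpi.
print(pt_to_px(12, 96))   # 16.0 -> 12pt is 16px high on Win
print(pt_to_px(12, 72))   # 12.0 -> 12pt is 12px high on Mac
print(px_to_pt(12, 96))   # 9.0  -> 12px measures only 9pt on Win
print(px_to_pt(12, 120))  # 7.2  -> at 120 dpi, 12px is a little over 7pt
```

Same point size in pt gives the same real-world size everywhere (at different ppem); same size in px gives the same ppem but a real-world size that shrinks as dpi rises.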
Received on Monday, 16 March 1998 21:43:31 UTC