- From: Joel N. Weber II <devnull@gnu.ai.mit.edu>
- Date: Sat, 23 Aug 1997 20:42:51 -0400
- To: fahrner@pobox.com
- CC: clive@typonaut.demon.co.uk, www-font@w3.org, www-style@w3.org
> A point is a good starting reference in the physical world, and among devices that respect absolute physical units. Printers worth their salt know what a point is, but the display systems attached to 95+% of desktop machines don't know a point from a pixel from a bit of sneeze-dew. Mac users are generally out of luck unless they set their displays *physically* to 72 dpi, as are most Windows folk unless their displays are 96 or 120 dpi (hold a ruler up and count the dots...).
>
> I understand that some display systems permit sophisticated users to map points to pixels however they please. This will make points a good general unit as soon as 90% of users have such a mapping as part of their personal profile, something that kicks in when you enter a username and (network) password, without rebooting, perhaps even with a live point-to-pixel slider.... Let's say Windows 2002 is released in 2005, and reaches 90% penetration by 2009?

IIRC, when you set up XFree86 (the version of X11 commonly used on free OSes such as GNU/Linux, FreeBSD, etc.), you tell it the physical size of your monitor, and an X11 call can be used to retrieve this information; a sketch of such a query appears below. (It's also been rumored that Suns running X11 often return incorrect information about the physical size of the screen.)

Having a points-to-pixels mapping *per user* would be stupid; I've used several different telnet programs, X servers, etc., and the correct mapping depends on the display in front of you, not on who is logged in. It might be useful, though, for a user to ask for everything to be multiplied by some factor, to accommodate near-blindness, etc.
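For the record, here is a minimal C sketch of that query, using Xlib's standard DisplayWidth and DisplayWidthMM macros; the error handling and the point-to-pixel arithmetic are illustrative additions, not part of any existing program.

    /* Sketch: ask the X server for the screen's physical size and derive
     * an approximate dots-per-inch figure.  Compile with: cc dpi.c -lX11 */
    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);      /* connect to $DISPLAY */
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int screen = DefaultScreen(dpy);
        int px = DisplayWidth(dpy, screen);     /* screen width, pixels */
        int mm = DisplayWidthMM(dpy, screen);   /* screen width, millimetres */
        XCloseDisplay(dpy);

        if (mm <= 0) {                          /* a server configured with a
                                                   bogus physical size */
            fprintf(stderr, "server reports no usable physical size\n");
            return 1;
        }

        double dpi = px * 25.4 / mm;            /* 1 inch = 25.4 mm */
        printf("%d px over %d mm is about %.1f dpi\n", px, mm, dpi);

        /* A point is 1/72 inch, so one point covers dpi/72 pixels here. */
        printf("1pt is about %.2f px on this display\n", dpi / 72.0);
        return 0;
    }

Of course this is only as accurate as the size the server was configured with, which is exactly the problem with the rumored Sun setups; and a per-user magnification factor, if we had one, would just be a final multiplier on top of the dpi/72 figure.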