- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Tue, 12 Jan 2010 13:53:15 -0500
- To: "Linss, Peter" <peter.linss@hp.com>
- CC: www-style W3C Group <www-style@w3.org>
On 1/12/10 1:15 PM, Linss, Peter wrote:
> Second, having "true" units does no one any good when the ratio from
> device pixels to real world physical units isn't reliably known. If
> it's known, then there's no reason why regular units can't be "true"
> units.

The point (no pun intended) is that there is such a reason: doing it is not compatible with most of the web content out there.

> I do see a use case for a device pixel unit, but it still scares me
> as it's ripe for abuse and will likely be misunderstood. The best use
> case I can come up with for it is a hairline border, where I want it
> to be the thinnest possible line that the output device can render.
> If there's really a compelling need for that, I think I'd rather have
> a "hairline" unit than device pixel.

Indeed, since it may well be that the thinnest possible line the output device can render is actually invisible to human eyes. For example, a 1-pixel line on a 2400-dpi printer is nominally 0.01mm == 10um wide; http://en.wikipedia.org/wiki/10_micrometres and http://en.wikipedia.org/wiki/1_micrometre have some things in this range (e.g. very fine human hairs, cotton/silk/nylon fibers, 1/10 the thickness of a typical sheet of paper, two red blood cells next to each other, that sort of thing).

I suspect that most papers wouldn't actually show that line at that width, but if they did it would be below even the theoretical maximum visual acuity of human eyes (0.35mm at 1m viewing distance according to <http://en.wikipedia.org/wiki/Eye#Acuity>) if held at normal reading distance.

I might be ok with a "hairline" width for this use case, with the UA trying to do something sane with it, I guess.

-Boris
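A quick sketch of the arithmetic above, in LaTeX; the 40cm figure used for "normal reading distance" is an assumed typical value, not something stated in the message:

% One device pixel on a 2400-dpi printer (25.4 mm per inch):
\[ \frac{25.4\,\mathrm{mm/in}}{2400\,\mathrm{px/in}} \approx 0.0106\,\mathrm{mm} \approx 10\,\mu\mathrm{m} \]

% Acuity limit of 0.35 mm at 1 m, scaled linearly to an assumed
% 40 cm reading distance (same visual angle, shorter distance):
\[ 0.35\,\mathrm{mm} \times \frac{0.4\,\mathrm{m}}{1\,\mathrm{m}} = 0.14\,\mathrm{mm} = 140\,\mu\mathrm{m} \]

% So the ~10 um line is roughly 14x finer than the ~140 um threshold,
% i.e. well below what the eye can resolve at reading distance.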
Received on Tuesday, 12 January 2010 18:53:50 UTC