- From: Tab Atkins Jr. <jackalmage@gmail.com>
- Date: Tue, 18 May 2010 16:19:43 -0700
- To: Alex Meiburg <timeroot.alex@gmail.com>
- Cc: www-style@w3.org
On Tue, May 18, 2010 at 4:09 PM, Alex Meiburg <timeroot.alex@gmail.com> wrote:
> As for the mapping (from 0-255 to 0-1 float), this might well be based in
> part on how UAs currently store the alpha value. If a decimal is actually
> rounded to a hex value and used like that... great. I think 0x80 should map
> to 0.50196078431372548 then. But if alpha values are actually stored as
> floats, we really would be taking some (minimal) freedom away from the
> developer, so I feel it's a bit more open to discussion how it should be
> handled. Could someone explain how UAs handle these?

WebKit is weird. It stores the alpha as the highest byte of a 32-bit
integer; in other words, it stores it as an integer from 0-255.

Then, when actually displaying the color, if the alpha value is 255, the
color is fully opaque - the opacity is 1.0. Otherwise, the alpha value is
divided by *256* to produce the opacity. Yes, this is crazy, as it means the
opacity jumps from 254/256 straight to 256/256 (sketched below).

I suspect we need to fix this, but I also suspect doing so will break a
decent number of tests that I'll then need to fix.

~TJ
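A minimal C++ sketch of the two mappings described above - this is not
WebKit's actual code, and the function names are illustrative only. The
first function models the special-cased divide-by-256 behaviour; the second
is the conventional divide-by-255 mapping under which 0x80 gives the
0.50196078... value quoted earlier.

#include <cstdint>
#include <cstdio>

// Mapping as described in the mail: 255 is special-cased to fully opaque,
// every other byte is scaled by 1/256, so no alpha byte ever yields a
// value between 254/256 and 1.0.
double describedWebkitOpacity(uint8_t alpha) {
    if (alpha == 255)
        return 1.0;
    return alpha / 256.0;
}

// Conventional uniform mapping of 0-255 onto 0.0-1.0;
// 0x80 gives 128/255 = 0.50196078...
double linearOpacity(uint8_t alpha) {
    return alpha / 255.0;
}

int main() {
    const uint8_t samples[] = {0x00, 0x80, 0xFE, 0xFF};
    for (uint8_t a : samples)
        std::printf("alpha %3d -> described: %.17g  linear: %.17g\n",
                    a, describedWebkitOpacity(a), linearOpacity(a));
    return 0;
}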
Received on Tuesday, 18 May 2010 23:26:07 UTC