- From: Simon Pieters <simonp@opera.com>
- Date: Sat, 07 Feb 2015 11:05:49 +0100
- To: "Tab Atkins Jr." <jackalmage@gmail.com>
- Cc: "www-style list" <www-style@w3.org>, "Boris Zbarsky" <bzbarsky@mit.edu>
On Fri, 30 Jan 2015 20:15:55 +0100, Tab Atkins Jr. <jackalmage@gmail.com> wrote:

>> Possibly the spec could have different strategies depending on how it's
>> stored, and not tie it to rgba() vs. opacity.
>
> I don't think the difference between representing 256 and 1001 values
> is significant. I'm fine with the algorithm popping out slightly
> different results. Or else we can specify that <alphavalue> is
> converted to a byte for the purpose of serializing, if we really want
> to be identical.
>
> Letting opacity go to 6 digits just because it's capable of accurately
> representing that many doesn't seem very useful. That much precision
> is never useful in the first place; you can only barely tell the
> difference between .01 increments, let alone .001 increments.
>
> But this also isn't very important, and so matching IE/Gecko is fine.

OK. I agree that it's not particularly useful, but it might be relevant for other things in CSS to use more than 3 digits.

I've made the spec match Boris' description (hopefully) when <alphavalue> is internally represented as an 8-bit unsigned integer, and otherwise go through <number> serialization, limiting that to 6 decimals.

The spec previously didn't allow scientific notation, and I've kept it disallowed, since it exposes float->double rounding errors and would not parse in legacy implementations.

https://hg.csswg.org/drafts/rev/8be9d35eb39d

-- 
Simon Pieters
Opera Software
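For illustration, here is a rough Python sketch of the "match IE/Gecko" idea described above for an 8-bit alpha: prefer a two-decimal number when it round-trips to the same byte, and fall back to three decimals otherwise. The function name and exact rounding details are my own assumptions, not the spec text.

```python
def serialize_alpha(alpha_byte):
    """Serialize an 8-bit alpha component (0-255) as a CSS <alphavalue>.

    Sketch only: try the nearest two-decimal number first; if multiplying
    it back by 255 does not recover the same byte, use three decimals.
    """
    # Nearest value with two decimal places.
    two = round(alpha_byte / 255 * 100) / 100
    if round(two * 255) == alpha_byte:
        value = two
    else:
        # Fall back to three decimal places.
        value = round(alpha_byte / 255 * 1000) / 1000
    # Emit as a plain decimal <number> (no scientific notation).
    return f"{value:g}"
```

For example, alpha byte 128 serializes as "0.5" because 0.5 * 255 rounds back to 128, while a byte like 1 needs three decimals.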
Received on Saturday, 7 February 2015 10:06:23 UTC