Re: [csswg-drafts] [cssom] Serialization of large numbers should use scientific notation (#6471)

I'm not sure I fully understand the intent of the resolution for the new definition. The current definition for serializing `<number>` is:

> A base-ten number using digits 0-9 (U+0030 to U+0039) in the shortest form possible, using "." to separate decimals (if any), rounding the value if necessary to not produce more than 6 decimals, preceded by "-" (U+002D) if it is negative.

Reading the IRC log and observing the values produced by Chrome and Firefox, I think the intent is to limit the output to 6 significant digits, not to 6 decimals. Is that correct? The two readings diverge, as shown below.
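
Purely as an illustration of the difference (plain JS `Number` methods, not the CSSOM algorithm):

```js
// Reading the rule as "no more than 6 decimals" vs "no more than 6 significant
// digits" for a large value (illustrative JS only, not the CSSOM algorithm).
const value = 123456789.123;

console.log(value.toFixed(6));     // "123456789.123000" -> no meaningful rounding
console.log(value.toPrecision(6)); // "1.23457e+8"       -> 6 significant digits
```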

Does it refer to `Number.toPrecision()` as the *JS serialization* to match? `Number.toPrecision()` does not always produce *the shortest form possible*, e.g. `(100).toPrecision(2)` is `1.0e+2` instead of `1e2`. I also read that *leading 0s are dropped*, but leading 0s are not dropped by `Number.toPrecision()`.
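
For reference, here is what `Number.toPrecision()` actually returns in these cases (again plain JS, just to back the two observations above):

```js
// Number.toPrecision() does not always yield the shortest form,
// and it does not drop leading 0s.
console.log((100).toPrecision(2));  // "1.0e+2" (not the shorter "1e2")
console.log((0.01).toPrecision(2)); // "0.010"  (leading 0 kept, trailing 0 added)
```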

What is the purpose of serializing in the shortest form possible? Is it to optimize the internal storage of values?

Lastly, this resolution only applies to `<number>` and not to `<integer>`, does it not? If so, this issue does not justify modifying Syntax to produce a `<number-token>` with a `type` of `integer` in some cases where scientific notation is used, does it?

-- 
GitHub Notification of comment by cdoublev
Please view or discuss this issue at https://github.com/w3c/csswg-drafts/issues/6471#issuecomment-1145989869 using your GitHub account


-- 
Sent via github-notify-ml as configured in https://github.com/w3c/github-notify-ml-config
