Re: [css3-selectors] :nth-child issues

Tab Atkins Jr. wrote:
> On Tue, Oct 20, 2009 at 4:20 PM, Anton Prowse <> wrote:
>> Tab Atkins Jr. wrote:
>>> On Tue, Oct 20, 2009 at 4:04 PM, Anton Prowse <>
>>> wrote:
>>>> is, pedantically speaking, incorrect since "-0" is not an integer; and
>>>> more importantly, from the following:
>>> Why is -0 not an integer?
>> Because the set of integers is precisely ...,-2,-1,0,1,2,... and its
>> elements do not have aliases.
> Anne already answered you with a quote,

"-0" is not an integer, other than in the sense that it is occasionally
used as a representation of the integer more frequently represented as
"0".  In fact, it has a more subtle mathematical meaning.(*)  Hence,
before I was reminded of what the acceptable notations were, I believed
it necessary to call out in the prose that "-0" is allowed even though
one does not naturally think of it as being encompassed by the term
"integer".
However, as became clear, I hadn't given thought to the fact that the
"integers" in that section were intended to be represented by the CSS
value "[sign]<integer>", where <integer> is a specific representation of
a natural number.  (I was mistakenly assuming a more mathematically
faithful representation.)
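To make the point concrete, here is a minimal sketch (in Python, with a
deliberately simplified, hypothetical regular expression standing in for
the real tokenizer) of an an+b parser in which the coefficient is
"[sign]<integer>", i.e. an optional sign followed by a digit string.
Under such a grammar "-0" is a perfectly good coefficient and simply
denotes 0:

```python
import re

# Hypothetical, simplified grammar for an "an+b" expression: the
# coefficient is an optional sign followed by digits ("[sign]<integer>"),
# so "-0" is accepted as a representation of zero.  This is a sketch,
# not the actual CSS tokenizer.
ANB = re.compile(r'^([+-]?\d+)n\s*(?:([+-])\s*(\d+))?$')

def parse_anb(s):
    """Parse a simplified an+b expression; return (a, b) or None."""
    m = ANB.match(s.strip())
    if m is None:
        return None
    a = int(m.group(1))                      # int("-0") == 0
    b = 0
    if m.group(2):
        b = int(m.group(3))
        if m.group(2) == '-':
            b = -b
    return (a, b)

print(parse_anb('-0n+6'))   # -> (0, 6): "-0" accepted, equal to 0
print(parse_anb('2n+1'))    # -> (2, 1)
```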

Perhaps the intended representation could be indicated, e.g. by a link
to the relevant section of CSS3-values, from

   # The a and b values must be integers (positive, negative, or zero)

> but more directly, what do you
> mean "its elements do not have aliases".  -0 is definitely an integer
> equivalent to 0.  0 is the only number with multiple representations
> in the integers.

These statements don't mean anything at a mathematical level.  "-0" is
no more (or less) meaningful as an alternative representation of 0 than
"+4" is of 4, or the Roman numeral VII is of 7; and the one notation is
only equivalent to the other because one defines it to be.

The distinction between notation and the thing it represents is subtle
and not normally important (and well-chosen notation blurs the
difference very naturally), but this distinction is paramount when
dealing with parsers and formal grammars, for example.  My "-0a" concern
arose because, without the permitted representations being specified (or
if the wrong representation is assumed), it doesn't follow at all that
"-0" is an acceptable integer, and a great many parsers in a great many
software applications agree!
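For contrast, a sketch of the opposite convention: a hypothetical strict
tokenizer in which an integer literal must be a *canonical* decimal
representation (either "0", or an optional minus sign followed by a
digit string with no leading zero), so that each integer has exactly one
spelling.  Under such a grammar "-0" is simply not an integer token:

```python
import re

# Hypothetical strict grammar: one canonical decimal spelling per
# integer.  "0" stands alone; a minus sign may only precede a nonzero
# number.  "-0" (and "+4", for that matter) is rejected outright.
CANONICAL_INT = re.compile(r'^(0|-[1-9]\d*|[1-9]\d*)$')

def is_canonical_integer(tok):
    """Return True iff tok is the canonical decimal form of an integer."""
    return CANONICAL_INT.match(tok) is not None

print(is_canonical_integer('0'))    # -> True
print(is_canonical_integer('-7'))   # -> True
print(is_canonical_integer('-0'))   # -> False: zero has no signed alias
```

Whether "-0" parses is thus entirely a property of the chosen grammar,
not of the integers themselves, which is the point at issue.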

(*) As for where "-0" notation comes from: in formal mathematics one may
write "-0" to mean "the additive inverse of 0" and use it in expressions
such as 0 + -0 = 0, but in that case the "-0" is merely mathematical
notation and does not in itself say anything about which of the integers
"-0" actually is.  (It is a theorem -- which requires proof -- that the
additive inverse of 0 turns out to be 0.)

Anton Prowse

Received on Wednesday, 21 October 2009 01:27:28 UTC