W3C home > Mailing lists > Public > whatwg@whatwg.org > February 2012

[whatwg] Why isn't the "pattern" attribute applied to <input type="number">?

From: brenton strine <whatwg@gmail.com>
Date: Fri, 10 Feb 2012 02:39:14 -0800
Message-ID: <CALgSTaQ4xqWXp=gFMS3qdM+0p+HOpbded=VW-ey08z7Y6tL24A@mail.gmail.com>
Regarding an input with its type attribute in the "number" state, the spec
states that the "pattern" attribute "must not be specified and do[es] not
apply
(http://dev.w3.org/html5/spec/common-input-element-attributes.html#do-not-apply)
to the element".
(
http://dev.w3.org/html5/spec/states-of-the-type-attribute.html#number-state-type-number
)

Why is it specifically blocked? Doesn't that encourage using the less
semantic "text" input type for numbers that need validation beyond a
simple max and min?

What if you want the number to be either 13 or 16 digits long, as with a
credit card

pattern="(\d{13}|\d{16})"

or you want a US ZIP or ZIP+4 code, which can be either nnnnn or nnnnn-nnnn

pattern="(\d{5}([\-]\d{4})?)"

To get the pattern to validate, I have to (non-semantically) change the
input to the "text" state? I much prefer the current behavior of Firefox
(tested in 9 and 10), which does validate the pattern.

Brenton Strine
Received on Friday, 10 February 2012 02:39:14 UTC
