
[whatwg] Why isn't the "pattern" attribute applied to <input type="number">?

From: Mounir Lamouri <mounir@lamouri.fr>
Date: Mon, 13 Feb 2012 10:54:03 -0500
Message-ID: <4F39321B.9030601@lamouri.fr>
On 02/10/2012 05:49 AM, Ms2ger wrote:
> On 02/10/2012 11:39 AM, brenton strine wrote:
>> Regarding an input with its type in the "number" state, the spec states
>> that the "pattern" attribute "must not be specified and do[es] not apply
>> <http://dev.w3.org/html5/spec/common-input-element-attributes.html#do-not-apply>
>> to the element".
>> (http://dev.w3.org/html5/spec/states-of-the-type-attribute.html#number-state-type-number)
>>
>> Why is it specifically blocked? Doesn't that encourage the use of a less
>> semantic "text" input type for numbers that need to be validated beyond
>> simple max and min?
>>
>> What if you want the number to be either 13 or 16 digits long, as with a
>> credit card number
>>
>> pattern="\d{13}(\d{3})?"
>>
>> or you want a US ZIP or ZIP+4 code, which can be either nnnnn or nnnnn-nnnn
>>
>> pattern="(\d{5}([\-]\d{4})?)"
>>
>> To get the pattern to validate, do I have to (non-semantically) change the
>> input to the text state? I much prefer the current behavior of Firefox
>> (tested in 9 and 10), which does validate the pattern.
> 
> Using input type=number for those cases is wrong. You would not use a
> credit card number or a ZIP code in calculations. (In fact, in the
> United Kingdom, post codes contain letters.)

Ms2ger is correct. What you want is something along the lines of what is
proposed in this bug:
https://www.w3.org/Bugs/Public/show_bug.cgi?id=12885
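
A minimal sketch of that direction, assuming the idea is to keep such fields
as type=text (so that pattern applies) and only hint at numeric entry via
something like an inputmode attribute -- an assumption on my part about what
the bug proposes, not something spelled out in this message -- could look
like this:

    <input type="text" name="zip" inputmode="numeric"
           pattern="\d{5}(-\d{4})?"
           title="ZIP code: nnnnn or nnnnn-nnnn">

    <input type="text" name="cc-number" inputmode="numeric"
           autocomplete="cc-number" pattern="\d{13}(\d{3})?"
           title="Card number: 13 or 16 digits">

type=number would then stay reserved for actual quantities, where min, max
and step make sense, which is the distinction Ms2ger draws above.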

--
Mounir
Received on Monday, 13 February 2012 07:54:03 UTC
