[whatwg] <input type=number> for currency input


As the spec currently stands, use of <input type=number> is unsuitable 
for currency and other input that requires a minimum number of decimal 
places to be displayed. When displaying decimal currency values, the 
typical convention is that a precision of two decimal places is used, 
regardless of numeric value. That is, one dollar is displayed as 
"1.00", not "1". However, the latter is the result when using <input 
type=number> in implementations that follow the spec, such as Chrome 
and Firefox.
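
For illustration, the following sketch shows the ToString behaviour 
that produces this result (assuming a parsed value of one dollar):

```javascript
// ToString drops trailing zeros: one dollar renders as "1", not "1.00".
const value = 1.00;
console.log(String(value));    // "1"

// The conventional currency display requires two decimal places:
console.log(value.toFixed(2)); // "1.00"
```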

Section " Number state (type=number)" currently states: "If 
the user agent provides a user interface for selecting a number, then 
the value must be set to the best representation of the number 
representing the user's selection as a floating-point number" - 
effectively by calling JavaScript's ToString on the number. This gives 
rise to the undesirable representation above.

Since both the spec and existing implementations use the step attribute 
to effectively specify the maximum number of decimal places for the 
representation of the number, it seems reasonable for the step 
attribute to also define the minimum. This can perhaps be achieved by 
changing the definition of the "best representation of the number n as 
a floating-point number" to use JavaScript's Number.prototype.toFixed 
function, with the number of fraction digits obtained from the step 
attribute, rather than using ToString.
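
A minimal sketch of the idea, assuming the number of fraction digits 
is read off the step attribute's literal text (the function names here 
are illustrative only, not part of the spec or any implementation):

```javascript
// Count the digits after the decimal point in the step attribute's value.
function fractionDigitsFromStep(step) {
  const dot = step.indexOf(".");
  return dot === -1 ? 0 : step.length - dot - 1;
}

// Format a number with as many decimal places as the step implies.
function formatForStep(value, step) {
  return value.toFixed(fractionDigitsFromStep(step));
}

console.log(formatForStep(1, "0.01")); // "1.00"
```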

This will work for integral currencies by specifying a step of "1", as 
well as for decimal currencies that use a single decimal place, by 
specifying a step of "0.1".
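
Deriving the digit count from the step in the same way covers those 
cases too (the helper is inlined so the snippet stands alone; names 
are illustrative):

```javascript
// Digits after the "." in the step string; 0 if there is no ".".
const digits = (step) =>
  step.includes(".") ? step.split(".")[1].length : 0;

console.log((1).toFixed(digits("1")));   // "1"   - integral currency
console.log((1).toFixed(digits("0.1"))); // "1.0" - one decimal place
```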



⊨ Michael Gratton, Percept Wrangler.
⚙ <http://mjog.vee.net/>

Received on Wednesday, 23 July 2014 05:11:19 UTC