Re: Precision of numbers using JSON Header Field Values

Hi Kazuho,

On Fri, Jul 15, 2016 at 01:13:31PM +0900, Kazuho Oku wrote:
(...)
> 0.0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000100e400
> is evaluated as 100. OTOH, using ruby 2.0.0, the same expression is
> interpreted as zero.

Nice!
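
For anyone who wants to reproduce it without running the gist,
here's a rough python sketch (not Kazuho's test; the zero count is
simply chosen so that the exact value is 100):

    # 397 zeros put the leading "1" of "100" at 10^-398, so the
    # e400 exponent brings the exact value back to 100
    s = "0." + "0" * 397 + "100e400"
    print(float(s))  # a correctly-rounding IEEE-754 decoder: 100.0

A decoder that truncates the long mantissa before applying the
exponent, or that clamps e400 as an overflow, returns 0 instead,
which apparently is what ruby 2.0.0 did.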

> (please refer to
> https://gist.github.com/kazuho/7e72372111fa0655692d9e44be70c7ea for
> the full output)
> 
> IMO such disagreement is a mine of vulnerabilities.  For example, in
> the case of HTTP/1.1, it could lead to HTTP response splitting.
> 
> So personally, I think we should forbid the use of exponential
> notation at least in some cases, but I wonder whether adding such a
> restriction is intended or aligns with the motivation for using an
> existing notation for HTTP headers.

Do we have a method to suggest that decoders reject this? I mean,
we can say it's forbidden, but if recipients systematically pass
the whole string through a non-configurable decoder, the
vulnerability remains deployed. That's especially true for all the
"MUST NOT" requirements, which senders unfortunately tend to take
for granted, with no check on the recipient side :-/
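
Where the decoder does offer a hook it can be done; python's json
module, for instance, lets the application intercept anything the
parser would treat as a float (fractions and exponents both take
that path). A rough sketch, with a made-up payload:

    import json

    def reject_float(literal):
        # called by json.loads for any number containing '.', 'e'
        # or 'E'; raising here makes the whole decode fail
        raise ValueError("non-integer number rejected: %r" % literal)

    json.loads('{"n": 1000000500}', parse_float=reject_float)  # fine
    try:
        json.loads('{"n": 1e09}', parse_float=reject_float)
    except ValueError as e:
        print("rejected:", e)

Most decoders expose nothing of the sort, though, hence the
concern above.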

Also, are we sure senders will never emit exponents without being
explicitly instructed to do so? I'm asking because I wouldn't like
to see, for example, "Content-length: 1000000500" being encoded as
"Content-length: 1.0000005e9" and then decoded as
"Content-length: 1000000512" because it's parsed as a float by an
implementation which can only use single-precision numbers.
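
To make that rounding concrete: in IEEE-754 single precision,
values in [2^29, 2^30) are spaced 64 apart, so 1000000500 has no
exact representation and snaps to the nearest multiple of 64. A
rough python sketch emulating a 32-bit parse:

    import struct

    def to_float32(x):
        # round-trip through single precision, as a parser built
        # on 32-bit floats effectively does
        return struct.unpack('f', struct.pack('f', x))[0]

    print(int(to_float32(float("1.0000005e9"))))  # -> 1000000512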

Willy

Received on Friday, 15 July 2016 04:37:27 UTC