- From: Willy Tarreau <w@1wt.eu>
- Date: Sat, 4 Nov 2017 13:38:12 +0100
- To: Andy Green <andy@warmcat.com>
- Cc: Matthew Kerwin <matthew@kerwin.net.au>, Mark Nottingham <mnot@mnot.net>, Kazuho Oku <kazuhooku@gmail.com>, Poul-Henning Kamp <phk@phk.freebsd.dk>, HTTP Working Group <ietf-http-wg@w3.org>
On Sat, Nov 04, 2017 at 08:13:25PM +0800, Andy Green wrote:
> The question is because some people on a 64-bit capable platform decided to
> use 32-bits internally,

No, nobody decided to use 32-bit internally; they just used the integer of the default size proposed by their language. "int" is signed 32-bit on the vast majority of platforms and has even been used for decades to store IPv4 addresses. There's no reason to accuse anyone of purposely doing bad stuff; the reality is that unsafe code exists all over the planet through lack of knowledge and awareness. While we can hardly improve people's knowledge using standards, we can at least improve their awareness of the issues.

> Nothing stops those servers processing the integers in question as strings
> and seeing if they exceed some implementation limit,

Honestly, checking integer values as strings is complex and not natural to anyone. Try telling that to the guy who used tonumber(header) in my previous example, where "tonumber()" is provided by default on his system.

> If they don't do that, they don't even work properly serving large files
> today; that's their problem.

On small platforms nobody seeks to serve large files; they're dealing with authentication pages, 10 settings on a page, and POST requests to change such settings. Despite this, such software does cause interoperability issues right now by improperly parsing some valid responses (e.g. when retrieving the weather forecast from a public service to adjust the heater). And when you're unfortunate enough to be responsible for the proxy in the middle that blocks the so-called "valid" response the device normally deals with properly, it's a bit of a pain to have to argue that you're the only one applying safe processing there.

> > We aren't here to tell people what (not) to do; rather we're here to
> > describe how to get something done in a way that is useful and reliable,
> > hard to get wrong, and easy to sort out if/when it does eventually go
> > wrong. That includes predicting and addressing ways we can foresee that
> > it could go wrong, or that similar things have gone wrong in the past.
>
> If you step back a bit, a simple, clear standard is the best way to get
> something "useful, reliable, hard to get wrong, easy to sort out". Piling on
> weird stuff - what was it, four "profiles" for integer sizes... is not
> making a better standard in those regards than just saying compliant
> implementations must handle 64-bit ints. If it becomes so complex just to
> eat an int, you will create more "interesting" bugs you say you want to
> avoid.

I disagree on this one, because we all know that we all code for the target use case. Do you add provisions against bit flips due to solar eruptions in your code? Most likely not. I don't either. But if you were coding for a Mars probe you'd probably have to put "volatile" in front of all your variables and perform such checks. It's simply out of your scope, so you skip such checks, and that's fine. People developing for platforms that adjust a home's temperature based on a public forecast do the same: there's no need to deal with complex code handling 64-bit quantities just to retrieve a JSON block.

> As shown it is not an onerous requirement to say it should handle
> 64-bit even on weak platforms like ESP8266 / ESP32.

As I showed, it's the opposite: *existing* frameworks are not even able to parse anything outside -2^31 and 2^31-1, nor do they have the types needed for it, and these frameworks are currently being used in various products.
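To make it concrete, here is a rough sketch in C of what I mean (MAX_BODY and the function names are made up for illustration, not taken from any particular product): the naive version feeds the header value straight into a default-sized int, the other one parses against an explicit limit and rejects what it doesn't support.

```c
/* Illustrative sketch only: MAX_BODY and the function names are invented.
 * It shows why a header value fed into a default-sized "int" silently
 * breaks, and what an explicit, declared limit looks like instead. */
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

/* Naive parse: atoi() reports no errors, and with a 32-bit int the
 * result is meaningless for anything above 2147483647. */
static int naive_content_length(const char *v)
{
    return atoi(v);
}

/* Bounded parse: values beyond what this implementation supports are
 * rejected instead of silently wrapping around. */
#define MAX_BODY 2147483647ULL   /* arbitrary example limit */

static int safe_content_length(const char *v, unsigned long long *out)
{
    char *end;
    unsigned long long n;

    errno = 0;
    n = strtoull(v, &end, 10);
    if (errno || end == v || *end != '\0' || n > MAX_BODY)
        return -1;               /* overflow, junk, or above our limit */
    *out = n;
    return 0;
}

int main(void)
{
    const char *hdr = "3000000000";   /* ~3 GB, perfectly valid on the wire */
    unsigned long long len;

    printf("naive: %d\n", naive_content_length(hdr));
    if (safe_content_length(hdr, &len) == 0)
        printf("safe: %llu\n", len);
    else
        printf("safe: rejected (above the limit we chose to support)\n");
    return 0;
}
```

The second version is barely more code, but someone has to think of writing it, and that is exactly the kind of awareness a spec can create.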
> > And nobody (for a given definition of "nobody") cares about
> > interoperability test suites, apart from the really big one (i.e.
> > reality.) Some of the errors that come up in that one can be rather...
> > Interesting(TM).
>
> Ehhh are you sure :-) I think you find many implementors really like having
> test suites.

I think you're only working in enterprise environments where some people do care about this. While I too am fond of standards compliance, I'm disgusted by what I see every other day: traffic emitted by software that is supposed to be working fine, and the type of issues certain software faces. Just google for "strstr cookie" or "atoi content-length" to see the real-world horrors that *we* can avoid with better specs. But it's not by asking HTTP implementers to manually handle large-integer arithmetic, to comply with a spec that's out of their usage scope, that we'll see any progress in this area!

Willy
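PS: for what it's worth, the "strstr cookie" class of bug looks roughly like this (the cookie names below are invented, the pattern is what you find in the wild):

```c
/* Invented example of the "strstr cookie" bug: matching a cookie name
 * with strstr() also matches any longer name that merely contains it. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *hdr = "Cookie: adminsession=0; session=abc123";

    /* Buggy lookup: the first "session=" found is the tail of
     * "adminsession=", so the value read here belongs to the wrong cookie. */
    const char *p = strstr(hdr, "session=");
    printf("match starts at: %s\n", p);

    /* A correct parser has to split on ';', trim spaces and compare the
     * whole name token - more work, which is why the lazy version keeps
     * showing up in production. */
    return 0;
}
```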
Received on Saturday, 4 November 2017 12:39:31 UTC