Re: p6: maximum delta-seconds of 2147483648

On 2013-11-15 08:29, Willy Tarreau wrote:
> Hi Julian,
>
> On Fri, Nov 15, 2013 at 08:20:08AM +0100, Julian Reschke wrote:
>> On 2013-11-14 22:26, Willy Tarreau wrote:
>>> Hi Julian,
>>>
>>> On Thu, Nov 14, 2013 at 10:18:23PM +0100, Julian Reschke wrote:
>>>> On 2013-11-14 13:31, Julian Reschke wrote:
>>>>> Agreed.
>>>>>
>>>>> This is the only current issue that holds up draft -25. If no further
>>>>> information comes up, I'll apply this change (unless Roy beats me to
>>>>> it), and submit -25 over the weekend.
>>>>>
>>>>> Best regards, Julian
>>>>
>>>> Proposed patch:
>>>> <http://trac.tools.ietf.org/wg/httpbis/trac/attachment/ticket/512/512.diff>
>>>
>>>   "A recipient parsing a delta-seconds value ought to use an
>>>    arithmetic type of at least 32 bits of signed integer range."
>>>                                           ^^^^^^
>>> You meant unsigned, right? Because the value 2147483648 requires at least
>>> 32 bits unsigned or 33 bits signed, but 32 bits signed never satisfies this
>>> requirement.
>>
>> The proposed text is:
>>
>>>    A recipient parsing a delta-seconds value ought to use an
>>>    arithmetic type of at least 32 bits of signed integer range.
>>
>> That is consistent with the previous requirement of "MUST use an
>> arithmetic type of at least 31 bits of range", no?
>
> Yes, but it's still confusing. I read it as "declare a variable delta_seconds
> of type signed int" and later "check whether that variable is greater than
> or equal to 2147483648", a value which cannot be represented with that type.
> So this is confusing for an implementer. Also, I know many developers who
> don't know the exact limits of the types they manipulate, especially the
> +/-1 at the end.
>
> Also, quite often the integer parsing functions will return undefined values
> for inputs they can't parse, which is worse. Here I'm pretty sure we'll
> see this:
>
>     int delta_seconds;
>
>     ...
>     delta_seconds = atoi(age);
>
>     if (delta_seconds >= 2147483648)
>         delta_seconds = INT_MAX;
>
> But this comparison will never match, and atoi() will return crap (sometimes
> -1, sometimes 0, or whatever) without the developer even being aware of it.
>
> Is there a reason for this type to be signed?

a) It's been implicitly like that before.

b) Java, for instance, doesn't have unsigned integers.
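
For illustration only, and not as proposed spec text: below is a rough C
sketch of what an overflow-safe parse into a signed type can look like.
The function name and the error-handling choices are made up for the
example.

    /* Sketch only: parse a delta-seconds field, saturating instead of
     * relying on atoi()'s undefined behaviour on overflow. */
    #include <ctype.h>
    #include <errno.h>
    #include <limits.h>
    #include <stdlib.h>

    /* Returns the parsed value, INT_MAX (2147483647) if the value is too
     * large for a 32-bit signed int, or -1 if the field is not 1*DIGIT. */
    static int parse_delta_seconds(const char *field)
    {
        char *end;
        unsigned long long v;

        if (!isdigit((unsigned char)*field))
            return -1;          /* empty string, sign or other junk */

        errno = 0;
        v = strtoull(field, &end, 10);
        if (*end != '\0')
            return -1;          /* trailing garbage */
        if (errno == ERANGE || v > INT_MAX)
            return INT_MAX;     /* saturate at 2147483647 */
        return (int)v;
    }

The point being only that the clamping has to happen while parsing,
before the value ever ends up in a plain int.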

Of course we can add even more prose or even pseudo-code, and also 
warnings about broken language libraries. But does this edge case 
*really* require this? Can we please

1) focus on whether this is an improvement over RFC 2616, addressing the 
LC comment, and/or

2) propose concrete changes?

Best regards, Julian

Received on Friday, 15 November 2013 07:41:06 UTC