Re: Using enums to avoid default true in "settings dictionaries" (#466, #467, #471)

On Tue, Feb 23, 2016 at 10:48 PM, Jan-Ivar Bruaroey <jib@mozilla.com> wrote:

> On 2/23/16 6:30 PM, Peter Thatcher wrote:
>
> My opinion: B is better than A.  I have no sympathy for someone calling
> foo(bar) and expecting it to be the same as foo(bar, false).  This is JS,
> which does all kinds of bizarre things with undefined/false values, and
> that's just asking for pain.  For example, what if foo() does addition?
>
>
> And I have no sympathy for someone doing boolean addition.
>
> function foo(bar, baz) {
>   return bar + baz;
> }
>
> foo("bar") // "barundefined"
> foo("bar", false)  // "barfalse"
> foo(1, undefined)  // NaN
> foo(1, false)  // 1
>
> Not the same!
>
>
> Lets return from WAT!-land and use booleans correctly, in conditions:
>
>
This is JavaScript.  There is no return from WAT-land.



> function foo(bar, baz) {
>   return baz ? bar : null;
> }
>
> foo("bar") // null
> foo("bar", false)  // null
> foo(1, undefined)  // null
> foo(1, false)  // null
>

Do you really think all of the JavaScript in the world uses booleans
"correctly"?
The assertion that foo("bar") == foo("bar", false) in all cases is false.
A developer can't rely on it, and shouldn't rely on it.

I don't think we should make our API worse by
trying to maintain this false expectation.

If we can come up with a strong argument for making our API worse, I'm
willing to hear it.  But I'm not convinced by this one.
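For concreteness, the enum alternative (option B) could look something like the sketch below. The names here are made up for illustration, not from any spec: the point is that with a string enum, "not specified" stays distinguishable from any explicit value, so undefined/false coercion never enters the picture.

```javascript
// Hypothetical settings enum: three meaningful states instead of a
// default-true boolean where undefined and false can blur together.
const EchoCancellation = Object.freeze({
  ON: "on",
  OFF: "off",
});

function applySettings(settings = {}) {
  switch (settings.echoCancellation) {
    case EchoCancellation.ON:
      return "echo cancellation enabled";
    case EchoCancellation.OFF:
      return "echo cancellation disabled";
    default:
      // undefined (or any unrecognized value) lands here explicitly,
      // rather than silently coercing to a boolean.
      return "using platform default";
  }
}

applySettings({});                                         // "using platform default"
applySettings({ echoCancellation: EchoCancellation.OFF }); // "echo cancellation disabled"
```

Nothing about this prevents a caller from passing garbage, but the API no longer promises that an omitted member behaves like any particular explicit value.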





>
>
> .: Jan-Ivar :.
>
>

Received on Wednesday, 24 February 2016 16:11:37 UTC