Re: [whatwg/streams] ReadableStream.from(asyncIterable) (#1083)

Uh oh... It looks like optimizing for arrays might not even work. 😕

You can `push()` new elements to an array *while you're iterating over it*, because an array iterator just tracks an index and re-checks the array's length on each `next()` call:
```javascript
let array = ['a', 'b'];
const it = array[Symbol.iterator]();

it.next(); // { value: 'a', done: false }
it.next(); // { value: 'b', done: false }

array.push('c');

it.next(); // { value: 'c', done: false }
it.next(); // { value: undefined, done: true }
```

So by that logic, this should also work:
```javascript
let array = ['a', 'b'];
const rs = ReadableStream.from(array);
const reader = rs.getReader();

await reader.read(); // { value: 'a', done: false }
await reader.read(); // { value: 'b', done: false }

array.push('c');

await reader.read(); // { value: 'c', done: false }
await reader.read(); // { value: undefined, done: true }
```

With the current implementation, this does indeed work: `ReadableStream.from()` calls `it.next()` only when a new chunk is needed (since the high-water mark is 0), so it *will* still see the newly added element.
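
For context, that lazy behavior is roughly equivalent to wrapping the iterator in a pull-based underlying source. This is only a simplified sketch of the idea, not the spec steps (the helper name `fromIterableSketch` is made up):
```javascript
// Simplified sketch of the lazy behavior: the iterator is only advanced
// from pull(), and with highWaterMark = 0 the stream only calls pull()
// when a read request is outstanding.
function fromIterableSketch(iterable) {
  const it = iterable[Symbol.asyncIterator]?.() ?? iterable[Symbol.iterator]();
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await it.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    }
  }, { highWaterMark: 0 });
}
```
Because `pull()` only runs once a `read()` is pending, the `array.push('c')` that happens before the third `read()` is still observed.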

However, if we add a special case for arrays that *immediately* iterates over the array and enqueues all its elements, then the above snippet would no longer work.
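
For comparison, such a special case might look something like the following hypothetical sketch (the helper name `fromArrayEagerly` is made up), which shows why the snippet above would break:
```javascript
// Hypothetical eager optimization (NOT the current behavior): all elements
// are copied into the stream's queue up front, so later push() calls are
// never seen.
function fromArrayEagerly(array) {
  return new ReadableStream({
    start(controller) {
      for (const element of array) {
        controller.enqueue(element); // enqueues 'a' and 'b' immediately
      }
      controller.close(); // a 'c' pushed afterwards is never observed
    }
  });
}
```
With this version, the third `read()` would resolve with `{ value: undefined, done: true }` instead of `{ value: 'c', done: false }`.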

How do we feel about this? Should we allow this, and make this optimization impossible? (If so, I'll add a test for the above.) Or not?
