- From: Adam Rice <notifications@github.com>
- Date: Tue, 23 Jan 2018 07:01:04 +0000 (UTC)
- To: whatwg/encoding <encoding@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/encoding/pull/127/c359694445@github.com>
@hsivonen Consider the following transform stream:

```javascript
const rechunker = new TransformStream({
  transform(chunk, controller) {
    // Re-enqueue the incoming string in pieces of at most 1024 code units.
    for (let start = 0; start < chunk.length; start += 1024) {
      const end = Math.min(start + 1024, chunk.length);
      controller.enqueue(chunk.substring(start, end));
    }
  }
});
```

This takes a stream of strings and splits up any chunk that is longer than 1024 code units. Here are the questions I ask myself:

1. _Is this a useful and sensible thing to want to do?_ I believe so. If the stream you're processing may have arbitrarily large chunks, you're forwarding it to some expensive operation, and you want to avoid jank, I think the correct approach is to include something like this in your pipe.
2. _Is it reasonable for a developer to expect this to work and not corrupt the input?_ I think so.
   1. Corruption wouldn't even happen unless you streamed it to `TextEncoder`, and that could be several steps further down the pipe (see the sketch after this message).
   2. It just applies the JavaScript built-in `substring()` function, and nothing on https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/substring says anything about data corruption.
   3. It will be a long time before any problems are noticed.
3. _Is this implementation correct?_ Maybe? Maybe not? I don't know.

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/encoding/pull/127#issuecomment-359694445
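A minimal sketch of the corruption point 2.1 alludes to, assuming each chunk were encoded independently (modelled here with a plain `TextEncoder.encode()` call per chunk; the behaviour of the streaming encoder itself is what the thread is debating): if the rechunker splits a string between the two halves of a surrogate pair, each resulting chunk carries a lone surrogate, and independent encoding replaces each lone surrogate with U+FFFD. The string and chunk boundary below are illustrative, not taken from the original message.

```javascript
// Sketch: split a string right between the two halves of a surrogate pair,
// then encode each piece on its own, as a per-chunk encoder might.
const s = 'a'.repeat(1023) + '\u{1F600}';      // U+1F600 occupies two UTF-16 code units
const first = s.substring(0, 1024);            // ends with the lone high surrogate 0xD83D
const second = s.substring(1024);              // is just the lone low surrogate 0xDE00

const encoder = new TextEncoder();             // encodes each string independently
console.log(encoder.encode(first).slice(-3));  // Uint8Array [239, 191, 189] -> U+FFFD
console.log(encoder.encode(second));           // Uint8Array [239, 191, 189] -> U+FFFD
```

The original emoji never reaches the output; both halves of the pair come out as replacement characters, which is the silent corruption the message is concerned about.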
Received on Tuesday, 23 January 2018 07:01:31 UTC