- From: Anders Riutta <notifications@github.com>
- Date: Tue, 02 Aug 2016 11:37:45 -0700
- To: whatwg/streams <streams@noreply.github.com>
`pull-through` is just 1.7K minified, which is far smaller than Node streams! It looks like a great solution for async. But in terms of being the lowest common denominator, aren't transducers still lower, since they handle both async and non-async?

The incomplete-token part is tricky. Most parsers need to deal with it, e.g., [this one](https://github.com/jonnyreeves/chunked-request/blob/07d3553276c336890d2f328e204e3910aa418825/src/defaultChunkParser.js#L9) or [this one](https://github.com/RubenVerborgh/N3.js/blob/4863b95225e5e7ba3b02943254cc1c801155db92/lib/N3Lexer.js#L343). To handle this case, we'd need to use `reduce` followed by `flatten` instead of `flatMap`.
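To make that concrete, here's a rough plain-array sketch (not tied to any particular stream or transducer library): a stateless `flatMap` tokenizes each chunk in isolation, so a token that straddles a chunk boundary comes out broken, while a stateful `reduce` that carries the incomplete tail into the next chunk, followed by a `flatten`, handles it. The `parseChunks` name and newline-delimited tokens are just illustrative assumptions:

```js
// Stateless flatMap: each chunk is tokenized on its own, so a token split
// across two chunks ("ba" + "r") comes out broken:
//   ['foo\nba', 'r\nbaz'].flatMap(chunk => chunk.split('\n'))
//   // -> ['foo', 'ba', 'r', 'baz']

// Stateful reduce + flatten: carry the incomplete tail of each chunk into the
// next one, emit whole tokens per chunk, then flatten at the end.
function parseChunks(chunks) {
  const { tokens, carry } = chunks.reduce(
    ({ tokens, carry }, chunk) => {
      const parts = (carry + chunk).split('\n');
      const tail = parts.pop(); // may be an incomplete token; keep it for later
      return { tokens: tokens.concat([parts]), carry: tail };
    },
    { tokens: [], carry: '' }
  );
  const all = carry ? tokens.concat([[carry]]) : tokens; // flush the leftover
  return [].concat(...all); // flatten: per-chunk arrays -> flat token list
}

parseChunks(['foo\nba', 'r\nbaz']); // -> ['foo', 'bar', 'baz']
```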
---
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/streams/issues/461#issuecomment-237000022

Received on Tuesday, 2 August 2016 18:38:21 UTC