- From: Alexandre Morgaut <notifications@github.com>
- Date: Tue, 23 Apr 2019 05:41:13 -0700
- To: whatwg/xhr <xhr@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/xhr/issues/244/485785893@github.com>
As a Green IT evangelist, I would say:

> "Such an implementation is great."
> "Please fail early and interrupt useless transfer and processing costs."

As a JS developer, I would say:

> "I may be interested to know that the request will eventually fail."
> "But my user may be frustrated if I don't at least handle the data I'm capable of handling..."

That's the whole power of streams and progress events: you can potentially provide services and features as soon as you start receiving the first bits of a response. For example, you can start to fill a grid with useful information for the user (see the first sketch below).

But for sure, for such usage:

- I would prefer using `fetch` and its streams over `xhr`
- I'd expect the backend to send chunked responses, not ones with a `Content-Length`
- BUT frontend devs don't always have control over backend implementations...

The same goes for images. What UX would you expect:

- a partially loaded image, or
- an image that is not shown at all?

The difference is probably that this choice is usually made automatically based on chunked vs. identity transfer encoding.

So what? My first approach would be "don't break the web", and this change may break it at some point. Probably in a very limited way, but to check that seriously, tests should be done in regions where people are using limited devices.

What about applying this optimization only when there are no progress event listeners (see the second sketch below)?

--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/xhr/issues/244#issuecomment-485785893
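To make the streaming point above concrete, here is a minimal sketch (not part of the original comment) of progressively filling a grid as response chunks arrive via `fetch` streams. The `/api/rows` endpoint and the `appendRowsToGrid()` helper are hypothetical.

```js
// Sketch: consume a fetch response progressively so the UI can start
// rendering as soon as the first chunks arrive.
async function loadGridProgressively() {
  const response = await fetch('/api/rows');
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each chunk can be decoded and rendered immediately, so the grid
    // starts filling even if the transfer later fails or is cut short.
    appendRowsToGrid(decoder.decode(value, { stream: true }));
  }
}
```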
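And for the proposed heuristic, a second sketch (again an illustration, with a hypothetical endpoint and hypothetical `renderPartial()`/`showTransferError()` helpers) of the kind of XHR consumer whose attached progress listener already relies on partial data:

```js
// Sketch: an XHR consumer handling partial data via progress events.
// Under the suggestion above, a request like this one would keep
// receiving data rather than failing early.
const xhr = new XMLHttpRequest();
xhr.open('GET', '/api/rows');
xhr.addEventListener('progress', (event) => {
  // responseText grows as chunks arrive; render whatever is usable so far.
  renderPartial(xhr.responseText);
  if (event.lengthComputable) {
    console.log(`received ${event.loaded} of ${event.total} bytes`);
  }
});
xhr.addEventListener('error', showTransferError);
xhr.send();
```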
Received on Tuesday, 23 April 2019 12:41:39 UTC