
Re: [whatwg] APIs to interrogate default outgoing HTTP headers, i.e. Accept-Encoding

From: Nils Dagsson Moskopp <nils@dieweltistgarnichtso.net>
Date: Tue, 11 Aug 2015 00:55:31 +0200
To: Ron Waldon <jokeyrhyme@gmail.com>, whatwg@whatwg.org
Message-ID: <877fp292jg.fsf@dieweltistgarnichtso.net>
Ron Waldon <jokeyrhyme@gmail.com> writes:

> On Tue, 11 Aug 2015 at 08:31 Nils Dagsson Moskopp <
> nils@dieweltistgarnichtso.net> wrote:
>
>> I do not understand that use case. It reads incredibly convoluted to
>> me. The UA controls the transport anyway – it should not make any
>> practical difference to a script how the data is transmitted.
>>
> My use case is centred around trying to optimise network usage when
> requesting content from AWS CloudFront backed by S3.
>
> I 100% agree with you that this should not be a script's problem. However,
> it is. When the server (CloudFront in this case) has raw and GZIP'ed copies
> of content, and no automatic server-side selection between the two, the
> only way to optimise network usage is for the script to make this
> determination.
>
> Unfortunately, there is no way to gain access to the default
> Accept-Encoding header from JavaScript, which is necessary to figure out
> whether to download raw or GZIP'ed content.
>
> So we currently do more hoop-jumping by serving a dynamic initial HTML,
> where the server constructing it can reflect the UA's Accept-Encoding
> header back to the client in a generated script tag. It's yucky.
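The reflection trick described above might look roughly like this. This is a hedged sketch of the general idea, not the poster's actual code: the function names (`buildReflectionTag`, `pickObjectUrl`) and the `window.ACCEPT_ENCODING` global are hypothetical.

```javascript
// Server side (assumed): when generating the initial HTML, reflect the
// request's Accept-Encoding header into an inline script tag so client
// code can read it later.
function buildReflectionTag(acceptEncoding) {
  // JSON.stringify escapes the value safely for embedding in a script.
  const value = JSON.stringify(acceptEncoding || '');
  return '<script>window.ACCEPT_ENCODING = ' + value + ';</script>';
}

// Client side (assumed): pick the GZIP'ed copy of an object only if the
// UA actually advertised gzip support in the reflected header.
function pickObjectUrl(acceptEncoding, rawUrl, gzipUrl) {
  return /\bgzip\b/.test(acceptEncoding) ? gzipUrl : rawUrl;
}
```

In the reflected page, the client would call something like `pickObjectUrl(window.ACCEPT_ENCODING, 'app.js', 'app.js.gz')` before fetching. The yuckiness the poster mentions is real: the initial HTML can no longer be a static, cacheable file.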

So the server is buggy (for your purposes) and you want another feature
to work around it. Introducing that will take considerable time and
resources. I suggest just waiting until the server can serve compressed
content transparently, or using a more functional server setup.
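For illustration, one common example of such a setup is nginx serving
precompressed files. This is an assumed configuration sketch, not the
poster's actual server:

```nginx
# Hypothetical "more functional server setup": nginx can transparently
# serve a precompressed file.gz sibling of each file when the UA sends
# Accept-Encoding: gzip, so no script-side negotiation is needed.
server {
    listen 80;
    root /var/www/site;   # assumed document root

    gzip_static on;   # serve file.gz if present (ngx_http_gzip_static_module)
    gzip_vary   on;   # add "Vary: Accept-Encoding" for intermediary caches
}
```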

> Beyond our own need for access to the Accept Encoding header, there may be
> other use cases that are supported by providing access to other headers.
>
>
>> Btw, why can AWS CloudFront not serve compressed content?
>>
> http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html
> "CloudFront doesn't compress the files itself"
> "Amazon S3 doesn't compress files automatically"
> AWS CloudFront will "do the right thing" if it is backed by a Custom Origin
> that honours the Accept-Encoding header (not by S3).
> We have repeatedly requested improvements from AWS, and they are likely on
> the way, but we have many hoops to jump through until then.
>
> When last I checked, many sites using AWS CloudFront and S3, including the
> first-party AWS Console itself, do not serve GZIP'ed resources, which is
> sub-optimal.

It may take considerably longer to spec a feature, ship it, and wait for
user agents to catch up than to just fix your server setup. If your setup
really is so buggy that it cannot serve compressed content (am I
understanding it correctly? It sounds so stupid!), switch to something
that can, instead of externalizing the costs to UA implementors and
future web developers, who will have one more interface to learn.

-- 
Nils Dagsson Moskopp // erlehmann
<http://dieweltistgarnichtso.net>
Received on Monday, 10 August 2015 22:56:12 UTC
