Re: [XHR] Open issue: allow setting User-Agent?

> I personally have contacted hundreds of sites for these types of issues
> over the past few years. We've done the education, outreach, evangelism,
> etc. Success rates are very low, the majority are simply ignored.


I'm sorry to hear that. I really am. I'm still trying to get people to
stop browser sniffing client-side. :(


>> I'm sorry but that's complete nonsense. The backend is the provider
>> of the data and has every right when it comes to its distribution. If
>> it's a mistake on the backend's side (they filter out while they
>> didn't intend to), just contact the backend's maintainer and have them
>> fix this server-side problem... well... server-side.
>>
>
> This isn't feasible. There's a whole web out there filled with legacy
> content that relies on finding the string "Mozilla" or "Netscape", for
> example. See also the requirements for navigator.appName,
> navigator.appVersion, document.all, etc. You can't even get close to
> cleaning up the mess of legacy code out there, so you work around it. And
> history repeats itself today with magical strings like "Webkit" and
> "Chrome".
>
> What of new browsers, how do they deal with this legacy content? The same
> way that current ones do, most likely -- by pretending to be something else.
>

The problem is that the same reasoning applies to CORS. We have
backends, today, that do not support it, and I'm not convinced they
actually want to prevent cross-origin requests coming from a browser.
The truth is, it depends on the backend. So why do we require server
opt-in when it comes to CORS? After all, it is just a limitation in the
browser itself. Surely there shouldn't be any issue, given that these
URLs are already fetchable from a browser as long as the page origin is
the same, and you can even fetch them using another backend, a shell,
or whatever other means.
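
To make that premise concrete, here's a minimal sketch (the URL is made
up): the same URL that CORS blocks cross-origin is trivially reachable
same-origin, and nothing enforces CORS outside a browser anyway.

    // Hypothetical page served from https://api.example itself:
    // same origin, so CORS never enters the picture.
    const xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://api.example/data');
    xhr.onload = () => console.log(xhr.responseText);
    xhr.send();
    // And outside a browser nothing enforces CORS at all, e.g.:
    //   curl https://api.example/data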

The problem is that backends expect this limitation to hold. Very few
do any access control of their own, because a browser running a page
from another origin is never supposed to reach them, so there is real
potential for abuse. The solution was to add an opt-in system: for
backends that are not maintained, behaviour is unchanged, while those
that want to support CORS have to say so explicitly.
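
For comparison, here is the opt-in as it works today (origins and URL
are made up): the browser sends the request, but only exposes the
response if the backend explicitly allows it.

    // Page on https://app.example requesting a different origin.
    const xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://api.other.example/data');
    xhr.onload = () => console.log(xhr.responseText);
    xhr.onerror = () => console.log('no CORS opt-in from the backend');
    xhr.send();
    // The response is only exposed if the backend answers with, e.g.:
    //   Access-Control-Allow-Origin: https://app.example
    // An unmaintained backend sends no such header, so for it nothing
    // changes.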

If we had a mechanism to do the same thing for modifying the User-Agent
header, I wouldn't even discuss the issue. The target URL authorizes
the User-Agent to be changed, the browser accepts the custom User-Agent
and sends the request, and any filtering that happened between the URL
and the browser is bypassed (solving the problem Hallvord gave of devs
working on one part of a site and having to deal with filtering above
their heads). It could work pretty much exactly like CORS handles
custom headers; see the sketch below. Hell, it could even be made
generic and could potentially solve other issues.
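
A minimal sketch of what I mean, assuming User-Agent were simply taken
off the forbidden-header list and funneled through the existing
preflight machinery for non-safelisted headers (the URL and UA string
are made up; today the setRequestHeader call below is just ignored):

    const xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://api.example/data');
    // Forbidden today; under this hypothetical it would trigger a
    // preflight, like any other non-safelisted request header.
    xhr.setRequestHeader('User-Agent', 'SiteDebugger/1.0');
    xhr.send();
    // Preflight the browser would send first:
    //   OPTIONS /data
    //   Access-Control-Request-Headers: user-agent
    // Opt-in the backend would have to return before the real request:
    //   Access-Control-Allow-Headers: user-agent

That would keep the default unchanged for every backend that never opts
in, which is the property I care about.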

What's proposed here is entirely different, though: it's an
all-or-nothing approach. Now I'm just trying to see whether there is
any potential danger here.


> <aside>
>
>> The burden of proof is on you. *You* ha
>>
>
> Emphasis with asterisks seems unnecessarily aggressive. Perhaps
> unintentionally so. :)
> </aside>
>
>
Sorry about that, not my intention at all. I'd love to be convinced,
and I'd just love it if Hallvord (or anyone, really) could actually
pull it off. So it's positive excitement, not negative. I hope my
answer above makes my reasoning a bit clearer (I just realized it
wasn't quite clear before).

Received on Thursday, 11 October 2012 15:28:02 UTC