W3C home > Mailing lists > Public > ietf-http-wg@w3.org > January to March 2007

RE: HTTP Request Forwarding?

From: Mike Schinkel <mikeschinkel@gmail.com>
Date: Thu, 8 Mar 2007 02:30:51 -0500
To: "'Travis Snoozy'" <ai2097@users.sourceforge.net>
Cc: "'Adrien de Croy'" <adrien@qbik.com>, <www-talk@w3.org>, <ietf-http-wg@w3.org>
Message-ID: <007f01c76153$b35265d0$0702a8c0@Guides.local>

> So, in other words, you're asking that user agents be
> forced to report the original request URL, and not the
> redirected URL, to the user. That seems like a
> presentational issue; at best, you might get a "SHOULD"
> level requirement out of it. 

No, "SHOULD" isn't strong enough. I'm asking for an alternate "MUST" because
people bookmark the location returned, which defeats the purpose,
*especially* now with the explosion of social media.  My point is: 

	If the URI authority wants to deem the 
	published URL instead of the serving 
	URL to be the canonical URL, it should 
	be given the tools it needs to do so.

Clearly my proposal won't work on existing clients, but what I'd like to see
is this: if the client understands HTTP/1.2 [1], then it MUST display the
canonical URL; if it only supports HTTP/1.1, then obviously it is okay for it
to behave as it does today.  OTOH, why not change the rules for 302,
assuming it wouldn't break anything (would it?).  After all, if it is a
temporary redirect, why should the temporary URL be displayed by the client
at all?
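To make the rule I'm proposing concrete, here's a minimal sketch in Python. Everything in it is hypothetical -- the function name, the version tuple, and of course "HTTP/1.2" itself are placeholders for the proposal, not anything in an existing spec:

```python
def display_url(requested_url, redirect_target, client_version):
    """Decide which URL the user agent shows to the user.

    Hypothetical rule from this proposal: a client speaking the
    newer protocol version MUST keep displaying the originally
    requested (canonical) URL; an HTTP/1.1 client keeps today's
    behavior and shows the redirect target.
    """
    if client_version >= (1, 2):   # hypothetical "HTTP/1.2"
        return requested_url       # canonical URL stays visible
    return redirect_target         # current HTTP/1.1 behavior

# An HTTP/1.1 client displays the temporary (serving) URL...
assert display_url("http://example.org/post/1",
                   "http://cdn.example.net/abc123",
                   (1, 1)) == "http://cdn.example.net/abc123"

# ...while a client understanding the proposed version keeps
# showing the published URL the user actually requested.
assert display_url("http://example.org/post/1",
                   "http://cdn.example.net/abc123",
                   (1, 2)) == "http://example.org/post/1"
```

The point of the version gate is backward compatibility: nothing changes for deployed HTTP/1.1 clients, and only clients that opt into the newer version take on the MUST.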

> In practice, though, redirects seem to do a pretty good
> job as it is. As it stands, many redirects are cacheable.
 
Actually, they do a horrible job. Have you ever noticed how many broken
links there are on the web?   (Yes, I know there is a logical disconnect in
my statement, but having "HTTP Request Forwarding" that doesn't obscure the
original URL would empower URL virtualization as a solution for minimizing a
significant portion of broken links. Requiring large static content (images,
video, etc.) to go through a proxy makes URL virtualization a non-option,
but if the heavy content can be handled as I propose, it opens a lot of
potential doors for new techniques to solve nagging problems.)
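For what it's worth, the indirection behind URL virtualization can be sketched in a few lines of Python. All of the URLs and names here are made up purely for illustration -- the idea is simply that the published URL is stable and only the mapping to the serving URL changes when content moves:

```python
# Hypothetical URL-virtualization table: the published URL is the
# stable, canonical one; the serving URL is wherever the content
# actually lives right now.
serving_map = {
    "http://example.org/2003/06/future-proofing":
        "http://host7.example.net/mt/archives/000123.html",
}

def resolve(published_url):
    """Forward a request for a published URL to today's serving URL."""
    return serving_map[published_url]

# When the content moves, only the table entry is updated; every
# bookmark and inbound link to the published URL keeps working.
serving_map["http://example.org/2003/06/future-proofing"] = \
    "http://host9.example.net/blog/future-proofing/"
```

The catch, as noted above, is that this only stays cheap if the heavy content doesn't have to be proxied through the resolver on every request -- which is exactly what request forwarding without URL obscuring would allow.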

BTW, one of my favorite examples related to the broken URLs problem is a
post from June 2003 entitled "Future Proofing Movable Type URLs" [2]. It has
121 comments and/or trackbacks. Of the links in those comments and
trackbacks, OVER HALF no longer resolve! You couldn't get a more biased
sample of people who should know better -- these are people who commented on
a blog post about future-proofing URLs -- yet over half hadn't maintained
those links less than 4 years later.  I'll bet it's costing untold millions
in lost productivity and causing important people to distrust the
reliability of the web, yet nobody's really paying any attention to it.  I'd
personally really like to work on this problem. Getting HTTP Request
Forwarding would be a first step.

-- 
-Mike Schinkel
http://www.mikeschinkel.com/blogs/
http://www.welldesignedurls.org
http://atlanta-web.org - http://t.oolicio.us

[1] I'm using "HTTP/1.2" as shorthand for a future version that supports
my proposal, as compared to the existing HTTP/1.1
[2] http://mar.anomy.net/entry/2003/06/22/17.15.00/
Received on Thursday, 8 March 2007 07:31:07 GMT
