Re: Standardizing Firefox's Implementation of Link Fingerprints

From: Travis Snoozy <ai2097@users.sourceforge.net>
Date: Tue, 10 Jul 2007 17:31:30 -0700
To: "Edward Lee" <edilee@mozilla.com>
Cc: "Roy T. Fielding" <fielding@gbiv.com>, ietf-http-wg@w3.org
Message-ID: <20070710173130.5007f1c3@localhost>

On Tue, 10 Jul 2007 17:09:18 -0700, "Edward Lee" <edilee@mozilla.com>
wrote:
> > The only component in the request chain that is even aware of
> > the fragment is the UA.
> This means the servers and networks don't need to be modified to
> support Link Fingerprints - in fact, they don't even know if a request
> uses Link Fingerprints. Modifying the servers to include a hash in the
> URI path requires deploying new software on both ends - a task that
> requires more resources than implementing Link Fingerprints in some
> browsers or download managers.
> Link Fingerprints is a bar-raising exercise that requires little/no
> effort from the parties involved. Servers don't change. Links have an
> additional #hash(). And in the common case of no failure, the end user
> sees nothing different from before, while existing clients function
> normally.
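
For concreteness, the entire UA-side mechanism boils down to something
like the following Python sketch. The #hash(algorithm:hexdigest)
fragment syntax is my reading of the proposal, and the function name
and the ignore-unknown-algorithm behavior are assumptions of mine, not
the proposal's text:

import hashlib
import re
from urllib.parse import urldefrag

# Assumed fragment syntax: http://example.com/file.zip#hash(sha256:hex)
FINGERPRINT = re.compile(r"^hash\((\w+):([0-9a-fA-F]+)\)$")

def passes_link_fingerprint(url, body):
    """Return False only when a fingerprint is present and mismatches;
    absent or unrecognized fingerprints leave behavior unchanged."""
    m = FINGERPRINT.match(urldefrag(url).fragment)
    if m is None:
        return True                 # plain link, nothing to check
    algo, expected = m.group(1).lower(), m.group(2).lower()
    if algo not in hashlib.algorithms_available:
        return True                 # unknown algorithm: assumed no-op
    return hashlib.new(algo, body).hexdigest() == expected

On a mismatch the UA would presumably discard the download; that is
the only user-visible failure mode.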

There is a tool for that. It's called RDF. Failing that, there are
Metalink XML files, which from what I recall support hashes of the
content (and of sections of the content). Failing even that, there's the
thing that everybody's done forever... putting MD5/SHA hashes and
PGP signatures of the file on the linking site. URLs are not supposed to
handle unique identification, nor are they supposed to handle integrity
checks. There are other technologies that are way better suited to
that job.
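
That last practice needs nothing new on the wire, either. A minimal
sketch of the client-side half, with a hypothetical file name and a
digest copied off the linking page:

import hashlib

def verify_download(path, published_sha256):
    """Compare a local file against the SHA-256 digest published next
    to the download link (path and digest here are hypothetical)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == published_sha256.lower()

# e.g. verify_download("release.tar.gz", "<digest from the linking page>")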

Everyone always argues "Oh, but it's a really easy backwards-compatible
way to implement feature X!" when trying to cram more cruft into the
meaning of a URL. That's because URLs are very much meant to be a
flexible, extensible, generic tool. However, you have to stay within
the framework for them to stay flexible, extensible and generic -- and
add-ons like this break as soon as you try to do two at once (e.g.,
fingerprints and Metalinks).
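
To make the "two at once" point concrete: a URL carries exactly one
fragment, so any two conventions that both claim it are mutually
exclusive. A quick illustration (both URLs hypothetical):

from urllib.parse import urldefrag

# One fragment per URL: a fingerprint and any other fragment-based
# extension cannot occupy the same link at the same time.
print(urldefrag("http://example.com/f.zip#hash(sha256:ab12)").fragment)
print(urldefrag("http://example.com/f.zip#some-other-extension").fragment)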

It's a one-trick pony. That trick might be a really big "wow" the first
time, but when you realize that's all it can do, you kinda get bummed.

Pick a single standard to describe the contents of a link. Stick
with it. If there isn't one (and it sounds like that's the case),
maybe there needs to be a new WG to make one. But making a million
different standards to do the same job, and overriding existing
standards to do a job they weren't meant to do, is not in anyone's best
interest (despite excellent intentions).

Received on Wednesday, 11 July 2007 00:31:37 UTC
