
[Bug 11402] One problem of todays JavaScript libraries is, that the client has to download the same library over and over again, while visiting multiple sites. One could use services like Google Libraries API for a central location, but that introduces new points of

From: <bugzilla@jessica.w3.org>
Date: Wed, 24 Nov 2010 23:42:45 +0000
To: public-html-bugzilla@w3.org
Message-Id: <E1PLOz7-0005CZ-BG@jessica.w3.org>

Aryeh Gregor <Simetrical+w3cbug@gmail.com> changed:

           What    |Removed                     |Added
                 CC|                            |Simetrical+w3cbug@gmail.com

--- Comment #3 from Aryeh Gregor <Simetrical+w3cbug@gmail.com> 2010-11-24 23:42:43 UTC ---
(In reply to comment #1)
> 3) On a page of your own, link to the script multiple times, each time
> specifying a different hash from the list in (1).

Which would do nothing.  In a sane implementation, the browser would just keep
a second index for its cache, indexed by the hash of the file.  This would
allow it to save some disk space (since it only needs one copy of each file). 
Then, if a site requested a resource using hash="", it would first check its
index for the file and return it if present.  The hash attribute would be
ignored when *storing* the file.
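For concreteness, here is a toy Python sketch of the two-index cache described above. The class and method names are hypothetical, not any browser's actual implementation; the point is just that lookups consult the hash index first, while stores always index by the digest of the bytes actually received:

```python
import hashlib

class HashIndexedCache:
    """Toy model: an HTTP cache with a second index keyed by file hash."""

    def __init__(self):
        self.by_url = {}    # url -> cached bytes
        self.by_hash = {}   # hex digest -> cached bytes

    def store(self, url, body):
        # The page's hash attribute is ignored when *storing*: we index
        # by the digest of the bytes we actually received, so a lying
        # hash attribute can never poison the hash index.
        digest = hashlib.sha256(body).hexdigest()
        self.by_url[url] = body
        self.by_hash[digest] = body

    def fetch(self, url, hash_attr=None):
        # If the page supplied a hash, check the hash index first,
        # regardless of which URL originally populated it.
        if hash_attr is not None and hash_attr in self.by_hash:
            return self.by_hash[hash_attr]
        # Fall back to the ordinary URL index; None means "go to network".
        return self.by_url.get(url)
```

Note that since both indexes point at the same bytes objects, each file is stored once, which is where the disk-space saving comes from.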

> This can potentially be mitigated by specifying particular hash algorithms that
> can be used, so the browser can verify that the script actually hashes to the
> provided value before committing it to the cache, but that still leaves us at
> the mercy of hashing algorithms being strong.  Had this been specified before
> we knew that MD5 was broken, for example, the attack described above would now
> be completely doable even *with* hash verification.

How so?  MD5 has collision attacks against it, but you'd need second-preimage
attacks here, which are much harder.  Even MD4, whose collision attacks can
reportedly be carried out by hand <http://valerieaurora.org/monkey.html>, has
no practical preimage attack.  (The best attack is
2^102 according to Wikipedia.)  The best preimage attack against MD5 is even
less practical (2^123.4).

On top of that, you'd need a preimage attack that allowed you to substitute
effectively arbitrary content.  Real-world preimage attacks might generate
preimages that are short strings of gibberish, which would be useless for this
purpose.

In the unlikely event SHA256 (for example) does get a practical second-preimage
attack against it anytime soon that's usable for this purpose, there will be
plenty of advance warning.  Papers will have been published months or years
before pointing out theoretical weaknesses and bringing attacks closer and
closer to reach.  There will be ample time to retire it.

(For instance, MD5 had theoretical vulnerabilities first published in 1996, but
the first practical break was around 2005 -- and that was only a collision
attack.  SHA256 has no theoretical vulnerabilities published yet at all, so we
probably have ten years or more before we need to worry about a break here.)

And of course, in the ludicrously implausible scenario that someone publishes a
practical preimage attack on SHA256 when there hadn't been significant
theoretical problems beforehand, even if they grossly violate ethical standards
and publish it with zero advance warning, and even if they include sample code
so that there's no delay before attackers get it -- even in this incredibly
extreme case, it's just a zero-day vulnerability that happens to hit all
browsers at once.  Release a trivial fix and it disappears overnight.  All you
have to do to stop it is just clear this special cache and ignore hashes of the
bad type.  It's not even a black-box-detectable behavior change; it just causes
extra page loads.
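The "trivial fix" could be as small as this Python sketch (the key format and function names are hypothetical): drop the broken algorithm from an allowlist and purge any cache entries keyed by it, after which affected resources simply fall back to ordinary URL fetches:

```python
# Hypothetical browser policy: which hash algorithms the hash
# attribute may use.  Entries are 'algo:hexdigest' strings.
ALLOWED_ALGORITHMS = {"sha256", "sha384"}

def retire_algorithm(cache_index, broken):
    """Remove a broken algorithm and purge its cache entries.

    cache_index maps 'algo:digest' keys to cached bodies.
    """
    ALLOWED_ALGORITHMS.discard(broken)
    for key in list(cache_index):
        if key.startswith(broken + ":"):
            del cache_index[key]   # must be re-fetched by URL now

def hash_is_usable(hash_attr):
    # A hash attribute using a retired algorithm is simply ignored,
    # causing an ordinary URL fetch -- extra page loads, nothing worse.
    algo = hash_attr.split(":", 1)[0]
    return algo in ALLOWED_ALGORITHMS
```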

(It would be very cool if we could skip the whole problem by using a provably
secure hash function:
You can construct hash functions whose collision resistance reduces to the
security of Diffie-Hellman, for instance, so if they get broken we have bigger
problems.  Sadly, they're all horribly inefficient, typically requiring lots of
modular exponentiation or such for even small messages.)
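As an illustration of that aside: a Pedersen-style hash is one such construction, where finding a collision would reveal a discrete logarithm (a problem closely tied to Diffie-Hellman's security).  The parameters below are toy-sized and chosen only for the sketch; a real instantiation needs a cryptographically sized prime-order group:

```python
# Toy Pedersen-style hash of a two-block message (m1, m2).
# A collision (m1, m2) != (m1', m2') would give
#   g^(m1 - m1') = h^(m2' - m2)  (mod p),
# i.e. it would leak log_g(h) -- so collisions are as hard as
# computing that discrete log.  Toy parameters, not for real use.
p = 2**127 - 1          # a Mersenne prime, used as a toy modulus
g, h = 3, 5             # fixed bases with unknown mutual discrete log

def pedersen_hash(m1, m2):
    # One modular exponentiation per message block -- exactly the
    # inefficiency complained about above.
    return (pow(g, m1, p) * pow(h, m2, p)) % p
```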

The real problem with this is that it will bitrot.  If you update the file but
don't update the hash in every single HTML file referring to it, then a bug
will occur only for users who happen to have the old file in cache, which will
be impossible to reproduce for other people.  Even if the user clears cache,
another site might be repopulating it, so the bug will recur for them but not
the site admin.  It's not obvious that the saved page loads are realistically
worth the danger of this pitfall.  (Cf. resource packages.)
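A toy model of that bitrot scenario, assuming a cache keyed by file hash (the file contents are made up for illustration):

```python
import hashlib

sha = lambda b: hashlib.sha256(b).hexdigest()

old = b"function f() { /* v1, buggy */ }"
new = b"function f() { /* v2, fixed */ }"

# This particular user still has v1 in the hash-indexed cache.
cache = {sha(old): old}

# The admin updated the file on the server but forgot to update the
# hash attribute in one HTML file, which still carries sha(old).
stale_hash = sha(old)

# Hash hit wins over the (updated) URL, so this user gets the old file
# and sees the v1 bug ...
served = cache.get(stale_hash, new)
assert served == old

# ... while a user with an empty cache gets v2 and can't reproduce it.
assert {}.get(stale_hash, new) == new
```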

Configure bugmail: http://www.w3.org/Bugs/Public/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the QA contact for the bug.
Received on Wednesday, 24 November 2010 23:42:47 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Wednesday, 24 November 2010 23:42:58 GMT