WikiLeaks - Web Architecture and Robustness

There's a recent blog post from Ethan Zuckerman about Amazon and WikiLeaks.

Summary: I will not introduce WikiLeaks; unless you have been living under a rock these last few weeks, you know what is happening. Over the last few days, WikiLeaks has been hit by several massive DDoS attacks. They decided to move the hosting of their assets from their own servers in Sweden to the cloud service provided by Amazon. But Amazon decided to unplug WikiLeaks, and they had to go back to their own servers.

Why I'm talking about this:
It is important to see how the Web architecture, with its strengths and weaknesses, behaves when the stakes are very high. There are a few things in play:

* Politics
* Law
* Technical robustness
* Information flow

As Ethan mentioned, the documents themselves are distributed through BitTorrent and were therefore not really affected by the DDoS; the Web presence, however, definitely was. I was then wondering which parts of the Web architecture would have to be improved or modified to be as robust as BitTorrent in such circumstances, without losing the benefits of URIs.
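
One way to picture that robustness: keep a single stable URI for the resource, but let many mirrors serve the bytes, with the client verifying them against a known digest, much as BitTorrent verifies pieces against the info-hash of a torrent. A minimal sketch in Python, assuming hypothetical mirror URLs and a placeholder digest:

    import hashlib
    import urllib.request

    # Hypothetical mirrors for one logical resource; the stable URI
    # the user bookmarks never changes, the mirrors are replaceable.
    MIRRORS = [
        "http://mirror1.example.net/cables/index.html",  # hypothetical
        "http://mirror2.example.net/cables/index.html",  # hypothetical
    ]
    EXPECTED_SHA256 = "0" * 64  # placeholder digest, for illustration only

    def fetch_verified(mirrors, expected_sha256, timeout=10):
        """Return the first mirror body whose SHA-256 matches the digest."""
        for url in mirrors:
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    body = resp.read()
            except OSError:
                continue  # mirror unreachable (or DDoSed): try the next one
            if hashlib.sha256(body).hexdigest() == expected_sha256:
                return body
        raise RuntimeError("no mirror returned content matching the digest")

Under that scheme, taking one host (or one hosting company) offline degrades the site instead of removing it, and the URI stays valid.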

    In …My heart’s in Accra » If Amazon has silenced Wikileaks…
    At http://www.ethanzuckerman.com/blog/2010/12/01/if-amazon-has-silenced-wikileaks/

    Update: It’s worth mentioning that Wikileaks is 
    using peer to peer networks to distribute the 
    actual cables. DDoS may be effective in removing 
    their web presence, but it’s going to have a much 
    harder time removing the sensitive material from 
    the internet. The DDoS attacks are actually a 
    useful reminder that we still don’t have a good 
    way to serve web sites on a purely peer to peer 
    architecture. That would be one response to the 
    problems of consolidation I’m talking about here…
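
Ethan's last point, about serving web sites on a purely peer-to-peer architecture, suggests content addressing: naming a page by a digest of its bytes rather than by the host that serves it, so that any peer holding matching bytes is an equally authoritative source. A small sketch, using a made-up hash:// scheme purely for illustration:

    import hashlib

    def content_address(body: bytes) -> str:
        """Derive a location-independent name from the bytes themselves."""
        digest = hashlib.sha256(body).hexdigest()
        # "hash://" is not a real, registered URI scheme; it only
        # illustrates what a content-addressed name could look like.
        return "hash://sha-256/" + digest

    page = b"<html><body>a mirrored document</body></html>"
    print(content_address(page))
    # Any peer that can produce bytes with this digest can serve the
    # page, so unplugging one host does not invalidate the name.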


-- 
Karl Dubost - http://dev.opera.com/
Developer Relations & Tools, Opera Software
