Re: [SocialSwarm-D] D-CENT: state of the art - not

On 02/27/2014 05:07 PM, carlo von lynX wrote:
> On Thu, Feb 27, 2014 at 04:44:22PM +0100, Harry Halpin wrote:
>> Almost all the academic work you cited was using social-networking
>> based routing. IMHO reputation systems are a effectively a variant
>> of social networking where the reputation is implicit rather than
>> explicit as in social networking. In that way, you can consider
>> client-server systems kinda the same thing where you pick a
>> "trusted" server (trusted because you know the sysadmins, trusted
>> because it's not down all the time, etc.). Again, all of these are
>> basically non-technical solutions (reputation is a social rather
>> than technical construct), but rely on social factors.
> This is more of a political statement than a technical one.
> You are arguing that Facebook and Google are legitimately
> being entrusted by their users while their users to a large
> extent do not feel like they really have a choice. Also they
> would expect usage of these servers to be respectful of
> constitutional rights, which technically isn't possible.
> The abuse isn't even technically measurable.

No, I am arguing that the "fixes" you cited for sybil attacks were 
social fixes rather than technical ones (and ones that usually expose 
the user's social graph for traffic routing, which at least for me is 
an unacceptable trade-off in terms of privacy).

I support data portability, privacy, and the user's right to choose 
their own place to store data. Standards exist to provide 
interoperability between systems, and thus give users the ability to 
trust whatever system they want. Again, users will vote with their feet 
if any system reaches maturity. Standards generally exist between 
systems that would otherwise be mutually incompatible and that already 
have user-bases, in order to give users more freedom.

>
> Whereas you could imagine that people do not have a problem
> to entrust the social graph with protecting social graph
> information (aka transaction metadata), as that is a quite
> reasonable choice they make in their daily lives each day.

Of course, we support standards. However, the Web is not a p2p system, 
if a p2p system is strictly defined as one in which any node can 
*directly* connect with any other. It's client-server. Same with Tor 
(i.e. you access Tor relays via installing a Tor client). If you wish 
to define p2p in another way, that's fine, but explain your usage, as 
the term is often thrown around in vague ways.

In general, making claims like "sybil-proof routing" without regard to 
the needs of actual users is unlikely to be helpful in communicating 
with people who don't have an ideological or personal reason for 
believing a single approach or system is a "magic bullet", and it is 
generally not the approach taken by those with expertise. For example, 
people from Tor do not claim their system is a "magic bullet", but 
rather that it is useful for bursty traffic like Web-browsing traffic 
against a non-global adversary.
>
>> Feel free to bring it up with Roy and ping us back when Secushare
>> starts getting adoption. But again, strictly speaking, p2p internet
>> routing is out of scope for Web standards. You can always bring
>> these kind of protocols up at the IETF, where they would both be in
>> scope and have an outside security review.
> In other words privacy and other constitutional principles
> are out of scope for the W3C since the "web" as we know it
> is architecturally unable to remedy to the problems at hand.
>  From my point of view W3C is supporting illegal activity on
> a global scale, so it should seriously reconsider its priorities.
> I know that everyone has been acting in good faith and there have
> been years of efforts to support concepts of privacy on the web,
> but since last summer it should be clear that the entire approach
> is fallacious and the question to pose yourself is, do you really
> want to stand on the wrong side of history?
>
> And no, don't tell me there are solutions to the problems that
> are web-based as I haven't seen a single one of those stand the
> test of scientific logic.

I think there are merits both in high-risk efforts such as "reinventing 
the Internet" and in building on the mass-deployed Internet and Web 
that people already use. The W3C is of course focussed on the latter, 
but some other standards bodies (ISO comes to mind, albeit with the 
legacy of OSI) are open to the former. If you wish to be introduced to 
folks in the Security Area at the IETF STRINT meeting, I'm happy to do 
so.

In terms of scientific logic, in general the best way is to submit a 
system to peer review in standards bodies and in academic conferences 
on security and privacy, which generally have people with adequate 
expertise. Tor did that, and it's helping a lot. I do not believe that 
has happened with most of the systems people are discussing, which is 
one of the useful points of a Working Group around social. However, we 
will scope that Working Group to focus on Web-based approaches, given 
it's at the W3C.

    cheers,
        harry

Received on Thursday, 27 February 2014 19:15:57 UTC