- From: Nick Jennings <nick@silverbucket.net>
- Date: Sat, 1 Jun 2013 20:50:29 +0200
- To: Sandeep Shetty <sandeep.shetty@gmail.com>
- Cc: "Michiel B. de Jong" <anything@michielbdejong.com>, public-fedsocweb <public-fedsocweb@w3.org>
- Message-ID: <CAJL4WtYDaeK7Z4Z_PC94+sKeT-QzatDGZmpunb0X_7hxs3wwQQ@mail.gmail.com>
On Fri, May 31, 2013 at 3:02 PM, Sandeep Shetty <sandeep.shetty@gmail.com> wrote:

> On Fri, May 31, 2013 at 6:06 PM, Michiel B. de Jong
> <anything@michielbdejong.com> wrote:
> > I think in a polyglot mindset we should allow both email-like and
> > URL-like identifiers, and I even think that it is the only way forward.
> > If we can apply polyglot thinking at that most basic level of the user
> > identifier, then we will also be able to apply it at all the other
> > levels, and can achieve interop without having to discuss the
> > superiority of certain design choices over others. It is actually a
> > beautiful thing that all our systems are so different and unique;
> > that's part of the richness! :) Let's try to federate them with each
> > other in a polyglot way.
>
> This polyglot "adapter" approach comes with high "costs". IIRC, even
> Friendica Red decided to limit what it can connect to [1]. At a basic
> level there need to be a few building blocks to support interop;
> otherwise the cost of integrating with every system out there is too
> high (remember, integration is not just a one-time task, and anyone who
> has integrated with more than one system knows about this ongoing
> maintenance "cost").

I agree. I think if we use a middle-man building block between the application and the various server-side APIs or protocols, then we mitigate that cost considerably.

Traditionally what we've seen is large projects picking and choosing which other large projects they will interoperate with, because of the costs involved in implementing support for other platforms. It's a recipe for failure, IMHO. I think we should worry less about interoperability and more about decoupling our approach to address smaller areas of concern; that way each piece has a much higher chance of remaining relevant and useful to developers. I think the many different protocols out there on the web are a good thing, and are the price we pay for innovation and progress.
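As an aside on the polyglot-identifier point quoted above: a minimal sketch of what accepting both email-like and URL-like identifiers might look like. The function name and the normalization rules here are my own illustration, not anything from a spec discussed in this thread.

```python
from urllib.parse import urlparse


def normalize_identifier(raw):
    """Classify a user identifier as URL-like or email-like and return a
    (kind, canonical) pair. Purely illustrative; real systems would apply
    stricter validation than this sketch does."""
    raw = raw.strip()
    # URL-like: an http(s) URL with a non-empty host part.
    if raw.startswith(("http://", "https://")):
        if urlparse(raw).netloc:
            return ("url", raw)
    # Email-like: exactly one "@", with a dotted domain after it.
    if raw.count("@") == 1:
        user, _, host = raw.partition("@")
        if user and "." in host:
            return ("email", f"acct:{user}@{host.lower()}")
    raise ValueError(f"unrecognized identifier: {raw!r}")
```

The point is only that dispatching on identifier shape is cheap at the edge of a system, so supporting both forms need not force one style on everyone.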
It's unrealistic to expect everyone to conform to a protocol that may not specifically fit their needs; perhaps they think they have a better way to do things, or the system they have simply grew organically into what it is. I think we've got to accept the fact that things will continue to change faster than we will be able to, and adjust our approach accordingly.

> On the flip side, maybe this could be the reason only a few of the
> systems that are "friends" with each other will survive and grow, and
> achieve network effects, while the rest of them die out, because
> without the basic building blocks for interop they will have no
> options if they cannot "convince" others to integrate using "their
> approach".

Perhaps. Regardless of the reasoning, it's safe to say some protocols will die out while others pull ahead in popularity. If our back-end and front-end systems are not tightly coupled together, we have a better chance of building long-lasting applications that users can keep using beyond the life of any handful of APIs, protocols, social networks, or fads of the day, while still being able to take full advantage of those fads and not be 2+ years behind everything. I say it's not really about the protocol(s); it's about our approach to developing open alternatives for end-users.
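To make the "middle-man building block" idea concrete, here is one possible shape for such a decoupling layer: the application codes against a small interface, and each protocol lives behind its own adapter. All names here are hypothetical; this is a sketch of the decoupling argument, not any project's actual API.

```python
from abc import ABC, abstractmethod


class ProtocolAdapter(ABC):
    """Hypothetical middle-man layer: the application talks only to this
    interface, and each federation protocol gets its own adapter, so the
    front end outlives any single protocol."""

    @abstractmethod
    def send_message(self, recipient, body):
        """Deliver a message to a remote identifier."""

    @abstractmethod
    def fetch_inbox(self):
        """Return the list of received message bodies."""


class InMemoryAdapter(ProtocolAdapter):
    """Toy backend standing in for a real protocol implementation."""

    def __init__(self):
        self.outbox = []
        self.inbox = []

    def send_message(self, recipient, body):
        self.outbox.append((recipient, body))

    def fetch_inbox(self):
        return list(self.inbox)
```

Swapping protocols then means writing one new adapter rather than rewriting the application, which is the maintenance-cost argument above in code form.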
Received on Saturday, 1 June 2013 18:51:29 UTC