
Colloquial Web CG

From: Sean B. Palmer <sean@miscoranda.com>
Date: Tue, 16 Aug 2011 16:52:27 +0100
Message-ID: <CAH3-oEcC6q86WtRK4OGoGFtgpn9WgDsYzQOXS1=qRPeeDqWyKg@mail.gmail.com>
To: public-colloquial@w3.org
Today the W3C announced the new community site:

http://www.w3.org/QA/2011/08/subject_from_innovation_to_sta.html

The Colloquial Web CG is part of that site, and now that it's public
we can ask people to join, and talk freely about what we're doing.

So, what are we doing?

The colloquial web is a point of view rather than an architecture or a
system. It's the idea that how we use the web, and what we do on the
web, evolve differently from the web technologies that ought to cater
for us. Sometimes this dissonance or tension is trivial. Sometimes it
is more obvious and irritating. This CG is supposed to look at the
differences, and comment on what to do about them.

As a simple example, consider the role of validation on the modern
web. How many people have valid sites? How many people validate their
sites? If many more people validate their sites than have valid sites,
why aren't they fixing things? What benefit is there if you make your
sites valid? What kind of errors can't validation point out, and how
have people found ways to cope with such errors?
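To make the kind of checking concrete, here is a toy sketch in Python's
standard library — nothing like a real validator, just a crude
tag-balance check — of one class of error such tools automate:

```python
from html.parser import HTMLParser

# Void elements never take a closing tag in HTML.
VOID = {"br", "hr", "img", "input", "meta", "link", "area", "base",
        "col", "embed", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Crude check for mismatched open/close tags; a real validator
    checks far more (content models, attributes, nesting rules)."""
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def close(self):
        super().close()
        # anything still open at end-of-input was never closed
        self.errors.extend(f"unclosed <{t}>" for t in self.stack)

def check(markup):
    checker = TagBalanceChecker()
    checker.feed(markup)
    checker.close()
    return checker.errors
```

Even a check this crude will flag markup that renders perfectly well in
a forgiving browser, which is part of why people stop caring.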

Changing notions of conformance are of course nothing new, but theory
here seems to fall far short of practice. I recently saw an article
describing how, when the BBC wanted to update their website, they
checked that the new design worked by taking screenshots of the
resulting pages and comparing them pixel for pixel with the old
design. Markup validity didn't matter to them, just the screen
rendering. They couldn't test the UI that way either, but they seemed
to be proud of the results.

This method of testing, which may have just been one amongst many,
nonetheless shows that great reliance is placed on the resulting
rendering of a page, not on the state of the markup, which is an
intermediate affair. As sites grow more complex, using more DOM
manipulation and plugins, testing will only become more difficult.
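The comparison itself is simple to sketch. Assuming the screenshots
have already been decoded into equal-sized grids of (r, g, b) tuples —
a real pipeline would use an imaging library to load the PNGs — the
diff is just:

```python
def pixel_diff(old, new):
    """Return the fraction of pixels that differ between two same-sized
    screenshots, each given as a list of rows of (r, g, b) tuples."""
    if len(old) != len(new) or any(len(a) != len(b)
                                   for a, b in zip(old, new)):
        raise ValueError("screenshots must have identical dimensions")
    total = sum(len(row) for row in old)
    changed = sum(1 for row_a, row_b in zip(old, new)
                    for pa, pb in zip(row_a, row_b) if pa != pb)
    return changed / total if total else 0.0
```

A threshold of zero means the redesign must render identically; in
practice a small tolerance is needed for anti-aliasing differences.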

Conformance isn't the only thing we can study, of course. It would be
interesting to know how trends in making websites are changing, and
what they're changing towards. Attitudes towards content creation have
changed massively in the past decade. Whereas social media sharing and
CMSes rule the roost now, ten years ago people were probably more
likely to be uploading their files using FTP to a hosting company that
wouldn't last more than a year.

This all has an effect on the technologies that we use. When our
content is locked into proprietary silos, it's hard to manage a
coherent presentation of what we put online. We are faced with
managing an ever-increasing number of accounts and even identities on
the web, with a decreasing chance of ever being able to consolidate
them. Technologies such as OAuth promise to help us manage our data by
letting us delegate fine-grained access control to the companies that
we trust, but they fail when those companies ask for more than they
need, leaving us no alternative when the competition is slim.
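To make the delegation idea concrete, here is a minimal sketch of
building an OAuth-style authorization request that asks only for the
scopes it needs. The endpoint, client id, and scope names here are
invented for illustration:

```python
from urllib.parse import urlencode

def authorization_url(endpoint, client_id, redirect_uri, scopes):
    """Build an OAuth-style authorization request URL asking only for
    the scopes listed -- the fine-grained delegation described above."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),  # space-delimited scope list
    }
    return endpoint + "?" + urlencode(params)
```

A well-behaved client would request just "read:photos" rather than
everything; the complaint above is that many don't, and the user's
only choices are to grant it all or walk away.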

Web technologies have always lagged behind practice, and politics has
played as much of a part in their development as science. Sometimes it's
impossible to effect the change that we'd like to bring about. But I
think we should at least be documenting as we go, providing good data
and good arguments to people who might be interested. Perhaps we can
start by identifying some areas in which a colloquial viewpoint will
be most illuminating.

-- 
Sean B. Palmer, http://inamidst.com/sbp/
Received on Tuesday, 16 August 2011 15:53:04 GMT
