legacy of incompetence? [was: a compromise to the versioning debate]

On Sun 4/15/2007 7:02 AM Alexander Graf

>Why even care about incompetent web developers? I don't want to start 
>a discussion about the trade but seriously, being a web developer 
>requires some skill that a lot of people just don't have. Catering 
>for them and trying to make HTML as simple as possible so that these 
>people won't have any problems is just weird. 
This got me thinking. My initial reaction to it was quite negative (in some sort of fundamental way). Let me first say why it was. After that, I'll say how I mellowed a bit upon further reflection.
It runs counter to a couple of WHATWG's proposed design principles (http://esw.w3.org/topic/HTML/ProposedDesignPrinciples): certainly "Pave the Cowpaths" and "Priority of Constituencies", and possibly "Don't Break the Web", but then I recently argued against the adoption of those very principles (http://lists.w3.org/Archives/Public/public-html/2007Apr/0679.html), so I am hardly the one to rely on them to make my case. 
I thought the WG charter had language running counter to this perspective, but on reanalysis, the closest I could find was:
The Group will define conformance and parsing requirements for 'classic HTML', taking into account legacy implementations;
It would be a bit of a stretch to claim this means we have to support EVERY peculiar piece of HTML ever successfully rendered in some browser. 
But my negative reaction to the above was not based on statutory considerations; it rested on other grounds, like the relativism associated with judgments of incompetence. Different people and organizations use HTML for very different reasons. 
I was at a conference (of fellow academics) in 1993 when someone announced that the number of .com sites on the Internet had, for the first time, exceeded the number of .edu sites. The audience gasped in dismay. "There goes the neighborhood." Many of us had seen it coming, but some were smug in the belief that the "real world" would never discover the academic playground that we had used to collaborate, play, and research for the past couple of decades. A lot of those same academics are, to this day, still writing bad HTML to convey very good content. They don't hire web developers; they don't particularly care about CSS ("isn't that just how you make stuff pretty?" I have heard several faculty ask); they do use tables and blockquote to lay out their content, since so far as they are concerned HTML IS "presentation" and not "semantics." The meaning is their text and pictures (the natural science, art, history, medicine, engineering, and social science) that, before HTML, they used anonymous ftp and then gopher to convey to students and colleagues. HTML is just how you present the stuff. Some of us (me, for example) learned the little bit of HTML we cared about at NCSA prior to the release of Mosaic. It looked like a cute sort of thing -- not quite as cool as WAIS, but the fact that pictures were downloaded automatically and embedded in the page revealed that it had some potential. The Ted Nelson sort of hypertext concept was central rather than peripheral, as in gopher, and that showed signs of forward thinking. 
These academic HTML authors are the ones who seeded the Internet and the web with just enough real content (as opposed to pointers to GNU archives and RTFs and other esoterica peculiar to some of the most esoteric parts of computing) that humans actually wanted to get to it. 
Another proposed design principle: Don't disenfranchise your ancestry! (It may turn your bend dexters into bend sinisters, and parsing backslashes gets weird sometimes.)
If we break higher education -- we'll lose not only faculty and their delightfully illiterate incompetencies in the art and science of web development -- but we will lose the students as well -- since the faculty will just find other ways of doing things. If HTML gets too modular (CSS = presentation; JavaScript = function; HTML = some oddball concept of semantics that does not map to the semantics of "semantics" at all well for most folks) then faculty will find it too hard to do and will find other places to present their material (see http://blogs.law.harvard.edu/cyberone/ for example). And those very students are the ones some of you web developers and browser developers might like to be able to hire when it comes time to actually build browsers to do HTML6 (or will it be 7? ... I've lost count) in 2015. And one of the problems the web now faces is that the same faculty who know rudimentary HTML have now (thanks to their wonderfully rich browsers) seen moving OWL diagrams for music proximities and Flash demos that make them want to animate their lectures, and they are beginning to say -- hey, I can display four-dimensional red-shift data on galaxies with that, or I can let students simulate heart surgery with that. It'd be nice if they could use HTML for that. But if not, being a resilient lot, they'll figure out some way.
But (the clouds begin to move) -- as long as we don't make HTML too weird and difficult, then I think most faculty won't really care too much if their 15 years' worth of lecture notes begin to look funny in new browsers. They'll learn how to make their new pages conform to the new browsers, and every couple of years they'll fix a dozen or two of their old pages, grumbling a bit about the computing industry as they do so. No great catastrophe will in fact occur.
So I have had a bit of change of heart, perhaps. Should our concern for preserving "ill-formed" legacy content on the web really cause an impasse between the major browser developers? I suppose not. Such an impasse breaks a whole lot more than the 772 million sites that Google gives me in answer to the query "education." At least so I suspect.
But it does make me wonder... Folks in the user-interface community (like ACM-SIGCHI) often grumble about how the software developers build some darn collection of algorithms and then come to the interface specialist and say "here... build us an interface for this." According to them, the interface design really ought to begin earlier, rather than being tacked on as an afterthought. One way that good development efforts often begin is through some sort of constituency analysis. Who exactly are our prototypical web authors?
Would it make sense to sit down and sketch out the six or eight or seventeen primary types of web developers (and users), make a brief stab at identifying their needs (perhaps by actually rounding some up and giving them a questionnaire of some sort), or at least guarantee that those constituencies are represented here, and then use that to figure out just what we are talking about when we're talking about breaking things?
For example:
browser developers (Apple, Microsoft, Mozilla, Opera, etc.)
corporate sites whose business is primarily internet based (Google, Amazon, EBay, etc.)
large sites with mission-critical dependence on web (governments, health care, universities)
web development consultants and companies (those who build pages for other companies)
stand-alone single authors (faculty, bloggers, wiki-contributors)
people who are currently in read-only mode (the folks who visit web pages)
I suspect Apple and Microsoft, given their large historic interest and investments in interface design, probably already have data germane to these questions. 
Who are the people of the web?
Do they fall into natural categories (in some multidimensional scaling or cluster analytic way)?
If so, how much content has been created by those belonging to each of those categories?
If such a study were done, would it actually help a large and multifaceted group such as this WG make decisions more intelligently? Could such a study be cobbled together from existing data? Could such a study be ongoing while the WG finishes others of its assigned tasks? I don't know, but that's the final place my thoughts sort of came to rest upon reading Alexander's note.
In the meantime, instead of the term "incompetent," could we agree on "differently abled" or something? I know it sounds awfully PC, but "incompetent" just rankles a good half of the world, and that runs rather contrary to some of the working group's charter, I suspect. (I'm just imagining the headlines -- "HTML WG vows to stamp out web incompetency!" -- it doesn't play well in Poughkeepsie.)
David Dailey

Received on Sunday, 15 April 2007 14:41:33 UTC