- From: William Loughborough <love26@gorge.net>
- Date: Fri, 30 Mar 2007 17:33:11 -0700
- To: semantic-web@w3.org
A colleague in the world of Web Accessibility ( liam.mcgee@communis.co.uk ) asks:

_______________________________________________________________________

If one had access, as Google does, to an index of 100 billion pages of the web as a data set, what questions would you like to ask it?

I would like to find out how many pages contain H1 tags and how that varies over time (and indeed how many contain h1 and h2, and how many contain h1, h2, and h3) as a proxy for semantic structure. Taking the semantic temperature of the web, as it were.

I'd also want to know how many pages have a 'skip to...' link in them, and again how that changes over time.

(And while I'm thinking along these lines, it'd be great to get usage figures for the W3C validator services over time. It would make for a good story: 'more and more designers getting valid'.)

***********************************************************************

Anybody have a handy-dandy way to check structural practices like this?

Love.
Received on Saturday, 31 March 2007 00:33:43 UTC
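
As a starting point for the per-page check being asked about, here is a minimal sketch using only Python's standard-library html.parser. It assumes pages have already been fetched and saved locally, one file per page; the 'skip to' test is a simple heuristic on anchor text, and the script is an illustration, not a crawler. Aggregating its per-page output across dated samples of an index would give the over-time trend lines described above.

#!/usr/bin/env python3
"""Tally, per HTML document, which of h1/h2/h3 appear and whether
any link text begins with 'skip to'. Standard library only."""
import sys
from html.parser import HTMLParser


class StructureChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.headings = set()    # which of h1/h2/h3 were seen
        self.skip_link = False   # any 'skip to ...' link found?
        self._in_anchor = False

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.headings.add(tag)
        elif tag == "a":
            self._in_anchor = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_anchor = False

    def handle_data(self, data):
        # Heuristic: anchor text starting with 'skip to' counts
        # as a skip link, regardless of case.
        if self._in_anchor and data.strip().lower().startswith("skip to"):
            self.skip_link = True


def check(html_text):
    checker = StructureChecker()
    checker.feed(html_text)
    return checker.headings, checker.skip_link


if __name__ == "__main__":
    # One saved page per file name on the command line;
    # prints a simple per-page report suitable for aggregation.
    for path in sys.argv[1:]:
        with open(path, encoding="utf-8", errors="replace") as f:
            headings, skip = check(f.read())
        print(path, sorted(headings),
              "skip-link" if skip else "no-skip-link")

Running it over, say, monthly samples of the same URL set and counting the lines that report each combination would give the h1 / h1+h2 / h1+h2+h3 proportions and the skip-link proportion as time series.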