Now that's food for thought.
If I had the time and access to the available programs, I wonder whether running a standard test document through each web authoring tool would show which ones produce more or fewer errors under typical use conditions? Well, gotta run.
Peace out, Greg
--- On Thu, 5/1/08, Karl Dubost <firstname.lastname@example.org> wrote:
From: Karl Dubost <email@example.com>
Subject: Re: web site programming.
To: "Rick Merrill" <firstname.lastname@example.org>
Date: Thursday, May 1, 2008, 9:40 PM
On 2 May 2008, at 04:53, Rick Merrill wrote:
> I have a suggestion to float: How about adding a summary
> page on the Validator web site where users (like us) can
> add Validator results based on what software was used.
> For example,
> DreamWeaver: 15 pages; 25 errors; html 4.0 transitional
The only way to do that painlessly for authors is to collect a meta name
from the head of each file. I suspect you want compiled stats on how
badly some authoring tools perform, or maybe something else.
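Most authoring tools already write something along these lines into the
document head; a minimal sketch, assuming the usual "generator" meta name
(the exact content string varies by tool and version):

  <head>
    <!-- hypothetical content string, for illustration only -->
    <meta name="generator" content="Adobe Dreamweaver CS3">
    <title>Example page</title>
  </head>

The Validator could read that value and file each result under it.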
But the data set will be of poor quality:
- not a huge number of pages carry the meta.
- more than one authoring tool may be involved in generating a page.
- authors change authoring tools but forget to update the meta.
It will be very hard to get relevant statistics under such open conditions.
Karl Dubost - W3C
Be Strict To Be Cool