I can understand that.
My only point in asking the original question is that, with all of the experience of the members here,
there might be a general consensus as to which HTML or XML web site or page builder would provide the most consistent results with no errors.
As much as I do like fancy web sites, I would settle for making a small web site out of one of my portals with no errors.
The hope is that it would be viewable regardless of what browser or operating system the person uses to view it.
From what I have gathered from Rick is that he feels the same way.
So we are not so much trying to rock the boat as trying to find a solution that would work better than guessing at which program is best to use.
--- On Sat, 5/3/08, Rui del-Negro <email@example.com> wrote:
From: Rui del-Negro <firstname.lastname@example.org>
Subject: Re: web site programming.
To: "Rick Merrill" <email@example.com>
Date: Saturday, May 3, 2008, 2:45 PM
>>> And the Big League web sites - how about validating them?
>> How about it? The W3 isn't the HTML conformance police. I'm sure
>> if the people coding those sites want to validate their code, they will
>> be able to find the validator (ex., by searching on... oh, I don't
>> know, Google or Yahoo ;).
> What I am saying is to create a COMPETITION that the public can see.
The "public" has no idea what a validator is, or what it means for a page
to be "valid". They don't understand that pages are not just for
browsers, or that something that can be rendered by their browser might
not work on a different one, or might cause a search bot to choke (as this
thread shows, some people can't even tell the difference between creating
HTML markup and "web site programming"). And, even if average people
cared, the chances of them locating the W3 validator and looking for some
statistics there before deciding which HTML authoring software to buy are slim.
There is no way to guarantee that a given page, tested on the validator,
was completely generated by a specific package. So besides not being very
useful, those statistics would be unreliable. Maybe the error was in a
part the user had hand-coded, so the validator would think the package
produced an error, when it hadn't. Or maybe the user had fixed some errors
by hand, so the validator would think the package produced compliant
mark-up when it didn't. Or maybe the user deleted the "generator" meta tag, so
the validator wouldn't have a clue what software the page was created by.
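For what it's worth, the "generator" tag mentioned above is just a `<meta>` element in the page's head, and reading it takes only a few lines. Here's a sketch using Python's standard-library `html.parser` (the page and the tool name "SomeEditor 1.0" are made-up examples):

```python
from html.parser import HTMLParser

class GeneratorFinder(HTMLParser):
    """Scans HTML and records the content of <meta name="generator">."""
    def __init__(self):
        super().__init__()
        self.generator = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)  # attrs arrive as (name, value) pairs
            if (d.get("name") or "").lower() == "generator":
                self.generator = d.get("content")

# Hypothetical page claiming to be produced by "SomeEditor 1.0"
page = '<html><head><meta name="generator" content="SomeEditor 1.0"></head><body></body></html>'
finder = GeneratorFinder()
finder.feed(page)
print(finder.generator)  # SomeEditor 1.0
```

Which illustrates the point: the tag is purely advisory, so anyone can delete or edit that one line and the trail back to the authoring software is gone.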
Maybe something similar to the "ACID Test" could be created for HTML
authoring applications, but that's really a separate project from a
validator. It has to be tested in very controlled conditions, to guarantee
that every aspect of markup generation is tested, and that code from
different sources isn't mixed.
It's a job for software reviewers (on magazines, IT websites, webmaster
forums, etc.), or for groups like webstandards.org, not for an automated validator.