
Re: Table Validation

From: Terje Bless <link@tss.no>
Date: Tue, 27 Feb 2001 11:05:49 +0100
To: Nick Kew <nick@webthing.com>
cc: www-validator@w3.org
Message-ID: <20010227110752-r01010600-aaf6b620@10.0.0.2>
[ Removed Marty from the CC as it's no longer directly relevant ]

On 27.02.01 at 08:43, Nick Kew <nick@webthing.com> wrote:

>That can't be right???   That URL is nothing but a meta refresh, whose
>markup betrays a total lack of clues.  Could this be some private
>person's site on a virtual host lacking a half-competent admin?

No, it's moved. You'll find it under Tools on Imagiware.


>Arguably the failure of W3C's HTML 4 DTDs to do so is a defect in the
>standard:

Yes, and it's up to the W3C to fix it; it's not our place to sidestep it by
using other DTDs.


>  (1) There are no conceivable problems with upgrading the DTD

Except you are no longer validating against W3C standards; you are now
validating against Nick Kew's standards. What's your feeling on "name"
attributes on "IMG" elements? How about "border"? Slippery slope! Who died
and put you in charge? Etc.


>Bearing this in mind, you might be interested to look at the WDG DTD
><URL:http://www.htmlhelp.com/design/dtd/>.

Thanks. Wasn't aware of that one. The WDG needs some better PR! :-)


>> It's not. It merely checks against some random programmer's idea of "good
>> HTML" (possibly taking into account known deficiencies in browser
>> implementations) and not the actual valid syntax of the language.
>
>I'm at a disadvantage, not having seen what you have apparently seen
>at doctor-html.

I know Dr.HTML from way back when. Useful, but not based on an SGML parser
last I looked. You've probably seen it too if you've been around a while
(Tina and Liam should know it, if that helps?).


>While that comment is true of a number of services - such as Weblint,
>Tidy, the Demoroniser, and a range of tools of more questionable repute
>than these, it doesn't have to be.

No. As I said, it's based on previous experience with Dr.HTML.


>>You'll want to start by making sure something is Valid -- by checking with
>>the W3C Validator -- then running it through something like Dr.HTML
>>(including looking at it in various browsers etc.), and finally going
>>back to the W3C Validator to make sure you haven't introduced any syntax
>>errors in the interval.
>
>But the appropriate DTD will do both at once!

Yes, but the standard DTDs from the W3C will _not_. What a custom DTD does
is entirely up to the author of said DTD.
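
For what it's worth, the mechanism is plain SGML: whether an end tag may be
omitted is declared per element through the tag-minimization flags in the
DTD. A sketch, using P as the example -- the first declaration is as it
appears in HTML 4.01, the stricter variant is hypothetical:

```dtd
<!-- HTML 4.01: "- O" means the start tag is required but the
     end tag is Omissible, so a validator must accept <P> alone. -->
<!ELEMENT P - O (%inline;)* -- paragraph -->

<!-- A custom DTD can require the end tag by changing "O" to "-",
     so omitting </P> becomes a validity error. -->
<!ELEMENT P - - (%inline;)* -- paragraph -->
```

So yes, a custom DTD can enforce closing tags -- but only because its
author decided it should, not because the W3C's grammar says so.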


>>>I'd like to see you add the checking of matching end tags and the
>>>proper nesting of tags as part of the validator.
>
>>add such a feature would impose _our_ arbitrary standards of "good" HTML
>>without a solid, objective, measure to compare it to.
>
>That's the crux of it.  But for certain limited tasks including
>enforcement of closing tags under discussion, we _can_ superimpose our
>own standards without risk of breaking the W3C published specs.

Again: Slippery Slope! You think this is unproblematic. Todd Fahrner thinks
adding "name" to "IMG" is unproblematic. I want Hn in UL/OL/DL and a
"level" attribute on a "H" element. Where does that leave us? Who gets to
decide? I say the W3C, for better or worse, because the alternative is
Netrape and Internet Exploiter (as opposed to "Netscape" and "Internet
Explorer").
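
And each of those pet extensions is a one-line change to a custom DTD,
which is exactly why the slope is slippery. A hypothetical example, not
from any published DTD (the attribute list is abridged; "%attrs;" stands
in for the standard shared attributes):

```dtd
<!-- Hypothetical: a custom DTD could simply add "name" to IMG's
     attribute list. Trivial to do; no W3C standard blesses it. -->
<!ATTLIST IMG
  %attrs;                 -- the standard HTML attributes --
  src   %URI;  #REQUIRED  -- location of the image --
  alt   %Text; #REQUIRED  -- short description --
  name  CDATA  #IMPLIED   -- non-standard addition --
  >
```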


>> >It also checks for missing images.
>> 
>> This, OTOH, could be added in some future version (it's on the TODO).
>
>More useful to check all links

Yeah, that's the idea. I was just keeping it short. :-)
Received on Tuesday, 27 February 2001 05:08:01 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Wednesday, 25 April 2012 12:13:55 GMT