
Re: Cloaking and honest validation

From: Lachlan Hunt <lachlan.hunt@iinet.net.au>
Date: Wed, 09 Feb 2005 11:03:33 +1100
Message-ID: <42095355.5020601@iinet.net.au>
To: goxholm@oazao.com
CC: www-validator@w3.org

Geoffrey Oxholm (Oazao, Inc.) wrote:
> In developing websites for our clients we frequently battle with browser
> differences.

Yes, that is every web developer's biggest problem.

> Although we are writing valid XHTML,

Why are you using XHTML?
http://www.hixie.ch/advocacy/xhtml
http://www.mozilla.org/docs/web-developer/faq.html#accept

Just use HTML 4.01, unless you are actually making use of some XML-only 
features (eg. embedded MathML).

> we use server-side browser detection to return slightly different
> pages to very old browsers.

Don't use browser sniffing.  How do you know which browsers do, and do 
not, support CSS?  The browsers in use are not limited to IE, Netscape, 
Mozilla, Opera and Safari.  Although those are the most popular and 
represent a significant portion of the browser market, it is impossible 
for you to accurately determine the level of CSS support (including 
whether or not CSS is enabled) from the User-Agent string, which is 
quite often spoofed to get around this kind of thing anyway.
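As a sketch of why substring-based sniffing misfires (the helper names 
here are my own invention, not from any library): Opera, for example, 
historically identified itself as MSIE by default, so a naive MSIE-first 
check classifies it as IE, and in no case does the string tell you 
anything about CSS support.

```python
def sniff_engine(user_agent):
    """Naive substring-based sniffing, as commonly (mis)used.

    Returns a browser family guess; says nothing about whether CSS
    is supported, enabled, or how completely it is implemented.
    """
    if "MSIE" in user_agent:
        return "IE"
    if "Opera" in user_agent:
        return "Opera"
    if "Gecko" in user_agent:
        return "Gecko"
    return "unknown"

# Opera's historical default User-Agent string claimed to be MSIE,
# so the MSIE-first check above misidentifies it:
opera_ua = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1) Opera 7.54"
print(sniff_engine(opera_ua))  # reports "IE", the wrong browser
```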

>  (Browsers that don't fully support CSS

Will degrade gracefully, assuming the HTML has been well written.

>  or self-closed br tags,

Do you mean <br />?  That syntax is required in XHTML, unless you are 
replacing it with <br></br>.  However, since XHTML is being treated as 
tag-soup in such user agents, you're better off just serving HTML 4.01 
instead.

> for example.) Although both versions of our pages are valid XHTML the
> link "http://validator.w3.org/check?uri=referer" is only capable of
> validating the default version.

That just illustrates why browser sniffing is a bad practice.  If you're 
going to do any sort of content negotiation, do so based on the HTTP 
Accept header.  Browsers that support XHTML properly will include 
application/xhtml+xml in it; and browsers that don't, won't.  However, 
that does not indicate any form of CSS support.
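A minimal sketch of that kind of negotiation (the function names are 
hypothetical, not from any particular server framework): serve 
application/xhtml+xml only when the client explicitly lists it in its 
Accept header with a non-zero q value, and fall back to text/html for 
everything else.

```python
def accepts_xhtml(accept_header):
    """Return True if the Accept header lists application/xhtml+xml
    with a quality (q) value greater than zero."""
    for part in accept_header.split(","):
        fields = part.strip().split(";")
        if fields[0].strip() != "application/xhtml+xml":
            continue
        q = 1.0  # q defaults to 1 when no parameter is given
        for param in fields[1:]:
            name, _, value = param.strip().partition("=")
            if name.strip() == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        return q > 0
    return False

def content_type_for(accept_header):
    # text/html is the safe fallback for tag-soup parsers, old
    # browsers, and clients sending no meaningful Accept header.
    if accepts_xhtml(accept_header):
        return "application/xhtml+xml"
    return "text/html"
```

Note this deliberately ignores wildcards like `*/*`, which most browsers 
send; treating a wildcard as XHTML acceptance would defeat the point of 
the check.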

> But it would be nice, and perhaps more accurate, to
> bounce the requester's actual user agent to make sure that the page
> being viewed is actually being validated.

How is the user-agent string supposed to help the validator receive the 
right version of the page?  If you are suggesting that the validator 
should send the user's user-agent string, instead of its own, that is 
not going to happen.

> This obviously wouldn't help with IP based cloaking, but the legitimate
> reasons for using that are minimal anyway.

The legitimate reasons for using any kind of cloaking are minimal.

-- 
Lachlan Hunt
http://lachy.id.au/
http://GetFirefox.com/    Rediscover the Web
http://SpreadFirefox.com/   Igniting the Web
Received on Wednesday, 9 February 2005 00:03:45 GMT
