- From: griffin granberg <ggranberg@kuitoweb.com>
- Date: Sun, 13 May 2001 10:51:50 -0500
- To: <www-html@w3.org>
Is there a way to make it so a browser doesn't cache certain pages?

_____________________________
Griffin Granberg
Creator / Co-Founder
Media Developer
http://www.jestercrew.com

----- Original Message -----
From: "Masayasu Ishikawa" <mimasa@w3.org>
To: <jimajima9@yahoo.com>
Cc: <www-html@w3.org>
Sent: Sunday, May 13, 2001 10:37 PM
Subject: Re: Validate Strict AND no robots

> Jim Angstadt <jimajima9@yahoo.com> wrote:
>
> > Below are the results of checking this document
> > for XML well-formedness and validity.
> >
> > Line 7, column 29:
> >   <meta name="robots" contents="noindex, nofollow" />
> >                       ^
> >   Error: there is no attribute "contents" for this
> >   element (in this HTML version) (explanation...)
> >
> > Line 7, column 50:
> >   <meta name="robots" contents="noindex, nofollow" />
> >                                                    ^
> >   Error: required attribute "content" not specified
> >   (explanation...)
> >
> > Is it possible to validate to 'strict' AND exclude
> > robots?
>
> Change "contents" to "content".
>
> Regards,
> --
> Masayasu Ishikawa / mimasa@w3.org
> W3C - World Wide Web Consortium
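
For reference, the fix Ishikawa describes amounts to this corrected line,
with the attribute named "content" rather than "contents", which is what
the Strict DTD defines for meta:

    <meta name="robots" content="noindex, nofollow" />

On the caching question: markup alone doesn't control caching reliably;
the usual mechanism is HTTP response headers set by the server. A minimal
sketch, assuming the server can emit per-page headers (the header names
are standard HTTP/1.0 and HTTP/1.1; the values shown are one common
combination, not the only valid one):

    Cache-Control: no-cache, no-store, must-revalidate
    Pragma: no-cache
    Expires: 0

"Expires: 0" is an intentionally invalid date that caches treat as already
expired. A meta-based approximation also appears in practice:

    <meta http-equiv="pragma" content="no-cache" />

but browser support for http-equiv caching hints is inconsistent, so the
HTTP headers are the safer route.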
Received on Sunday, 13 May 2001 23:50:08 UTC