- From: chupacerveza <chupacerveza@gmail.com>
- Date: Thu, 7 Jun 2007 14:01:34 -0700 (PDT)
- To: www-validator@w3.org
What I'm getting from this is that you are using the checklink plug-in for
Firefox (on Windows?) and it is reporting broken links when you "check this
page". Is that right? If so, you might need to ask the author of the plug-in.
It's certainly possible that the plug-in breaks when robot exclusion rules are
encountered.

===
Ted Koppel wrote:
> Thanks for the reply. If I check it on my builder the links check good,
> if I do it on the site they show broken..?? Could it have anything to do
> with FireFox?

===
Robert T Wyatt-2 wrote:
>
> Hmmm... it doesn't report broken links when I check it. But it does
> report that you have a robots.txt file that excludes robots:
>
>     What to do: The link was not checked due to robots exclusion
>     rules. Check the link manually.
>     Response status code: (N/A)
>     Response message: Forbidden by robots.txt
>
> ... it even tells you what to do about it....
>
> Payless Sport Store wrote:
>> I'm confused. Checklink shows almost all of my links as broken Code 404,
>> however they all work just fine. Any reason I should be concerned? URL
>> www.paylesssportstore.com <http://www.paylesssportstore.com>
>> TKoppel Payless Sport Store
>>
>> Return Mail tkoppel@paylesssportstore.com
>> <mailto:tkoppel@paylesssportstore.com>

--
View this message in context: http://www.nabble.com/checklink%3A-tf3885192.html#a11016171
Sent from the w3.org - www-validator mailing list archive at Nabble.com.
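A note beyond the original thread: "Forbidden by robots.txt" means the link
checker is honoring the site's robots exclusion rules, not finding real 404s.
If the goal is to keep excluding general crawlers while letting the W3C link
checker through, a robots.txt along the following lines should work. This is
only a sketch: it assumes the checker still identifies itself with the
"W3C-checklink" user-agent, and the blanket "Disallow: /" for other agents
merely stands in for whatever exclusion the site already has.

    # Sketch of a robots.txt: allow the W3C link checker, exclude other robots
    User-agent: W3C-checklink
    Disallow:

    User-agent: *
    Disallow: /

An empty Disallow value places no restriction on that agent. To confirm what a
robots-aware tool would see, Python's standard robot parser can be pointed at
the live file (the URL here is simply the one mentioned in the thread):

    # Check whether the W3C link checker may fetch the front page
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://www.paylesssportstore.com/robots.txt")
    rp.read()
    print(rp.can_fetch("W3C-checklink", "http://www.paylesssportstore.com/"))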
Received on Friday, 8 June 2007 14:16:01 UTC