- From: Marcos Caceres <notifications@github.com>
- Date: Mon, 27 Apr 2015 12:42:13 -0700
- To: w3c/manifest <manifest@noreply.github.com>
Received on Monday, 27 April 2015 19:42:40 UTC
> Also, we list no requirements for crawlers like following robots.txt.

This seems outside the scope of this document. Robots.txt already defines general rules for spiders trying to access resources. However, I'll add a search engine and a browser as examples of user agents. If there are specific requirements for search engines beyond processing manifests in the manner specified in the spec, then I would like to hear them.

---
Reply to this email directly or view it on GitHub:
https://github.com/w3c/manifest/issues/343#issuecomment-96792856
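[Editor's illustration, not part of the original thread] A minimal sketch of the point above: a crawler can rely on the existing robots.txt rules to decide whether it may fetch a linked manifest, and then process the manifest the same way a browser would. The site URL, manifest location, and crawler name below are hypothetical.

```python
# Sketch only: hypothetical site, manifest location, and crawler name.
import json
import urllib.robotparser
import urllib.request

SITE = "https://example.com"                    # hypothetical site
MANIFEST_URL = SITE + "/manifest.webmanifest"   # hypothetical manifest location

# robots.txt already governs whether a spider may access the resource;
# the manifest spec adds no crawler-specific requirements on top of it.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()

if robots.can_fetch("ExampleBot", MANIFEST_URL):
    with urllib.request.urlopen(MANIFEST_URL) as response:
        manifest = json.load(response)
    # Process the manifest members (name, start_url, icons, ...) in the
    # manner specified in the spec, just as any other user agent would.
    print(manifest.get("name"))
```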