[w3c/manifest] Describe how developers can block web crawlers (#498)

There may be cases where a web app developer does not want their web app to be crawled. I was thinking of proposing an additional manifest member to support this, but realized the web already has a way to block crawlers.

Below is some draft language; if this seems reasonable, please 👍 and I'll submit a PR.

> Like other web resources, a web app manifest should be accessible to any web browser or web crawler.
>  
> If a web app developer wants to block traffic from web crawlers, the developer MAY do so by disallowing the web app manifest's URL in a robots.txt file, as described by the [robots.txt](http://www.robotstxt.org/) protocol.
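
As a rough sketch of how this would work in practice (the manifest path and domain below are hypothetical, not from the spec), a developer would add a `Disallow` rule for the manifest's URL, and a well-behaved crawler would consult robots.txt before fetching it. Python's standard library can illustrate the crawler side:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking crawlers from the manifest
# (example path; a real site would use its actual manifest URL)
robots_txt = """\
User-agent: *
Disallow: /manifest.webmanifest
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler checks can_fetch() before requesting a URL
print(rp.can_fetch("*", "https://example.com/manifest.webmanifest"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))            # True
```

This keeps the blocking mechanism entirely within the existing robots.txt protocol, with no change to the manifest format itself.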

-- 
Reply to this email directly or view it on GitHub:
https://github.com/w3c/manifest/issues/498

Received on Friday, 9 September 2016 00:38:42 UTC