Re: [w3c/manifest] Describe how developers can block web crawlers (#498)

So the consensus is not to add the robots.txt fix to the spec, but instead to an explainer?

BTW I sympathize with @patrickkettner's POV, but I imagine most web authors will work around this by just not hosting their PWAs on Geocities. :wink: Also, it may not be so bad that admins of JSBin/Codepen/etc. can unilaterally disallow PWAs on their sites by robots.txt-blocking all manifest files.
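
For illustration, a minimal sketch of what such a blanket block could look like, assuming manifests on the host use conventional names like `manifest.json` or `*.webmanifest`, and assuming the crawler honors the `*`/`$` wildcard extensions (Googlebot does; they are not part of the original robots.txt standard):

```
# Hypothetical example: block crawlers from fetching any manifest file,
# so pages on this host are not treated as installable PWAs by crawlers.
# File names and wildcard support are assumptions, not taken from the spec.
User-agent: *
Disallow: /*.webmanifest$
Disallow: /*manifest.json$
```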

-- 
Reply to this email directly or view it on GitHub:
https://github.com/w3c/manifest/issues/498#issuecomment-246473012

Received on Monday, 12 September 2016 20:02:42 UTC