> * allowing content in Shadow DOM to be indexed by search engines (it would still be good to have a working example to understand why that is desirable).
This has been quite desirable, especially for e-commerce. That was mostly because the Google and Bing crawlers used very old browser engines (I think Google Search was on M42 until a year or so ago), and they were also quite slow at indexing with JavaScript enabled: a page might get indexed immediately without JS, but it could then take a week or so before it was re-indexed with JS executed, which has been a no-go for a lot of companies.
I personally know companies that disregarded Web Components because the lack of SSR support would result in very poor SEO and kill their businesses.
On the other hand, it seems that both Google Search and Bing now use modern engines that are updated on a good cadence. I am still not sure whether you can rely on pages being indexed quickly with JavaScript enabled.
Maybe Martin knows better (@AVGP)
https://github.com/w3ctag/design-reviews/issues/494#issuecomment-621663625