- From: Mason Freed <notifications@github.com>
- Date: Thu, 30 Apr 2020 10:53:41 -0700
- To: w3ctag/design-reviews <design-reviews@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <w3ctag/design-reviews/issues/494/622007263@github.com>
> * getting pixels on the screen quickly, even if those pixels aren't interactive yet (or, alternatively, avoiding layout shifting after JavaScript is loaded), and
> * allowing content in Shadow DOM to be indexed by search engines (it would still be good to have a working example to understand why that is desirable).
>
> Is that right?

Yes, that is right. I think there are two other compelling use cases for this feature, in addition to those two:

* Allowing serialization and deserialization of DOM that includes Shadow DOM. Currently, there is no way to get innerHTML that includes shadow roots, because there is no declarative way to represent them (see the sketch below).
* CSS developers who are interested in using the style-scoping feature of Shadow DOM but do not want to use JavaScript (or whose design system prohibits it).

> I personally know companies who have disregarded Web Components due to not supporting SSR, as it would result in very poor SEO and kill their businesses.
>
> On the other hand, it seems that both G Search and Bing are using modern engines now that are being updated on a good cadence. I am still not sure if you can rely on things being indexed quickly with JS enabled.

I have also heard this objection/concern from developers several times. Even if they want to use Web Components, they can't, because **SSR is seen as a hard requirement**.

--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/w3ctag/design-reviews/issues/494#issuecomment-622007263
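For concreteness, here is a minimal sketch of the declarative syntax under review; the `shadowroot` attribute name follows the explainer at the time of this thread and should be treated as illustrative rather than final, and `<my-card>` is just a hypothetical custom element. Because the shadow tree is expressed as plain markup, the parser can attach it without any script, the styles inside it are scoped to the host, and the same form is a natural serialization target:

```html
<!-- Sketch only: the "shadowroot" attribute name is taken from the
     explainer under review and may change before standardization. -->
<my-card>
  <template shadowroot="open">
    <style>
      /* Scoped by the shadow root: this rule cannot leak out to the
         page, and page rules cannot leak in. No JavaScript required. */
      h2 { color: darkred; }
    </style>
    <h2>Card title</h2>
    <slot></slot> <!-- light-DOM children of <my-card> render here -->
  </template>
  Light-DOM content, projected into the slot above.
</my-card>
```

A serializer that understands this form can round-trip a page containing shadow roots, which is exactly what innerHTML cannot do today for shadow roots attached imperatively via attachShadow().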
Received on Thursday, 30 April 2020 17:53:54 UTC