Re: [w3c/ServiceWorker] What should happen in case of large number of added routes? (Issue #1746)

@sisidovski asked me for my perspective on how this works in the web platform in general. The main guidance is given in https://infra.spec.whatwg.org/#algorithm-limits, which advises that by default there should be no limits.

But since we have a good reason to have limits here, the ideal is to make them interoperable. So the discussion of leaving flexibility to implementations for the upper limit is a bit scary from that perspective, as it could cause interop issues. If Chrome supports 256 and Safari/Firefox support 512, then it is pretty easy to make sites that work only in Safari/Firefox and not Chrome.

So I think the ideal would be if all implementations were willing to pick a specific limit, and support everything in the range [0, thatLimit], for both the number of rules and the nesting depth of rules.

If implementers have strongly different ideas on what the upper limits should be, and don't want to converge, then the second-best option would be a specified lower limit, with the upper limit left implementation-defined. This doesn't solve the interop concern in general, as interop issues could still arise for rule counts in the range [specifiedLowerLimit, maxUpperLimitFromAllBrowserEngines]. But it at least gives web developers who read the docs a target: don't exceed specifiedLowerLimit, and your code will work in all browsers.
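To illustrate what "a target for web developers" could look like in practice, here is a sketch of a guard a service worker might run before calling `event.addRoutes()`. The limit values are hypothetical placeholders (no limits have been specified yet), and the rule shape (`condition` objects with `or`/`not` composites) follows the current Static Routing API proposal; treat all of it as an assumption, not spec'd behavior.

```javascript
// Hypothetical limits standing in for a future specifiedLowerLimit;
// the actual values are exactly what this thread is discussing.
const MAX_RULES = 256;
const MAX_CONDITION_DEPTH = 10;

// Computes the nesting depth of a router condition, treating the
// `or` and `not` composite keys as each adding one level.
function conditionDepth(condition) {
  if (condition.or) {
    return 1 + Math.max(0, ...condition.or.map(conditionDepth));
  }
  if (condition.not) {
    return 1 + conditionDepth(condition.not);
  }
  return 1; // a leaf condition such as { urlPattern: ... }
}

// Validates a rule list against the assumed limits, so registration
// fails fast with a clear error instead of hitting an opaque,
// browser-specific cap at addRoutes() time.
function checkRouterRules(rules) {
  if (rules.length > MAX_RULES) {
    throw new RangeError(`Too many router rules: ${rules.length} > ${MAX_RULES}`);
  }
  for (const rule of rules) {
    const depth = conditionDepth(rule.condition);
    if (depth > MAX_CONDITION_DEPTH) {
      throw new RangeError(`Condition nested too deeply: ${depth} > ${MAX_CONDITION_DEPTH}`);
    }
  }
  return rules;
}
```

In a service worker this would be used as `event.addRoutes(checkRouterRules(rules))` inside the `install` handler. The point of the sketch is the interop argument above: with a specified lower limit, a check like this can be written once against a number from the docs rather than probed per-engine.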

-- 
Reply to this email directly or view it on GitHub:
https://github.com/w3c/ServiceWorker/issues/1746#issuecomment-2608759152

Received on Thursday, 23 January 2025 03:16:52 UTC