- From: vmpstr via GitHub <sysbot+gh@w3.org>
- Date: Thu, 25 Feb 2021 16:37:37 +0000
- To: public-css-archive@w3.org
For posterity, and to express my "this will add cost" concern a bit better: I think a viable implementation here is a two-step process. First, upscale with nearest-neighbor (NN) to the nearest integer multiple of the source size. Then use something like bilinear interpolation (bilerp) to take it the rest of the way to the target scale.

The problem (at least in Chromium) is that we would need to store the intermediate result in memory before doing the bilerp sampling. Consider a 1,000,001% scale (a factor of 10000.01) applied to a 150x150 image. Previously we could sample directly out of the 150x150 source, and obviously only sample where needed, so that after all the clipping we would still end up with the correct result. Under the new algorithm, however, the target is a 1,500,001.5-pixel-wide image (if my math is right), which means we would first need an NN intermediate of roughly 1,500,000x1,500,000 pixels before the bilerp step. That's not something we can store, even temporarily :). I know this is a contrived example, but I think less contrived examples would suffer from similar problems.

I'm also hoping there's a way to sample directly out of the original 150x150 image with some clever math that effectively reproduces this algorithm, but I haven't really thought about it at length yet.
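To make the memory cost concrete, here is a minimal sketch of step one (hypothetical types and names, not actual Chromium code), materializing the intermediate the way a straightforward implementation would:

    #include <cstdint>
    #include <vector>

    // Hypothetical packed-RGBA image buffer; not a real Chromium type.
    struct Image {
      int width = 0;
      int height = 0;
      std::vector<uint32_t> pixels;  // width * height RGBA values
    };

    // Step 1: nearest-neighbor upscale by an integer factor k.
    // Every intermediate pixel (x, y) is just a copy of src(x / k, y / k).
    Image NearestNeighborUpscale(const Image& src, int k) {
      Image out;
      out.width = src.width * k;
      out.height = src.height * k;
      out.pixels.resize(static_cast<size_t>(out.width) * out.height);
      for (int y = 0; y < out.height; ++y) {
        for (int x = 0; x < out.width; ++x) {
          out.pixels[static_cast<size_t>(y) * out.width + x] =
              src.pixels[static_cast<size_t>(y / k) * src.width + x / k];
        }
      }
      return out;
    }

    // Step 2 would bilerp-sample this intermediate to the exact target
    // size; omitted here, since the problem is already visible in step 1:
    // for a 150x150 source at a 10000.01x scale, k rounds to 10000, so
    // the intermediate is 1,500,000 x 1,500,000 pixels -- roughly 9
    // terabytes of RGBA -- even if clipping leaves only a few pixels
    // visible in the destination.

Note that the integer division in the inner loop is also where the "clever math" above could start: each bilerp tap into the intermediate maps back to exactly one source pixel, which suggests the intermediate might never need to be materialized.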
--
GitHub Notification of comment by vmpstr
Please view or discuss this issue at https://github.com/w3c/csswg-drafts/issues/5837#issuecomment-786035974 using your GitHub account

Received on Thursday, 25 February 2021 16:37:39 UTC