Re: [filter-effects] resolution dependent filter primitives

+CC public-fx

On Aug 27, 2013, at 10:09 AM, Jasper van de Gronde <th.v.d.gronde@hccnet.nl> wrote:

> On 2013-08-27 06:28, Dirk Schulze wrote:
>> ...
>> We could create stable results by default. UAs could set 'kernelUnitLength' such that the kernel size is always relative to a CSS pixel in the current local coordinate system of the filtered object. This is something that is nearly impossible for an author to determine if they want to use the same filter on different objects. With an auto-calculated 'kernelUnitLength', the proportions of the filter chain result always look the same across platforms, but the result might be more pixelated on one platform and less pixelated on another.
>> 
>> It is unclear what is better for users:
>> * best results that are highly platform-dependent and indeed differ significantly, or
> What would qualify as "best"? It seems like you're suggesting that using 
> the device resolution is "best". However, applying a convolution at 
> different scales can have wildly different effects. I would thus not 
> consider it "best" to always use device pixels (in fact, I would hardly 
> ever consider it best). Note that applying a filter at a scale that does 
> not match the device resolution does not necessarily result in bad image 
> quality; this heavily depends on the implementation.

If your convolution kernel operates in device space and no scaling is involved, you get the most precise results, without any interpolation or bilinear/bicubic filtering. So by "best" I mean an exact mapping from kernel to device pixels.
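
To make the difference concrete, here is a rough sketch (Python/NumPy, purely illustrative and not tied to any UA implementation): the same 3x3 kernel applied directly at device resolution versus applied at half the resolution and scaled back up gives a visibly different amount of blur, plus resampling error.

# Illustrative sketch only: the same kernel at two sampling scales of one source.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
device = rng.random((200, 200))        # stand-in for the source at device resolution
kernel = np.full((3, 3), 1.0 / 9.0)    # simple 3x3 box blur

# (a) convolve directly in device space: no interpolation involved
direct = ndimage.convolve(device, kernel, mode="nearest")

# (b) convolve at half the resolution, then scale back up (bilinear, order=1)
half = ndimage.zoom(device, 0.5, order=1)
upscaled = ndimage.zoom(ndimage.convolve(half, kernel, mode="nearest"), 2.0, order=1)

# The effective blur radius in (b) is roughly twice as large, and the result
# additionally carries resampling error.
print(np.abs(direct - upscaled[:200, :200]).mean())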

>> * platform-independent results that might be pixelated.
> Why would they be pixelated? (At least more pixelated than necessary due 
> to the device resolution.) You don't actually have to apply the filter 
> to an image whose resolution matches kernelUnitLength. (At least in 
> principle.) In fact, although specifying a filter using kernel weights 
> on a regular grid might be best from a usability point of view (as it is 
> the most common way to do it in this context), it might not actually be 
> the best fit for SVG. Formally, SVG is not pixel- but vector-based; 
> filters would thus be more naturally expressed in some kind of 
> continuous way (using a Laplace transform for example, or by 
> using/specifying interpolation). The kernelUnitLength would then 
> essentially just specify the scale (not the resolution) at which to apply 
> the filter. It is then in principle the implementer's job to find a 
> decent discrete approximation (just as is done for a polygon).

The specification requires that the input source gets scaled; it is not the kernel that gets scaled. Therefore you no longer have the mapping from the kernel matrix to device pixels described above, but instead over- or undersample by making your input larger or smaller. For device-independent results you would operate on CSS pixel units, which means downsampling your image. That means you lose quality of your input source, which can only partly be compensated for by bilinear filtering; and even that is not a requirement for UAs at this point. Without interpolation you end up with pixelated results, but the proportions are ideally the same on all platforms.
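
As a rough model of that pipeline (just a sketch with made-up numbers and a hypothetical helper name, not spec text): the input is resampled to the kernel-unit grid, convolved there, and resampled back to device resolution. With nearest-neighbour resampling the result is pixelated; bilinear resampling only partly hides the loss.

import numpy as np
from scipy import ndimage

def convolve_in_kernel_units(src, kernel, device_px_per_kernel_unit, order):
    """src is at device resolution; the kernel is defined on the kernel-unit grid.
    order=0 is nearest-neighbour resampling, order=1 is bilinear."""
    down = ndimage.zoom(src, 1.0 / device_px_per_kernel_unit, order=order)
    filtered = ndimage.convolve(down, kernel, mode="nearest")
    return ndimage.zoom(filtered, device_px_per_kernel_unit, order=order)

rng = np.random.default_rng(0)
src = rng.random((256, 256))            # source already rasterised at device resolution
kernel = np.full((3, 3), 1.0 / 9.0)

# e.g. kernel units tracking CSS pixels on a display with a device pixel ratio of 2
pixelated = convolve_in_kernel_units(src, kernel, 2.0, order=0)  # no interpolation
smoothed  = convolve_in_kernel_units(src, kernel, 2.0, order=1)  # bilinear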

> Although the specification can of course recommend a course of action. 
> (I would be inclined to do some kind of least-squares fit to the 
> spectrum of the given filter for example, or simply use decent 
> interpolation to derive kernel values for the current scale.)
>> ... I would even suggest removing 'kernelUnitLength' as well and choosing one of the above two ways (high-DPI but resolution-dependent results, or pixelated but proportion-stable results). Or let the author choose one of the two options and have one as a fallback.
> I may have misunderstood, but some way of specifying the scale is 
> necessary, and kernelUnitLength would seem to be a fairly sensible 
> option (although it would be even better if one could specify one 
> additional coefficient, giving the ability to essentially specify an 
> arbitrary metric). Having said that, a sensible default is of course 
> always a good idea.

Well, scaling of the whole filter chain or of a single primitive has been specified for 13 years, and yet I can't find any usage on the web besides browser vendors' tests. Again, it is extremely hard to determine the right scale level for the horizontal and vertical axes, and chances are high that it does not work as expected on a different object. I assume that this is the reason why it was never actually used by authors. Implementations, on the other hand, already have this information. In the end both alternatives are reasonable to me. Giving the author a theoretically powerful but practically fragile tool like kernelUnitLength doesn't help IMO.
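
To sketch what I mean by implementations already having this information (a hypothetical helper; the exact units depend on filterUnits/primitiveUnits, so take it only as an illustration): a UA knows the object's current transformation matrix and the device pixel ratio, so it could derive a 'kernelUnitLength' that keeps one kernel unit aligned with one CSS pixel. An author writing a reusable filter generally cannot know either value.

import math

def auto_kernel_unit_length(ctm, device_pixel_ratio):
    """ctm is the 2x3 affine matrix [[a, c, e], [b, d, f]] mapping local user
    units to device pixels (names as in the SVG 'matrix(a b c d e f)' form)."""
    a, c, _e = ctm[0]
    b, d, _f = ctm[1]
    device_px_per_local_unit_x = math.hypot(a, b)
    device_px_per_local_unit_y = math.hypot(c, d)
    # local-unit length that one CSS pixel spans along each axis
    return (device_pixel_ratio / device_px_per_local_unit_x,
            device_pixel_ratio / device_px_per_local_unit_y)

# e.g. an object scaled by 3 on a display with a device pixel ratio of 2:
# one CSS pixel corresponds to 2/3 of a local user unit on each axis
print(auto_kernel_unit_length([[3.0, 0.0, 0.0], [0.0, 3.0, 0.0]], 2.0))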

Greetings,
Dirk


Received on Tuesday, 27 August 2013 08:43:21 UTC