Re: [filter-effects] resolution dependent filter primitives

On 2013-08-27 06:28, Dirk Schulze wrote:
> ...
> We could create stable results by default. UAs could set 'kernelUnitLength' such that the kernel size is always relative to a CSS pixel in the current local coordinate system of the filtered object. This is something that is nearly impossible to determine for the author if he wants to use the same filter on different objects. With the auto calculated 'kernelUnitLength' the proportion of the filter chain result looks always the same across platforms but might be more pixelated on one platform and less pixelated on another one.
> It is unclear what is better for the users:
> * best results that are highly platform dependent and indeed differ significantly, or
What would qualify as "best"? It seems like you're suggesting that using 
the device resolution is "best". However, applying a convolution at 
different scales can have wildly different effects. I would thus not 
consider it "best" to always use device pixels (in fact, I would hardly 
ever consider it best). Note that applying a filter at a scale that does 
not match the device resolution does not necessarily result in bad image 
quality; how well it works depends heavily on the implementation.
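To make the scale-dependence concrete, here is a small illustration of my 
own (the names and numbers are mine, not from the spec or Dirk's mail): the 
same fixed device-pixel kernel blurs a step edge over very different 
distances in user units, depending on how many device pixels cover one 
user unit.

```python
import numpy as np

# Hypothetical sketch: the *same* 3-tap device-pixel kernel produces
# visibly different results in user units at different resolutions.

KERNEL = np.array([0.25, 0.5, 0.25])  # 3-tap binomial blur, weights sum to 1

def render_step(samples_per_unit, length_units=10):
    """Sample a step edge (0 -> 1 at the midpoint) at a given resolution."""
    n = length_units * samples_per_unit
    signal = np.zeros(n)
    signal[n // 2:] = 1.0
    return signal

def transition_width(samples_per_unit):
    """Width, in user units, of the blurred edge's transition region."""
    blurred = np.convolve(render_step(samples_per_unit), KERNEL, mode="valid")
    return np.count_nonzero((blurred > 0.0) & (blurred < 1.0)) / samples_per_unit

for res in (1, 4):  # device pixels per user unit
    print(f"{res} px/unit: edge spreads over {transition_width(res)} user units")
# The blur covers 2.0 user units at 1 px/unit but only 0.5 at 4 px/unit:
# same kernel, wildly different visual effect.
```

This is exactly why always convolving in device pixels cannot be "best" in 
general: the perceived strength of the effect changes with the display.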
> * platform independent results that might be pixelated.
Why would they be pixelated? (At least, any more pixelated than the 
device resolution itself makes necessary.) You don't actually have to apply the filter 
to an image whose resolution matches kernelUnitLength. (At least in 
principle.) In fact, although specifying a filter using kernel weights 
on a regular grid might be best from a usability point of view (as it is 
the most common way to do it in this context), it might not actually be 
the best fit for SVG. Formally, SVG is vector based rather than pixel 
based, so filters would be more naturally expressed in some continuous 
form (using a Laplace transform, for example, or by specifying an 
interpolation scheme). The kernelUnitLength would then essentially 
specify the scale (not the resolution) at which to apply the filter. It 
would then, in principle, be the implementer's job to find a decent 
discrete approximation (just as is done when rasterizing a polygon), 
although the specification can of course recommend a course of action. 
(I would be inclined to do some kind of least-squares fit to the 
spectrum of the given filter, for example, or simply use decent 
interpolation to derive kernel values for the current scale.)
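As a rough sketch of the "decent interpolation" option (my own 
illustration; the function name and renormalization choice are 
assumptions, not anything specified): treat the author's kernel as samples 
of a continuous piecewise-linear function on a grid with spacing 
kernelUnitLength, resample it at the device-pixel spacing, and rescale so 
the overall gain is preserved.

```python
import numpy as np

def resample_kernel(kernel, kernel_unit_length, device_pixel_size):
    """Resample a 1-D kernel from author units to device pixels.

    Assumes a blur-like kernel whose weights don't sum to zero
    (otherwise the renormalization step below would divide by zero).
    """
    kernel = np.asarray(kernel, dtype=float)
    n = len(kernel)
    # Positions of the author's samples, centred on zero, in user units.
    src_x = (np.arange(n) - (n - 1) / 2.0) * kernel_unit_length
    # Device sample positions covering the same support.
    half = src_x[-1]
    m = int(np.floor(half / device_pixel_size))
    dst_x = np.arange(-m, m + 1) * device_pixel_size
    resampled = np.interp(dst_x, src_x, kernel)
    # Preserve the overall gain (e.g. keep a blur normalized).
    return resampled * (kernel.sum() / resampled.sum())

# A 3-tap kernel authored at 1 user unit per tap, rendered at
# 2 device pixels per user unit, becomes a 5-tap kernel:
coarse = resample_kernel([0.25, 0.5, 0.25], kernel_unit_length=1.0,
                         device_pixel_size=0.5)
print(coarse)
```

A real implementation would want a better reconstruction than linear 
interpolation (hence the least-squares remark above), but even this naive 
version keeps the effect's extent stable in user units across resolutions.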
> ... I would even suggest removing 'kernelUnitLength' as well and choose one of the above two ways (high DPI but resolution dependent results, or pixelated proportion stable results). Or let the author choose between one of the two options and have one as fallback.
I may have misunderstood, but some way of specifying the scale is 
necessary, and kernelUnitLength seems a fairly sensible option 
(although it would be even better if one could specify one additional 
coefficient, giving the ability to specify an essentially arbitrary 
metric). Having said that, a sensible default is of course always a 
good idea.
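To spell out the "arbitrary metric" remark (purely my own illustration; 
the names are made up): kernelUnitLength already gives two coefficients, 
an axis-aligned scaling in x and y. One extra cross coefficient would 
allow any symmetric 2x2 metric G, so the distance the filter works with 
becomes sqrt(p^T G p), and the kernel grid could be sheared or rotated, 
not merely scaled per axis.

```python
import numpy as np

def metric_distance(p, gxx, gyy, gxy=0.0):
    """Distance of point p from the origin under the metric [[gxx, gxy], [gxy, gyy]]."""
    G = np.array([[gxx, gxy], [gxy, gyy]])
    p = np.asarray(p, dtype=float)
    return float(np.sqrt(p @ G @ p))

# With gxy = 0 this reduces to per-axis kernelUnitLength-style scaling:
print(metric_distance([3.0, 4.0], gxx=1.0, gyy=1.0))  # plain Euclidean: 5.0
print(metric_distance([3.0, 4.0], gxx=4.0, gyy=1.0))  # x axis weighted more
# The extra cross coefficient is what per-axis scaling cannot express:
print(metric_distance([1.0, 1.0], gxx=1.0, gyy=1.0, gxy=0.9))  # sheared metric
```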

Received on Tuesday, 27 August 2013 08:09:40 UTC