- From: David Singer <singer@apple.com>
- Date: Fri, 22 Apr 2011 11:07:36 -0700
- To: Dean Jackson <dino@apple.com>
- Cc: public-fx@w3.org
On Apr 19, 2011, at 18:14 , Dean Jackson wrote:

> This is something that came up while I was editing the filters specification.
>
> There are some existing filters (eg feFlood, feTurbulence, feImage) and well-known effects (lenticular halos are a commonly overused example, also [1]) which generate images from scratch rather than manipulate inputs. These don't really fit into the model of the 'filter' property, which is a linear chain of effects, since any input would be completely wiped out by the new image. They work in the model of the 'filter' element, since that can be declared as a graph that composites multiple inputs.
>
> How important do we consider these effects as part of the property?
>
> One solution would be to declare some effects as compositing before or after their input. For example, a flood effect would first draw the solid colour then composite the input image, whereas a lenticular halo would composite the input image, then draw the halo. [NOTE: I'm not suggesting lenticular halo is a must-have effect!] Or maybe this could be a parameter to the effect? I can imagine some cases where you'd want to flood over the input image, then animate the flood colour from transparent to opaque.
>
> Then I was thinking that these generator effects are similar to other parts of CSS, in particular gradients. There have been some proposals in this area. For example:
>
> - WebKit's -webkit-canvas value, which allows an element to use a canvas element as an image in background, border, whatever.
> - Mozilla's -moz-element, which does the same but with the rendering of any element
>
> Examples of generators may be:
>
> - checkerboards
> - stripes
> - noise
> - star shines
>
> They don't sound too useful in isolation, but graphics artists may feel otherwise. Anyway, the important part of this message is the 3rd paragraph. I don't want to complicate the syntax or implementations. SVG filters provide the functionality, so it definitely isn't essential.
>
> Dean
>
> [1] http://www.theonion.com/articles/graphic-designers-judgment-clouded-by-desire-to-us,249/

I have long thought it strange, if not insane, that we generate textures at the authoring side, and then try to compress them for transmission (when, in fact, good textures are often 'noisy' and hard to compress) instead of sending the parameters to (for example) a reaction-diffusion texture generator.

David Singer
Multimedia and Software Standards, Apple Inc.
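
A minimal sketch of the compositing order Dean describes, using the existing SVG 'filter' element that already supports it as a graph; the ids, colours, and layering below are illustrative assumptions, not proposed 'filter' property syntax:

    <filter id="floodUnder">
      <!-- draw the solid colour first... -->
      <feFlood flood-color="gold" result="flood"/>
      <!-- ...then composite the filtered element's rendering over it -->
      <feComposite in="SourceGraphic" in2="flood" operator="over"/>
    </filter>

    <filter id="floodOver">
      <!-- the reverse order: a partially transparent flood drawn over the input -->
      <feFlood flood-color="gold" flood-opacity="0.5" result="flood"/>
      <feComposite in="flood" in2="SourceGraphic" operator="over"/>
    </filter>

The open question in the thread is whether the shorthand 'filter' property, being a linear chain, should bake one of these orders into each generator effect or expose it as a parameter.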
Received on Friday, 22 April 2011 18:08:04 UTC