- From: Tab Atkins Jr. <jackalmage@gmail.com>
- Date: Wed, 25 Jan 2012 08:26:57 -0800
On Wed, Jan 25, 2012 at 6:41 AM, David Geary <david.mark.geary at gmail.com> wrote:
> On Tue, Jan 24, 2012 at 5:22 PM, Chris Marrin <cmarrin at apple.com> wrote:
>> Adding filter functions to canvas would require you to re-render the items
>> for every filter change and you'd have to animate it all yourself.
>
> Sure, but you must create a superfluous canvas for each set of images that
> you animate, and invoke an entirely different technology to apply the
> filter. You must make sure that those superfluous canvases have
> transparent backgrounds, no borders, and have the correct Z order so they
> appear over, and not under, the primary canvas for the application. And I'm
> sure there are other gotchas to this hybrid approach that don't immediately
> come to mind.
>
> I'd much rather use the underlying filtering API and control the rendering
> and animation myself.

Yes, it's effectively creating an ad-hoc retained-mode API out of multiple <canvas> elements solely so it can apply filtering. (Using multiple backing canvases to sprite things is a reasonable performance hack, but I don't think it should be required for basic functionality.)

~TJ
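
For concreteness, a rough sketch of the workaround David is describing, with the filter applied from CSS to an overlay canvas rather than from a canvas-level API. The element id, filter value, and sprite URL below are made up for illustration, not taken from the thread:

  // One extra <canvas> per filtered sprite, stacked over the primary
  // canvas; the filter comes from CSS, not from the 2D drawing API.
  const primary = document.getElementById("scene") as HTMLCanvasElement;

  const overlay = document.createElement("canvas");
  overlay.width = primary.width;
  overlay.height = primary.height;

  // The gotchas mentioned above: transparent background, no border, and a
  // z-order that keeps the overlay above (not below) the primary canvas.
  overlay.style.position = "absolute";
  overlay.style.left = "0";
  overlay.style.top = "0";
  overlay.style.background = "transparent";
  overlay.style.border = "none";
  overlay.style.zIndex = "1";

  // The filter itself is an entirely different technology from the
  // canvas calls that produced the pixels being filtered.
  overlay.style.filter = "blur(4px)";

  primary.parentElement!.appendChild(overlay);

  // Sprites destined for filtering are re-rendered on the overlay's own
  // context; animating the filter means touching CSS, not the canvas API.
  const ctx = overlay.getContext("2d")!;
  const sprite = new Image();
  sprite.src = "sprite.png";
  sprite.onload = () => ctx.drawImage(sprite, 100, 100);
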
Received on Wednesday, 25 January 2012 08:26:57 UTC