W3C home > Mailing lists > Public > www-style@w3.org > April 2007

Re: CSS Opacity

From: David Hyatt <hyatt@apple.com>
Date: Sun, 29 Apr 2007 03:13:15 -0700
Message-Id: <DCFF1C65-1D55-4731-8EF1-61787F90EAD3@apple.com>
Cc: www-style@w3.org
To: Daniel Beardsmore <public@telcontar.net>

On Apr 28, 2007, at 9:08 PM, Daniel Beardsmore wrote:

>
> David Hyatt wrote:
>> Sure, but using opacity is much worse, since you'll force the Web
>> engine to make an offscreen buffer to render the image into before
>> then blending it with the destination.  One transparent image buffer
>> from a PNG is going to be way more efficient than having a
>> non-transparent image buffer + an entire offscreen buffer just to do
>> the blend.
>
> I am not sure there is any reason why this has to be the case. When
> plotting pixels that are governed by opacity, can you not simply
> multiply them by the opacity as you go?
>

No, you can't.  Opacity is group opacity, which means that the entire  
subtree has to be composited and blended as a unit.  In the fully  
general case you can't "blend as you go" because of overlap...  
multiple objects within the subtree could draw to the same pixel.
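The overlap problem can be made concrete with a small sketch (illustrative only, not WebKit code; a single grayscale pixel with values in [0, 1] and standard source-over blending):

```python
def over(src, dst, alpha):
    """Source-over blend: draw src onto dst at the given alpha."""
    return src * alpha + dst * (1.0 - alpha)

background = 1.0      # white destination pixel
objects = [0.0, 0.0]  # two opaque black objects, both covering this pixel
opacity = 0.5         # CSS opacity on their common ancestor

# Group opacity: composite the subtree into an offscreen buffer first,
# then blend the finished buffer once with the destination.
buffer_px = objects[0]
for obj in objects[1:]:
    buffer_px = over(obj, buffer_px, 1.0)  # objects are opaque inside the group
group_result = over(buffer_px, background, opacity)

# "Blend as you go": apply the opacity to each object as it is drawn.
naive = background
for obj in objects:
    naive = over(obj, naive, opacity)

print(group_result)  # 0.5  -- the group reads as 50% gray, as specified
print(naive)         # 0.25 -- the overlap pixel got darkened twice
```

Wherever two objects in the subtree touch the same pixel, the per-object version applies the opacity twice, which is why the whole subtree must be buffered and blended as one unit.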

> I also don't know how JPEG+opacity is any heavier a load than a PNG
> with alpha. Do PNG images sit in memory as raw PNG to be handled by
> the OS? Is that something every "OS" (read: OS/window manager/widget
> set) features, or is direct PNG support a feature of Core Image that
> Apple relies on in Mac OS X only?
>

As I said above, opacity basically forces you to create a buffer to  
hold the contents of the subtree, since you have to delay the blend  
until you've painted all of the relevant contents of the subtree.

> You are hinting that Core Image has a fancy direct PNG blitter that
> doesn't require unpacking the PNG image first as a linear image
> before rendering it? Otherwise, unpacking the PNG into a linear
> image would be no different to unpacking a JPEG into a linear image
> and then slapping a fixed alpha channel on it.
>

WebKit does not use Core Image for image rendering.  Again, the  
definition of opacity is that it applies to an element and its  
descendants, so everything has to be rendered and then blended.  A  
Web engine could in theory detect the case where an opacity layer  
contains only an image (and nothing else) and optimize away the  
buffer, though (you could imagine a similar optimization when the  
opacity layer contains only non-overlapping text).
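That image-only fast path can be sketched the same way (again illustrative only, not WebKit code, and assuming straight, non-premultiplied alpha): blending a lone image through a group-opacity buffer reduces to scaling each pixel's own alpha by the opacity, so the offscreen buffer can be skipped:

```python
def over(src, dst, alpha):
    """Source-over blend: draw src onto dst at the given alpha."""
    return src * alpha + dst * (1.0 - alpha)

dst = 0.25                 # destination gray
src, src_alpha = 0.8, 0.6  # one image pixel with its own alpha channel
opacity = 0.5              # CSS opacity on the image element

# General path: the offscreen buffer would hold (src, src_alpha);
# blending that buffer over dst at the group opacity gives an
# effective alpha of src_alpha * opacity.
general = src * (src_alpha * opacity) + dst * (1.0 - src_alpha * opacity)

# Fast path: draw the image directly, folding the opacity into each
# pixel's alpha -- no offscreen buffer needed.
fast = over(src, dst, src_alpha * opacity)

assert fast == general  # identical, since only one object drew the pixel
```

The equivalence holds only because a single object touches each pixel; as soon as a second object in the subtree can overlap it, the general buffered path is required.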

dave
Received on Sunday, 29 April 2007 10:13:22 GMT
