Re: [css3-background] Where we are with Blur value discussion

On Jul 23, 2010, at 9:31 AM, Tab Atkins Jr. wrote:
>> But instead of extending the shadow out by about 200px in each direction, it actually extended it out 151px on each of the horizontal sides, and 161px on each of the vertical sides. That's pretty weird. When I measured the values along the bottom of the blur, the bottom 10 rows of blur pixels (along the straight segment) were all the same (254, 254, 254), and the bottom 21 rows were 99% brightness. Maybe there were 40 or 50 more pixels outside of all that which is less than 1/256th opacity?
> 
> Right, everyone's current blur implementations do something strange
> with the numbers.  With my suggested heuristic (approximate a gaussian
> with stdev of half the blur length),  you hit the 2% point (that is,
> the point where the blur is less than 2% opaque) at nearly exactly the
> specified length (seriously, +/- 1px).  

OK, that seems pretty good, if that point is at a distance equal to what the author specified, which I think is what you are confirming. So pixels more transparent than that can still fall outside that region, but nobody will really notice them because they are so nearly transparent, right? That seems fine too (especially since they are also transparent to clicks).
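Incidentally, the claim checks out analytically: an ideal Gaussian blur of a hard edge falls off as the complementary Gaussian CDF, so with stdev = length/2 the opacity at exactly the specified length works out to about 2.3%. A quick sketch (pure math, not any UA's actual approximation):

```python
import math

def edge_opacity(distance, blur_length):
    """Opacity of an ideally Gaussian-blurred hard edge, measured
    `distance` outside the original edge, using the suggested
    heuristic stdev = blur_length / 2.  Blurring a step edge gives
    the complementary normal CDF: 0.5 * erfc(d / (sigma * sqrt(2)))."""
    sigma = blur_length / 2.0
    return 0.5 * math.erfc(distance / (sigma * math.sqrt(2.0)))

# At exactly the specified blur length the shadow is ~2.3% opaque,
# consistent with the "2% point lands at the specified length" claim.
print(round(edge_opacity(200.0, 200.0), 4))  # → 0.0228
```

(That 2.3% is distance-independent of the actual length, which would explain why the result held out to 200px.)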

> This has been verified out to a
> 200px length.  2% seems to be a good point for declaring the shadow
> invisible - if you're blurring into white, that's roughly 250,250,250.

Just a point of clarification, then: with your heuristic, is that 2% opacity, or 2% of the shadow's full opacity (if, frex, the shadow has an rgba() color)? The former seems like a better choice than the latter, but I can't follow the math to know if that's the case.

> Using a length larger than 200px makes the numbers start changing
> slightly, but I don't know if that's floating-point errors
> accumulating on me or my heuristic breaking down (I need to grab
> Haskell and throw some computable reals at the problem).  In any case,
> by the time you're throwing around 300px blurs, you're certainly not
> caring about the exact extent.

True enough, I guess, unless it is something like 50px off from 200px, or the full shadow height is, say, 10% different from the width.

It seems like some sort of simple interpolation upscaling could be done at larger blur values, as a performance sop (as long as care is taken to make sure your 2% rule still holds, for consistency). I'm guessing that Gecko is doing that, as their 200px blur was none too smooth (it had visible striations). That is a big reason why I didn't want the shadow to be defined JUST as Gaussian blur amounts (that, and the desire to be able to test/confirm visually that some resulting distance can be measured to _match_ the value the author provides).
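For what it's worth, the kind of shortcut I mean can be sketched in a few lines (purely illustrative; no claim that Gecko does exactly this): blur a downsampled copy of the signal with a proportionally smaller sigma, then interpolate back up. Nearest-neighbour upscaling here is what would produce visible banding; linear interpolation smooths most of it away.

```python
import math

def gaussian_blur_1d(samples, sigma):
    """Direct 1D Gaussian convolution with edge clamping."""
    radius = max(1, int(3 * sigma))
    kernel = [math.exp(-(i * i) / (2.0 * sigma * sigma))
              for i in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]
    n = len(samples)
    out = []
    for i in range(n):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = min(max(i + j - radius, 0), n - 1)  # clamp at edges
            acc += samples[idx] * k
        out.append(acc)
    return out

def blur_downsampled(samples, sigma, factor=4):
    """Approximate a large blur: blur every `factor`-th sample with
    sigma/factor, then linearly interpolate back to full resolution.
    A hypothetical performance shortcut, not any UA's real code path."""
    small = samples[::factor]
    blurred = gaussian_blur_1d(small, sigma / factor)
    out = []
    for i in range(len(samples)):
        pos = i / factor
        lo = min(int(pos), len(blurred) - 1)
        hi = min(lo + 1, len(blurred) - 1)
        t = pos - lo
        out.append(blurred[lo] * (1.0 - t) + blurred[hi] * t)
    return out
```

The catch, as noted above, is that any such approximation still has to keep the 2% point landing at the specified distance, or the extent becomes untestable.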

Judging from some of the Webkit bugzilla reports and comments, performance is a major issue at some larger blur values (and different UAs seem to be drawing the line in the sand differently in terms of at which point they do something about that and set some limits on their normal Gaussian blurs).

Received on Friday, 23 July 2010 17:11:56 UTC