Re: [css-transforms] CSS3D breaks with opacity flattening

Hey all,

Sorry if my previous replies were rash, but I've collected myself and
prepared the following response which can hopefully paint a picture of the
problem as well as propose a solution.

Let's step outside of the web box for a second, and just imagine: we have a
3D engine, and with that engine we have game characters or objects that are
made of an arbitrary number of descendant nodes (subtrees) in an overall
scene graph. Imagine these objects are in a game where things fade in and
out of view and therefore involve transparency.

I believe we all want the API for doing this to be as easy as possible.
If that is the case, then it makes sense that applying an opacity of less
than one to the root node of an object in a scene graph should make the
whole object transparent without flattening it.

Take, for example, this basic car I just made:

https://jsfiddle.net/trusktr/ymonmo70/1

=== Challenge ===

Here's a challenge for you all: make the whole car transparent without
modifying the markup (it is important not to modify the markup, because
that is what CSS is about). But if you want to modify the markup, I am open
to seeing that solution as well (I already know the non-nested solution
will work, but it will be tedious to migrate to). Keep in mind, this is
very, very easy with "legacy" behavior; we can make the whole car
transparent by simply adding the following CSS into the CSS text box:

```css
.car {
  opacity: 0.5;
}
```


which results in the following (works in Safari 9 and Firefox 47, broken in
Chrome 53):

https://jsfiddle.net/trusktr/ymonmo70/2

I don't think any of you will have a solution as simple as applying an
opacity to the root node of the car. The simplest solution I can imagine is
applying opacity to all the leaf nodes, and that will involve much more
work than adding a single opacity property to a single selector or
element.
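
Just to illustrate how tedious that gets, here is a sketch of the
leaf-node workaround (the face class names here are hypothetical, not
necessarily the ones in the fiddle):

```css
/* A sketch of the leaf-node workaround. Every leaf face must be listed
   explicitly, because under the new behavior opacity on any ancestor
   flattens that ancestor's subtree. These class names are hypothetical. */
.car .front,
.car .back,
.car .left,
.car .right,
.car .top,
.car .bottom {
  opacity: 0.5;
}
```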

With the new behavior, it isn't so simple any more: *the web's 3D API is
now more difficult to use*, which isn't what we want as API developers.

We should aim to make 3D programming easier, not more difficult! Opacity
flattening makes 3D programming more difficult because the 3D library
programmer now has to track opacities virtually for every node in the
scene (DOM nodes in our case), traverse the scene graph (DOM tree)
manually, multiply the opacities down the tree, and finally apply the
computed opacities to the leaf-most items (elements) in the scene graph.
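
Here is a rough sketch of what that virtual bookkeeping looks like (the
node shape and names are hypothetical, not from any real library):

```js
// Hypothetical scene-graph node shape: { element, opacity, children }.
// Walk the tree, multiplying opacities down the hierarchy, and write the
// computed value only onto leaf elements, so that no ancestor's `opacity`
// property triggers flattening.
function applyOpacity(node, inherited = 1) {
  const computed = inherited * node.opacity;
  if (node.children.length === 0) {
    node.element.style.opacity = computed; // leaf: safe to apply here
  } else {
    for (const child of node.children) applyOpacity(child, computed);
  }
}
```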

Note, even with "legacy" behavior or the new flattening behavior, there is
currently no way to make rendered non-leaf nodes transparent. In the
following example, try to make *only* the red box transparent but not
anything else (i.e. not its children):

https://jsfiddle.net/trusktr/ymonmo70/4

With the new behavior, we are limited to making only leaf-nodes of an
existing scene transparent, and keep in mind we'd like to apply CSS *without
modifying markup*.

One of the great things about preserve-3d and nested elements is that
transformation matrices can be cached and don't need to be recalculated
all the time on the JavaScript side, because the HTML engine has them
cached in the DOM hierarchy. This prevents a TON of string manipulation in
JavaScript from converting numbers into strings to pass to the CSS engine
via style attributes (Typed CSSOM will fix that, but we're not sure when
that's making it into the wild).

In legacy behavior, the same applies to opacity: the HTML/CSS engine caches
values and does multiplication automatically, which makes the end API
easier to use for 3D programming, and *more performant.*

In order to solve the problem that the new opacity behavior creates, the
best solution is (unfortunately) to abandon preserve-3d and nested DOM
(*losing the performance benefits*), keep track of the transformation and
opacity hierarchies manually, multiply down the tree manually, and finally
apply *all the values via number-to-string conversion in the style
attributes, which may involve string parsing, splitting, joining, etc.,
before they are actually passed to the style attribute (yikes!)*.
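
For example, here is roughly the per-node, per-frame work that entails
(hypothetical names again; `multiplyMatrices` stands in for whatever 4x4
math a library would do itself):

```js
// Hypothetical per-node, per-frame work in a non-nested scene: compose
// the world matrix manually, then serialize all 16 numbers into a string
// just to hand them back to the CSS engine.
function updateNode(node, parentWorldMatrix) {
  // multiplyMatrices: an assumed 4x4 (column-major) matrix multiply.
  node.worldMatrix = multiplyMatrices(parentWorldMatrix, node.localMatrix);
  node.element.style.transform = `matrix3d(${node.worldMatrix.join(',')})`;
}
```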

Reverting to the non-nested approach means that we are forfeiting the
performance advantages of the nested DOM approach just to achieve what we
want.

Note, using the non-nested approach, virtual hierarchies, and manual
handling of the math allows us to easily make a parent in the hierarchy
transparent but not its children (which is impossible in the nested
approach using the `opacity` property).

=== Impact ===

One argument for the go-ahead to implement this change is that "not many
people are impacted".

I'd like to argue that 3D in the web (as far as the DOM goes) is already
fairly difficult compared to code-based APIs (i.e. imperative APIs rather
than declarative ones, and not just in the web), and relatively few people
actually use CSS3D. The vast majority of websites still use only the
decades-old 2-dimensional technology of the web, and the 3D aspects of
HTML/CSS are relatively new and difficult to mix with the 2-dimensional
features. It may be easy to think that not many people are affected by
this, but consider the ratio of people who use opacity in 3D scenes to
people who program CSS3D scenes at all. By that measure, I am willing to
bet that the number will be a lot higher than the 0.006% mentioned in the
blink-dev thread
<https://groups.google.com/a/chromium.org/d/msg/blink-dev/eBIp90_il1o/9q3M0ww2BgAJ>.

Famous was making an open-source library that depended on the legacy
behavior, and this change breaks that library. They've gone proprietary,
and their proprietary stuff probably still depends on the legacy behavior.
They will now face the chore of rewriting parts of their library to
convert from nested DOM to non-nested DOM if they wish for opacity to work
the same as before.

I'd like to show you how these changes impact my own library at
http://infamous.io (which is being renamed and moved to a new domain soon).

My library gives us both a JavaScript API (imperative) and a
Custom-Element-based HTML API (declarative) for defining 3D scenes. When
using the JavaScript API, it generates the same elements as when using the
HTML API. For our purposes, I will show only the HTML API.

Here is the same car example as above (or, at least similar), made with my
custom elements:

https://jsfiddle.net/trusktr/ymonmo70/5

What is great about this example is that those custom elements are *the
elements themselves* in the final rendering (look in the element inspector
and notice that they have `transform: matrix3d()` applied to them). This
means we can place anything into those elements. For example, let's place
some text onto the surfaces of the car just for fun:

https://jsfiddle.net/trusktr/ymonmo70/6

Now, if we apply opacity, the whole car will be flattened in Chrome 53
(compare it to Safari 9) because we've applied opacity onto the `.car`
element:

https://jsfiddle.net/trusktr/ymonmo70/7

I believe I have something really nice in the works, and plan to add WebGL
soon, but now it is fundamentally broken: in order to fix the rendering
while keeping my same nested markup, I will have to render new elements
into a flat structure that sits next to my <motor-scene> elements. This
means that for each motor-node element I will create a corresponding
element in the actual rendering context, while my custom elements will not
actually be rendered (they will be display:none). In other words, although
I'd like for my API to be nested and use preserve-3d, I will have to
render *completely separate elements* next to my custom elements in order
to solve the opacity problem that is now on my hands.

But, there are some problems with this! Remember that we were able to place
content like text inside the custom elements, and it worked fine because
the custom elements are part of the actual rendered scene?

Well, now that won't be the case any more. Now I have the following
problems to deal with:

   1. I have to convert my code so it renders non-nested elements next to
   my custom elements (or maybe inside a ShadowDOM root).
   2. I will have to implement transformation caching just for DOM
   rendering, and can no longer take advantage of the HTML engine's transform
   caching.
   3. Same for opacity caching down the scene graph, I will need to
   implement it rather than let the HTML engine handle it.
   4. Lastly, but most problematically, I have to find a way to clone or
   copy the user's content from inside my custom elements into the new
   non-nested elements (see the sketch after this list). This will have
   detrimental effects on the user's ability to target and select those
   elements for styling, as they will no longer appear in their original
   places. Users will need to use IDs all over the place when they probably
   shouldn't, and will likely mess up a bunch of times before discovering
   the ID workaround. This may also have an impact on SEO, but I haven't
   looked into that much yet.
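
To make point four concrete, the transplant would look roughly like this
(just a sketch; `renderRoot` is a hypothetical flat container that sits
next to my custom elements):

```js
// Sketch of the point-4 transplant: copy the user's content out of a
// custom element into a flat render tree. The clone is what actually
// renders; the original is hidden, so the user's selectors no longer
// match what appears on screen.
function transplant(customEl, renderRoot) {
  const renderEl = document.createElement('div');
  for (const child of customEl.childNodes) {
    renderEl.appendChild(child.cloneNode(true));
  }
  customEl.style.display = 'none';
  renderRoot.appendChild(renderEl);
  return renderEl;
}
```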

The first three points aren't as bad as the fourth, as I'll have to
implement them anyway when I add WebGL. But the fourth point is really
bad: it is what makes the new behavior an extreme pain.

I hope you can see how, from my perspective as a library author building
on top of HTML/CSS3D, this is a very unwelcome change if opacity is to
have any meaning in my 3D scenes.

There aren't many people doing what I'm doing. There are only a few
libraries out there doing what I am doing (that I know of):

   - Three.js <http://threejs.org> - Three.js uses the non-nested
   approach, so it does not face the opacity problem. People who use
   Three.js' CSS3DRenderer know what they are working with from the start,
   and they won't get confused in targeting their elements for styling,
   because their elements will not be transplanted like with my library.
   But note, Three.js will not be able to update to the nested approach
   due to this problem, and therefore won't be able to gain the
   performance benefits of the nested approach.
   - Famo.us <http://deprecated.famous.org> (0.3 and below) - Like
   Three.js, it uses the non-nested approach, and is worry-free.
   - Famo.us Engine <http://github.com/famous/engine> (0.5 and above) -
   Uses the nested approach, and is now broken just like my library.
   - Samsara <http://samsarajs.org> - Uses the non-nested approach, so no
   problems there.
   - Infamous <http://infamous.io> - My library, which uses the nested
   approach. Problem!

In all sincerity, opacity flattening is a breaking change that shouldn't
have been implemented by any browser without a full solution for "legacy"
code with 3D in mind.

=== What We Need ===

We possibly need at least three forms of opacity:

   1. 3-dimensional, like the legacy behavior where things don't get
   flattened and opacities are multiplied down the hierarchy.
   2. Parent-only (opacity on a single element), where nothing is
   flattened, and opacity is applied only to the target element and not
   its children. Note, there is currently no way to do this in the nested
   approach, only in the non-nested approach with virtual hierarchies. In
   "legacy" behavior, we can only apply opacity to an entire subtree of a
   3D context, not just to a parent element, so it's all or nothing.
   3. Flattening, which is the new behavior, though I fail to see why
   anyone applying opacity to a 3D object wants the object to become
   suddenly flat; that doesn't make much sense, is unintuitive, and makes
   the API more difficult to use than we'd like.

Note, with "legacy" behavior, *it is already possible to create a new 3D
context and apply opacity to it*, so this new behavior isn't actually
needed (from a functional perspective), because we can currently achieve
the same behavior with the "legacy" implementation. The new behavior does
two things:

   1. it eliminates the desired 3D behavior in nested scenes that use
   preserve-3d
   2. it adds a second method of opacifying an entire 3D scene which *we
   can already do*.

Here is an example of just that using "legacy" techniques, where I simply
add the style `motor-scene {opacity: 0.5}` to the CSS, which opacifies the
3D context:

https://jsfiddle.net/trusktr/ymonmo70/8

We all agree that the spec (whatever it says) should be well defined so
that it is clear what browsers should implement. I also believe that the
legacy behavior is much more desirable for 3D programmers than is the
flattening of objects (legacy-like behavior just needs to be clearly
spec'd).

=== Possible Solution ===

Maybe we can improve the spec so that it accommodates all three opacity
styles (and in the meantime, browser developers, you could all roll back
to the "legacy" implementation ASAP, before more breaking changes ship).

This could be achieved with something like a new CSS property called
`opacity-style`, e.g.:

   - `opacity-style: 3d` which is similar to the legacy behavior and is
   great for 3D scenes, and does not flatten anything. Opacity is multiplied
   all the way down the tree with other same-style opacities and stops at
   elements that have a different opacity-style (`flat` or `single`).
   - `opacity-style: single`, which does not flatten anything and applies
   opacity only to the target element, not its children.
   - `opacity-style: flat`, which is like the new behavior that Chrome 53
   just introduced: things are flattened and a new 3D context is made. I
   still fail to see how this is desirable. *Applying an opacity to a 3D
   object to make it transparent and then having it become flat like paper
   is simply not intuitive.*

The default value could be `opacity-style: single`, which I think is the
least disruptive of the three. We can then see what 3D programmers end up
using more often (I'd bet that `opacity-style: flat` will be the least
used). Or, perhaps the default for a 3D object could be `single`, and
`flat` for non-3D objects.
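
For example, usage might look like this (hypothetical, of course, since
`opacity-style` doesn't exist anywhere yet, and `.red-box` is just an
example selector):

```css
/* Hypothetical usage of the proposed property. */
.car {
  opacity: 0.5;
  opacity-style: 3d; /* multiply down the subtree, stay 3D */
}

.red-box {
  opacity: 0.5;
  opacity-style: single; /* fade this element only, not its children */
}
```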

What are your thoughts on something like this `opacity-style` idea? As it
stands, opacity in the 3D web isn't very workable in nested scenes (it is
much more workable with the legacy behavior).

/#!/JoePea

On Thu, Sep 15, 2016 at 11:16 AM, Tab Atkins Jr. <jackalmage@gmail.com>
wrote:

> On Thu, Sep 15, 2016 at 10:02 AM, Rik Cabanier <cabanier@gmail.com> wrote:
> > On Wed, Sep 14, 2016 at 8:44 PM, /#!/JoePea <trusktr@gmail.com> wrote:
> >> Here's an example using Famous Engine
> >> (http://github.com/famous/engine):
> >>
> >> First, with opacity at the default value of 1.0, the "Famous Code" logo
> >> moves back and forth and rotates:
> >> http://jsfiddle.net/trusktr/spauv8fs/5
> >>
> >> With opacity reduced, it breaks in Chrome 53:
> >> http://jsfiddle.net/trusktr/spauv8fs/6
> >>
> >> Famous Engine has the ability to mix WebGL with DOM, and this includes
> >> opacity. This new behavior causes the DOM elements not to move in 3D
> >> space with the WebGL meshes.
> >>
> >> First, here's a demo, with opacity at 1.0:
> >> http://jsfiddle.net/trusktr/spauv8fs/7
> >>
> >> Then, with opacity at 0.7:
> >> http://jsfiddle.net/trusktr/spauv8fs/8
> >>
> >> What you are supposed to see is a DOM element and a WebGL Mesh that both
> >> seem to intersect with the pink DOM-based background (the
> >> implementation is not perfect yet...). In the second fiddle
> >> (spauv8fs/8) the "Famous Code" logo appears not to move any more while
> >> the WebGL mesh continues to move.
> >>
> >> There is a bug in the WebGL renderer which I believe may be due to
> >> changes in WebGL, but the WebGL part of those examples is supposed to
> >> be transparent as well.
> >>
> >> I myself am working on a new implementation of DOM + WebGL, and it
> >> allows application of opacity. This change in Chrome 53 completely
> >> breaks how that is supposed to work. I don't have an online demo
> >> yet...
> >>
> >> I would say let's definitely consider undoing the changes to the spec
> >> so that flattening does not happen when applying an opacity to 3D DOM
> >> elements.
> >> The reasoning is not just to prevent breaking apps, but because the new
> >> behavior simply doesn't make sense as far as 3D goes.
> >
> >
> > I don't understand how you come to that conclusion.
> > The new behavior seems more logical since it applies the opacity to
> > the element that has the property applied. The old implementation
> > distributed the value to its children, which is counter to any other
> > CSS value.
> > Are you proposing that we also do this for filters, blending, backdrop
> > blurring and other effects, or is opacity special in some way?
> >
> > As Simon stated, if you want the old behavior, just add a selector to
> > your opacity parameter so it's applied to the children.
>
> Yes. This is not a spec bug, it's a natural and unavoidable
> consequence of doing a "group effect", which opacity, filters, and a
> few other effects are.  These types of effects require the group to be
> rendered as a unit, then have the effect applied; in 3d space, this
> has the effect of flattening them. (If you didn't flatten, then other
> items could get between the individual pieces in 3d order, and there's
> no consistent or sensible way to render that. This is identical to how
> z-index is "flattened" by opacity and other group effects, so you
> can't sandwich elements elsewhere in the page between the elements in
> the group.)
>
> If you want to sandwich things, you need to push the effect further
> down into the leaves, so it doesn't group as many things together.
> This lets you do more re-ordering, but has a different visual effect -
> if one item occludes another in a group, when they're made transparent
> they still occlude; if they're made individually transparent, you'll
> be able to see the second item behind the first.  Similar differences
> exist for other group effects - if you're doing a gaussian blur,
> blurring two boxes as a group can look quite different than blurring
> them individually.
>
> As Simon said, Chrome 52 and earlier Safari are simply buggy and not
> "grouping" things correctly; they're instead automatically pushing the
> opacity down further toward the leaves when 3d is involved.  If you
> want the same effect, just do so manually, as Rik recommends.
>
> (We had an almost identical request in the SVG Working Group a few
> days ago. I posted
> https://github.com/w3c/svgwg/issues/264#issuecomment-246750601 as an
> extended explanation of what's happening.)
>
> ~TJ
>

Received on Monday, 19 September 2016 00:52:26 UTC