W3C home > Mailing lists > Public > www-style@w3.org > October 2012

[css3-transforms] Decomposing of 2D matrices

From: Dirk Schulze <dschulze@adobe.com>
Date: Thu, 18 Oct 2012 19:07:24 -0700
To: "www-style@w3.org list" <www-style@w3.org>
Message-ID: <A54A72B2-5BFB-46F7-8B08-BE2B7A3EA8D2@adobe.com>

CSS3 Transforms uses decomposition for matrix() and matrix3d() in animations: matrices get decomposed into a set of regular transformation operations like scale() or translate().

The specification describes a way to decompose matrices with quaternions[2] and does not distinguish between 2D and 3D matrices.

In the case that a UA does not support 3D transformations, the specification suggests falling back to the "unmatrix" method from "Graphics Gems II, edited by Jim Arvo"[2].
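For the 2D case, the spirit of such an "unmatrix" routine is to peel a 2D affine matrix apart into translate, rotate, scale and skew components. The sketch below is an illustration of that general technique, not the exact code from Graphics Gems II or the spec; the column-vector convention and function name are assumptions.

```python
import math

def decompose_2d(a, b, c, d, e, f):
    """Decompose a 2D affine matrix (matrix(a, b, c, d, e, f)) into
    translate, rotation, scale and an X-axis skew, in the spirit of the
    Graphics Gems "unmatrix" routine. Column-vector convention assumed:
    x' = a*x + c*y + e,  y' = b*x + d*y + f.
    """
    translate = (e, f)
    # scaleX is the length of the first column vector (a, b)
    scale_x = math.hypot(a, b)
    if scale_x:
        a, b = a / scale_x, b / scale_x
    # make the second column perpendicular to the first; the removed
    # component is the skew
    skew = a * c + b * d
    c, d = c - a * skew, d - b * skew
    # scaleY is the length of the now-orthogonal second column
    scale_y = math.hypot(c, d)
    if scale_y:
        c, d = c / scale_y, d / scale_y
        skew /= scale_y
    # a negative determinant means one axis was flipped
    if a * d - b * c < 0:
        a, b, scale_x = -a, -b, -scale_x
    rotation = math.atan2(b, a)
    return translate, rotation, (scale_x, scale_y), skew
```

A pure rotation matrix, for example, comes back with unit scales and zero skew, so interpolating the decomposed values behaves like interpolating rotate() directly.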

Issues were raised that the first method results in a 4x4 matrix, even for interpolation between 2D matrices. 2D matrices need to be flattened to a 3x3 matrix. This could be done as suggested in "Two Dimensional Subset"[1], by setting the 3D arguments of the 4x4 matrix to 0 (or 1). The second issue is that the "unmatrix" code may not give the expected results in some cases.
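As a rough illustration of that flattening step (my own sketch, not spec text; the row-major layout with translation in the fourth row matches matrix3d() argument order, but the function name is an assumption): the 3D-only components are discarded and only the six 2D values a, b, c, d, e, f are kept.

```python
def flatten_to_2d(m):
    """Flatten a 4x4 matrix to the six-value 2D form, in the spirit of
    the "Two Dimensional Subset": drop the 3D-only components (which
    the subset requires to be 0, or 1 on the diagonal) and keep
    a, b, c, d, e, f. `m` is row-major, m[row][col], with the
    translation in the fourth row as in matrix3d() argument order.
    """
    a, b = m[0][0], m[0][1]
    c, d = m[1][0], m[1][1]
    e, f = m[3][0], m[3][1]
    return [a, b, c, d, e, f]
```

So matrix3d(a, b, 0, 0, c, d, 0, 0, 0, 0, 1, 0, e, f, 0, 1) round-trips to matrix(a, b, c, d, e, f), which is the shape interpolation between two 2D matrices should produce.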

I think this needs input from browser vendors. How do Firefox, Opera and Internet Explorer handle interpolation of 2D matrices? WebKit interpolates 2D matrices by transforming them to 4x4 matrices and flattening the results to 3x3 (if needed) as described in the section "Two Dimensional Subset"[1].


[1] http://dev.w3.org/csswg/css3-transforms/#two-dimensional-subset
[2] http://dev.w3.org/csswg/css3-transforms/#matrix-decomposing
Received on Friday, 19 October 2012 02:07:52 UTC
