Re: Adding normal matrix to CSS Shaders

On Jan 12, 2012, at 11:20 AM, Gregg Tavares (wrk) wrote:

> Once you add in CSS shaders you have no idea what they will be used for
> and what kinds of effects people will try to make.
> 
> There is already a standard for shaders in the form of Standard Annotations and Semantics that
> shader editors already use. It's used in NVIDIA's FX Composer, it's used in AMD's RenderMonkey,
> and it's used in several other shader editors. It provides 24 standard matrices. Why diverge from this standard?
> 
> 24 is not that many. It's trivial to implement. They can be computed on demand only when 
> needed and cached if needed again for the same view. They are standard. They cover more
> needs.

I'm not sure what we're arguing anymore, so let me recap. I believe there are 3 issues:

1) Should we pass more (or any) matrices to the CSS shaders?

2) If so, which matrices and what do we call them?

3) How do we avoid computing matrices needlessly?

If I understand what Gregg is proposing, I agree with him. We should make all 24 matrices available, and from what I remember of his list, we should use that naming, or at least agree on a standard naming.
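From what I remember (and this should be double-checked against Gregg's list and the SAS documentation), the full set is just six base matrices crossed with four variants, something like:

    // Candidate names only -- the exact spelling and casing should come
    // from Gregg's list / the SAS conventions, not from my memory.
    const bases = [
        "World", "View", "Projection",
        "WorldView", "ViewProjection", "WorldViewProjection",
    ];
    const variants = ["", "Inverse", "Transpose", "InverseTranspose"];

    const matrixNames: string[] = [];
    for (const base of bases) {
        for (const variant of variants) {
            matrixNames.push(base + variant);   // 6 x 4 = 24 names
        }
    }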

Once we agree on naming, I believe authors can simply use those names in their shaders and we can compute a given matrix only when we see it being used. You can get the list of uniforms a compiled shader declares, so it should be a simple matter of running through that list and computing only the matrices that are actually referenced.
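To make that concrete, here is a rough sketch in terms of the WebGL API (the real code would of course live in the engine; computeFns here is a hypothetical table mapping each standard name to a function that builds that matrix for the current view):

    // Walk the active uniforms of a compiled program and upload only the
    // standard matrices it actually declares. GLSL compilers drop unused
    // uniforms, so "active" already means "really used by the shader".
    // viewCache lets several filtered elements sharing a view reuse the
    // same computed matrices.
    function uploadRequestedMatrices(
        gl: WebGLRenderingContext,
        program: WebGLProgram,
        computeFns: { [name: string]: () => Float32Array },
        viewCache: { [name: string]: Float32Array },
    ): void {
        const count = gl.getProgramParameter(program, gl.ACTIVE_UNIFORMS) as number;
        for (let i = 0; i < count; i++) {
            const info = gl.getActiveUniform(program, i);
            if (!info || !(info.name in computeFns)) {
                continue;   // not one of the standard matrix names
            }
            // Compute on demand, once per view.
            if (!(info.name in viewCache)) {
                viewCache[info.name] = computeFns[info.name]();
            }
            const location = gl.getUniformLocation(program, info.name);
            gl.uniformMatrix4fv(location, false, viewCache[info.name]);
        }
    }

Either way, the cost is proportional to the matrices a given shader actually asks for, not to the full set of 24.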

I've added this proposal to https://www.w3.org/Bugs/Public/show_bug.cgi?id=15253

-----
~Chris
cmarrin@apple.com

Received on Thursday, 12 January 2012 22:58:49 UTC