
Re: [css-shaders] GLSL implementation defined limits

From: Vincent Hardy <vhardy@adobe.com>
Date: Wed, 16 Nov 2011 20:55:23 -0800
To: Chris Marrin <cmarrin@apple.com>
CC: Gregg Tavares <gman@google.com>, "public-fx@w3.org" <public-fx@w3.org>
Message-ID: <CAE9D0C5.22893%vhardy@adobe.com>
Hi Chris,

From: Chris Marrin <cmarrin@apple.com>
Date: Mon, 14 Nov 2011 11:11:28 -0800
To: Vincent Hardy <vhardy@adobe.com>
Cc: Gregg Tavares <gman@google.com>, "public-fx@w3.org" <public-fx@w3.org>
Subject: Re: [css-shaders] GLSL implementation defined limits

On Nov 14, 2011, at 7:36 AM, Vincent Hardy wrote:

So, if a developer makes a page that uses 3 valid shaders, and on some hardware out there 1 of them fails because of that hardware's limitations, how is the developer supposed to deal with this? It is especially an issue if the effect the developer is trying to achieve requires that all 3 shaders work together. The developer needs a way either to check that all 3 worked, or to say that if any one of them fails they should all fail, so that the page goes back to its non-shader fallback for all 3 effects.
I think the only thing we can do is to preload the shader programs and fall back, either with an @supports mechanism or by passing multiple programs as I've shown.
Yes, as we discussed earlier in this thread, we need a clear definition of what failure means, and the one you have above (failure to load) makes sense to me. Slow-running shaders do not fit in the failing category. They are in the same boat as slow filter effects (you can write a nasty filter graph that runs well on a high-end desktop and will be slow on lower-end hardware) and, to some extent, in the same boat as slow scripts or pages in general, where the behavior will vary depending on the target hardware (granted, shaders are a bit different in that they run on the GPU).
@chris: is your suggestion to tighten up the definition of 'failure to load' in section 3.2? (https://dvcs.w3.org/hg/FXTF/raw-file/tip/custom/index.html, see the description of vertexShader and fragmentShader). The current text is:
" If the shader cannot be retrieved, or if the shader cannot be loaded or compiled because it contains erroneous code, the shader is a pass through."
What I would propose is something like this:
"If the shader cannot be retrieved and run, for any reason (e.g., failure to compile or link), then the custom() function becomes a pass through filter."

I don't think we should mention "and run". A shader may run, but run so slowly that it causes a GPU reset; we can't deal with that. I think a better statement would be:

"If either the vertex or fragment shader cannot be loaded, a shader fails to compile, or the shader program fails to link, then the custom() function becomes a pass-through filter."

I think that covers all the cases of a shader program "failing". I believe that if the link is successful, it means you haven't blown any resource limits.
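As a concrete illustration of the failure cases named in that definition, here is a hedged sketch of how an implementation (or an author experimenting in WebGL) might detect them with the standard WebGL API. `compileCustomFilter` is a made-up helper name, not anything from the spec; `gl` is any WebGLRenderingContext-compatible object.

```javascript
// Sketch: detect the compile and link failure modes from the proposed
// wording. Returns the linked program, or null to signal "pass through".
// `compileCustomFilter` is a hypothetical helper, not part of the spec.
function compileCustomFilter(gl, vertexSource, fragmentSource) {
  function compile(type, source) {
    const shader = gl.createShader(type);
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    // "a shader fails to compile"
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) return null;
    return shader;
  }

  const vs = compile(gl.VERTEX_SHADER, vertexSource);
  const fs = compile(gl.FRAGMENT_SHADER, fragmentSource);
  if (!vs || !fs) return null; // fall back to pass-through

  const program = gl.createProgram();
  gl.attachShader(program, vs);
  gl.attachShader(program, fs);
  gl.linkProgram(program);
  // "the shader program fails to link" (e.g. resource limits exceeded)
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) return null;
  return program;
}
```

A single null result for any of the shaders in an effect would then trigger the fallback for the whole custom() function, matching the "one failure invalidates the effect" behavior discussed above.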

>> I agree. I created a Bugzilla issue for this:

This makes it clearer that a failure can be anything that prevents the shader from running, and it also adds that a failure in one of the shaders invalidates the whole custom() effect.

Should we mention anything about how failures might be handled? Reading the @supports documentation (http://dev.w3.org/csswg/css3-conditional/), that might be a fine approach. If you say:

@supports ( filter: shader(url(shader.vs) url(shader.fs) ...) ) {
  .myfilter { filter: shader(url(shader.vs) url(shader.fs) ...); }
}

@supports not ( filter: shader(url(shader.vs) url(shader.fs) ...) ) {
  .myfilter { filter: shader(url(fallback.vs) url(fallback.fs) ...); }
}

then you can compile and link the shader in the @supports statement, and know at that point whether or not it can be used. This is pretty wordy, requiring the shader to be stated 3 times. It would be nice if @supports had the ability to have an else clause, like:

@supports ( filter: shader(url(shader.vs) url(shader.fs) ...) ) {
  .myfilter { filter: shader(url(shader.vs) url(shader.fs) ...); }
}
@else {
  .myfilter { filter: shader(url(fallback.vs) url(fallback.fs) ...); }
}
to cut out one of those. But maybe that's overkill. Either way, you'd need to encapsulate the shader program and make it possible to compare programs, to avoid needless reconstruction of the shader.

>> Reading section 6.1 of CSS Conditional Rules (http://dev.w3.org/csswg/css3-conditional/#support-definition), it seems to me that the @supports rule is about testing whether a particular property/value pair is supported in general. I think what you are suggesting goes a bit beyond that, in that the implementation needs to both support the filter effect and the shader() function _and_ successfully load/compile/link the referenced shaders. I think that would be a new feature. In some ways, it seems to me that what you are asking for is more like multiple <video> sources, where the author can specify several sources; if one fails to load or is not supported (because of the source's type), then the next one can be used.
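For reference, the <video> fallback pattern mentioned above looks like the following (the file names are made up for illustration); the user agent tries each source in document order and uses the first one it can load and play:

```html
<video controls>
  <!-- Hypothetical sources: the UA falls through to the next on failure -->
  <source src="effect-demo.webm" type="video/webm">
  <source src="effect-demo.mp4" type="video/mp4">
  Fallback content for user agents with no video support.
</video>
```

An analogous shader() syntax could let authors list several shader pairs, with the implementation using the first pair that compiles and links.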

>> I also think that the case of invalid shader code, or a shader that fails to load, is similar to a JavaScript file that fails to load or is invalid. This may break the page. I am not quite convinced that the situation with shaders is unique compared to scripts and warrants a new mechanism.

Kind regards,
Vincent
Received on Thursday, 17 November 2011 04:55:57 UTC

This archive was generated by hypermail 2.3.1 : Monday, 22 June 2015 03:33:46 UTC