Re: Ill-defined programs

Maciej,

Thanks for expressing your position!

Out of curiosity, does the WebGL working group take the same stance on
ill-defined programs (i.e. applying interoperability requirements to them)?

> Yes. Strong objection. If behavior of any programs is not fully
specified, then web developers will start to accidentally depend on the
behavior of one browser (usually whichever is most popular), and then
browsers will have to reverse-engineer each others' behavior.

This sentence touches on two important topics:
  1. *Full specification*. I don't believe we can fully specify the behavior
of even well-defined programs. Say you have a number of UAV/SSBO writes in
your shaders: there is no efficient way for us to guarantee a particular
ordering (hence the "Unordered" in UAV), so a developer relying on the order
produced by their dev machine may be surprised by what other users get,
depending on the backend, OS, driver, etc. (see the sketch after this list).
If we agree here, we need to figure out where to draw the line between
strongly portable behavior and the rest. My point is that excluding
ill-defined programs from the equation gives well-defined programs more
opportunity to run efficiently.
  2. *Reverse-engineering* each other's behavior. AFAIK, one of this group's
goals is to produce an open-source WebGPU implementation that can be shared
by browser vendors, so there should be no need to reverse-engineer. That
said, I would prefer us (as a group) to focus on the standard more than on
the implementation, given that we have that alternative.
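
To make the ordering concern concrete, here is a tiny sketch (purely
illustrative; the shader and the constant name are made up, not from any
spec or existing codebase). Every invocation stores to the same SSBO
element, and nothing in Vulkan, D3D12, or Metal defines whose write lands
last, so the value read back can differ by backend, OS, or driver:

    // Hypothetical illustration only: shader source as it might be embedded
    // in a Rust host program. The final value of `last_writer` depends on an
    // unspecified execution order, so it is not portable.
    const RACY_SHADER_GLSL: &str = r#"
        #version 450
        layout(local_size_x = 64) in;
        layout(set = 0, binding = 0) buffer Output { uint last_writer; };
        void main() {
            // Every invocation writes its own ID; the "winner" is unspecified.
            last_writer = gl_GlobalInvocationID.x;
        }
    "#;

A well-defined program would instead use atomics, or restructure the writes
so that each invocation owns a distinct element.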

>>  2. Are there any objections to having an optional debug validation
layer?

> I don't know what that means so no opinion,

We have touched on the validation layer topic a few times, incidentally. For
example, when discussing robust buffer access, we mentioned that there could
be a validation layer doing all the clamping and potentially signalling
errors up (to the CPU side), but we (Mozilla) don't believe it to be
necessary at all times, given that both Vulkan and D3D12 provide decent
robustness out of the box. The validation layer could do much more, e.g.
ensuring that pipeline barriers are correct, but the exact responsibilities
depend on the chosen API direction.
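
As a rough illustration of what I mean (a hypothetical sketch only; the
types and names below are not part of any proposed API): a debug layer could
clamp an out-of-range buffer binding and also report the problem to the CPU
side, whereas a release configuration might lean on driver-level robust
access and skip the reporting.

    // Hypothetical sketch of a debug validation step. `BufferBinding` and
    // `ValidationError` are illustrative names only.
    #[derive(Debug)]
    struct BufferBinding {
        offset: u64,
        size: u64,
    }

    #[derive(Debug)]
    enum ValidationError {
        BindingOutOfBounds { requested_end: u64, buffer_size: u64 },
    }

    /// Clamp the binding into the buffer's range. A release build could clamp
    /// silently (or rely on Vulkan/D3D12 robustness); the debug layer also
    /// surfaces an error to the CPU side so the developer notices.
    fn validate_binding(
        binding: &mut BufferBinding,
        buffer_size: u64,
    ) -> Result<(), ValidationError> {
        let end = binding.offset.saturating_add(binding.size);
        if end <= buffer_size {
            return Ok(());
        }
        binding.size = buffer_size.saturating_sub(binding.offset);
        Err(ValidationError::BindingOutOfBounds {
            requested_end: end,
            buffer_size,
        })
    }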


> Security (for some appropriate definition) is a hard requirement.

Absolutely! What about the ordering of the rest (portability, performance,
API simplicity)?


Kai,

>  When we say, "There is no duty to detect or report whether an
application is actually ill-behaved," we're only saying that it's not
required for WebGPU to be secure.

Yes, I understand. I took the discussion outside the bounds/context of the
original issue, which is why I put it on the mailing list (as opposed to
discussing it in place).

Thank you,
Dzmitry


On Mon, Nov 13, 2017 at 4:42 PM, Kai Ninomiya <kainino@google.com> wrote:

> Dzmitry, I want to just clarify David's intent, if I understand it
> correctly. His document is focused solely on the security constraints. When
> we say, "There is no duty to detect or report whether an application is
> actually ill-behaved," we're only saying that it's not required for WebGPU
> to be secure.
>
> There is, of course, still difference of opinion in whether we should
> ultimately actually detect/normalize ill behavior.
>
> On Mon, Nov 13, 2017 at 12:39 PM Maciej Stachowiak <mjs@apple.com> wrote:
>
>>
>> On Nov 13, 2017, at 12:27 PM, Maciej Stachowiak <mjs@apple.com> wrote:
>>
>>
>>
>> On Nov 13, 2017, at 11:44 AM, Dzmitry Malyshau <dmalyshau@mozilla.com>
>> wrote:
>>
>> I'd like to follow-up on discussion points mentioned in
>> https://github.com/gpuweb/gpuweb/issues/39:
>>
>> > There is no duty to detect or report whether an application is actually
>> ill-behaved. Attempting to do so may incur overhead. For a development
>> environment it may be desirable to detect bad behaviour, but that is not a
>> security requirement.
>> > We don't care about the computed results or performance of an
>> ill-behaved application. That may be a user-experience concern, but not a
>> security concern.
>>
>> This is wider than just shaders or just security, so I'm following up on
>> email as opposed to hijacking the GitHub thread.
>>
>>
>> I agree that consistent behavior for ill-behaved applications is not a
>> security requirement. But it is a
>>
>>
>> Meant to say: but it is an interoperability requirement.
>>
>>
>>
>> The approach expressed by David matches the direction we (as in,
>> Mozilla) would like to see WebGPU go: allow the user to fully
>> (*explicitly*) specify what needs to be done, promise portability and
>> the best performance for well-defined programs, and secure ill-defined
>> programs without going the extra mile to make them portable or fast.
>>
>> It goes somewhat against Apple's idea that since the browser has to
>> validate all the API calls (or, in this case, shader code/execution) for
>> security, it can make detailed decisions on how the work is submitted to
>> the backend API (e.g. insert pipeline barriers, allocate resource memory,
>> clamp array indices, etc.), thus making the exposed API more implicit.
>>
>> Trying to figure out the steps to make an objective (non-opinionated)
>> call on this issue, we can start with these questions:
>>   1. Are there any objections to lowering portability/performance
>> guarantees for ill-defined programs?
>>
>>
>> Yes. Strong objection. If behavior of any programs is not fully
>> specified, then web developers will start to accidentally depend on the
>> behavior of one browser (usually whichever is most popular), and then
>> browsers will have to reverse-engineer each others' behavior. This has
>> happened so many times in the course of web standards development that it's
>> almost a running joke. Every once in a while someone says "hey, let's just
>> not define error handling, we only need to define the behavior for valid
>> content" it happens. The first time was HTML, Browsers ended up
>> reverse-engineering each other's error handling until finally they got sick
>> of the W3C not defining this and formed the WHATWG to create HTML5, which
>> fully specified parsing behavior for all invalid documents. CSS, JavaScript
>> and WebAssembly also have fully interoperable behavior by spec, even in
>> "invalid" or "error" or "ill-defined" cases.
>>
>> Let's not make this rookie mistake. We must fully define the behavior of
>> all programs.
>>
>>
>>   2. Are there any objections to having an optional debug validation
>> layer?
>>
>>
>> I don't know what that means so no opinion,
>>
>>   3. Can we agree that API simplicity is lower priority than security
>> (starting with obvious), portability, and performance?
>>
>>
>> Security (for some appropriate definition) is a hard requirement.
>>
>>
>> Thanks,
>> Dzmitry
>>
>>

Received on Monday, 13 November 2017 22:14:18 UTC