Re: VideoStreamTrack: takePhoto()

On Mon, Apr 8, 2013 at 4:17 PM, Randell Jesup <randell-ietf@jesup.org> wrote:
> I looked at some of the links mentioned, and while interesting I'm not
> entirely clear on how this would affect the real-world usage of application
> writers; I'd want to see the impact on code they'd write, and evaluate how
> much of existing code and examples could survive this change with minimal
> mods or with mechanical rewrites. Also, I'd want to talk about how
> coordinated support for futures is or would be among the major browser
> vendors.

Basically:

getUserMedia(options, accept, reject)

becomes:

getUserMedia(options).done(accept, reject)

And:

obj.takePhoto()
obj.onphoto = accept
obj.onphotoerror = reject

becomes:

obj.takePhoto().done(accept, reject)

(It has still not been explained why these two APIs follow distinct patterns.)
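
To make the mechanical rewrite concrete, here is a sketch of a
typical getUserMedia snippet before and after (assuming the .done()
method from the DOM Futures draft; the video element and handler
bodies are placeholders):

// today: success and error callbacks passed as arguments
navigator.getUserMedia({ video: true },
  function (stream) { video.src = URL.createObjectURL(stream) },
  function (error) { console.log(error.name) })

// with futures: the same two handlers, passed to .done() instead
navigator.getUserMedia({ video: true })
  .done(function (stream) { video.src = URL.createObjectURL(stream) },
        function (error) { console.log(error.name) })

The handler code survives unchanged; only the call site moves.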

As I explained in another email, futures further allow you to compose
them, so you can operate on the results of several futures once any
one has completed, or once all have completed, etc.
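
For instance, a sketch using the combinators currently drafted for
DOM Futures (Future.every resolves once all of its arguments have,
Future.any as soon as one has; those names, and the two track objects
here, are illustrative and not final):

// hypothetical track objects exposing the proposed takePhoto()
Future.every(frontCamera.takePhoto(), backCamera.takePhoto())
  .done(function (photos) {
    // runs once both photos have been taken
  }, reject)

Future.any(frontCamera.takePhoto(), backCamera.takePhoto())
  .done(function (photo) {
    // runs as soon as either photo is ready
  }, reject)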

As far as browsers go, they've all been involved in the discussion
and either already have such a concept implemented (e.g. Gecko has
DOMRequest, a poor man's futures), have patches in progress (WebKit),
or are planning to implement futures (Blink, Gecko).


> At this point I'm still skeptical of making this major a change in the API
> this far down the path, and in doing so being the first adopter.  I agree
> that we have an unusually inherently asynchronous API that would benefit
> from futures, which makes my first point unfortunate, so I'm open to
> argument and also to paths forward to adopting Futures that don't kill the
> momentum that WebRTC and getUserMedia have (i.e. Dom's point).

If you are still able to rename API members, this really should be a
no-brainer. I appreciate the early-adopter reluctance, and I apologize
for us only coming around to futures four years after their debut in
the JavaScript community, but the choices we make for these APIs
affect us for the very long term, so we had better get them right. As
Robin said, while 16-bit code units suck (it's not UTF-16), ISO-8859-1
would have been downright painful.


--
http://annevankesteren.nl/
