W3C home > Mailing lists > Public > whatwg@whatwg.org > March 2011

[whatwg] Stream API Feedback

From: Robert O'Callahan <robert@ocallahan.org>
Date: Wed, 16 Mar 2011 11:19:36 +1300
Message-ID: <AANLkTi=BQyncinV-HZ2DOU1Uh4oMxUjRHAn192nS-YcR@mail.gmail.com>
On Wed, Mar 16, 2011 at 11:11 AM, Bjartur Thorlacius
<svartman95 at gmail.com> wrote:

> On 3/15/11, Robert O'Callahan <robert at ocallahan.org> wrote:
> > Instead of creating new state signalling and control API for streams,
> > what about the alternative approach of letting <video> and <audio> use
> > sensors as sources, and a way to connect the output of <video> and
> > <audio> to encoders? Then we'd get all the existing state machinery for
> > free. We'd also get sensor input for audio processing (e.g. Mozilla or
> > Chrome's audio APIs), and in-page video preview, and using <canvas> to
> > take snapshots, and more...
> >
> That would make sense, assuming you are suggesting reusing objects
> representing media elements, rather than using HTML elements.

HTML elements can be used as objects just fine. See "new Audio()" etc.
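
As a minimal sketch of that point (the source URL here is purely illustrative): a media element can be created and driven entirely from script, without ever being attached to the document, and the very same object can later be inserted into the page, e.g. for in-page preview.

```javascript
// "new Audio(src)" constructs an HTMLAudioElement; it is the scripting
// shorthand for document.createElement("audio") plus setting its src.
var player = new Audio("clip.ogg"); // hypothetical source URL

// The element works as a plain object: its state machinery and events
// are available whether or not it is in the DOM.
player.volume = 0.5;
player.addEventListener("canplaythrough", function () {
  player.play();
});

// If visible in-page preview is wanted, insert the same object later:
document.body.appendChild(player);
```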

"Now the Bereans were of more noble character than the Thessalonians, for
they received the message with great eagerness and examined the Scriptures
every day to see if what Paul said was true." [Acts 17:11]
Received on Tuesday, 15 March 2011 15:19:36 UTC
