
Re: [whatwg] <audio> metadata

From: Andy Valencia <ajv-cautzeamplog@vsta.org>
Date: Sun, 23 Apr 2017 09:58:17 -0700 (PDT)
To: whatwg@whatwg.org
Message-Id: <20170423165817.259B44037C@vsta.org>
I've become aware of quite a bit more metadata support in the world
of web browsers; please consider my old proposal withdrawn.

=== Reporting
> Only "artist" and "title" are required for royalties reporting for
> internet radio.

I'm sorry for a bit of topic drift on this list, and I'm sure requirements
vary by nation.  I do reporting for a local station, and among the
requirements I have to meet are those at:
    https://www.soundexchange.com/service-provider/reporting-requirements/
This is the last thing I'll say on this tangent.

=== Dynamic versus static metadata

Pretty much all audio formats have at least one metadata format.  While
some apparently can embed metadata at time points, this is not used by any
players I can find.  The Icecast/Shoutcast "metastream" format is the
only dynamic technique I've ever encountered.  The industry is quickly
shifting to the so-called "Shoutcast v2" format due to:
    https://forums.developer.apple.com/thread/66586
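As a sketch of how the metastream framing works (the helper names here are
my own, not from any spec): a client that sends an "Icy-MetaData: 1" request
header gets back an "icy-metaint: N" response header, and after every N
audio bytes the server inserts one length byte (block length divided by 16)
followed by a NUL-padded string of key='value'; pairs.

```javascript
// Sketch of parsing one Icecast/Shoutcast "metastream" metadata block.
// parseIcyMetadata and metadataBlockLength are hypothetical helper names.
function parseIcyMetadata(block) {
  const text = block.replace(/\0+$/, "");   // strip NUL padding
  const fields = {};
  // Fields look like: StreamTitle='Artist - Title';StreamUrl='';
  const re = /([A-Za-z]+)='(.*?)';/g;
  let m;
  while ((m = re.exec(text)) !== null) {
    fields[m[1]] = m[2];
  }
  return fields;
}

// The metadata block's length is the leading byte times 16.
function metadataBlockLength(lengthByte) {
  return lengthByte * 16;
}
```

A browser-level metadata event would hide all of this framing from the page,
which is exactly the point of the proposal below.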

Metadata formats as applied to static information are, of course, of
great interest.  Any dynamic technique should fit into the existing
approach.

You have to draw a distinction between APIs concerned with UI/UX
presentation and those concerned with getting at the underlying
information.  MediaMetadata is an example of the former, mozGetMetadata
of the latter.  I'm now looking at mozGetMetadata as a starting point,
with a minimal change to add dynamic metadata events.

Of course, mozGetMetadata makes a very nice counterpart to Chrome's
MediaMetadata API for rendering the information to the listener.
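To illustrate how the two fit together, here's a sketch that maps fields
read via the Firefox-prefixed mozGetMetadata() into a MediaMetadata init
dictionary for presentation.  The field names are assumptions about what a
given stream might provide, and the mapping helper is my own invention:

```javascript
// Sketch: bridge retrieval (mozGetMetadata) to presentation (MediaMetadata).
// toMediaMetadataInit is a hypothetical helper, not part of any API.
function toMediaMetadataInit(meta) {
  // Omit fields the stream did not supply, matching the "legal to omit" rule.
  const init = {};
  if (meta.artist) init.artist = meta.artist;
  if (meta.title) init.title = meta.title;
  if (meta.album) init.album = meta.album;
  return init;
}

// In a browser, after the audio element's loadedmetadata event, something like:
//   const audio = document.getElementById("player");
//   const meta = audio.mozGetMetadata();          // Firefox-prefixed today
//   if ("mediaSession" in navigator) {
//     navigator.mediaSession.metadata =
//       new MediaMetadata(toMediaMetadataInit(meta));
//   }
```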

=== Processing a stream programmatically
> If the same-origin policy stops you, it should also stop a C++
> implementation. It's there for a reason.

This is all framed by techniques exemplified by <audio src="...">,
which enjoy permissive origin treatment.  The goal is to glean
information which is *available anyway*, without changing the
protection regime.

=== Non-trivial audio application
> Also royalty reporting is done in a earlier stage, what a listener sees
> is not what is logged/given for royalties reporting.

In addition to the obvious benefits of gleaning metadata and
updating the UI, I'm also interested in non-trivial audio applications
like:
    https://en.wikipedia.org/wiki/Broadcast_automation
For these, both static and dynamic metadata sources are very much of interest.
The browser execution environment is an excellent platform for these
sorts of applications.

=== Proposed new API direction

Here's the approach which now makes the most sense to me.

Ultimately, I'd hope that the moz-prefixed mozGetMetadata could indeed
be standardised (to getMetadata).  Keep the same semantics, possibly
formalize some basic fields (artist, title, album, year, ...).  If
one of those receives a value from the underlying media, it'll have
that value.  It's legal to omit fields which have no value.

Then a metadatachange event is added.  While a handler is registered,
a callback occurs on each detected change of metadata.

For the special case of Icecast/Shoutcast, where the initial HTTP GET
requires a special header, the change handler must be in place before
the stream is opened.  Thus <audio id="player" onmetadatachange="changes()">
will fire changes() after player.src is set.  The initial call would come
at the same point the existing loadedmetadata event fires, with
subsequent calls at each metadata update from the stream.
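Since metadatachange doesn't exist anywhere yet, here's a toy model (not a
real element, just a stand-in class) showing the ordering I have in mind:
handler registered before src is set, an initial event where loadedmetadata
would fire, then one event per in-stream update:

```javascript
// Toy model of the proposed behaviour; FakeAudioPlayer is a stand-in,
// not a real media element.
class FakeAudioPlayer extends EventTarget {
  set src(url) {
    // Opening the stream yields the initial metadata...
    this.metadata = { title: "Opening Track" };
    this.dispatchEvent(new Event("loadedmetadata"));
    this.dispatchEvent(new Event("metadatachange"));
  }
  // ...and each in-stream update fires another metadatachange.
  streamUpdate(meta) {
    this.metadata = meta;
    this.dispatchEvent(new Event("metadatachange"));
  }
  getMetadata() { return this.metadata; }   // stand-in for getMetadata()
}

const player = new FakeAudioPlayer();
const seen = [];
player.addEventListener("metadatachange",
  () => seen.push(player.getMetadata().title));
player.src = "http://example.com/stream";   // fires the initial change
player.streamUpdate({ title: "Next Track" });
// seen is now ["Opening Track", "Next Track"]
```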

Thanks again for your comments,
Andy Valencia
Received on Sunday, 23 April 2017 16:58:49 UTC
