CVS 2011/webrtc/editor/sources

Update of /sources/public/2011/webrtc/editor/sources
In directory roscoe:/tmp/cvs-serv5025/sources

Modified Files:
	getusermedia.css getusermedia.html getusermedia.js 
Log Message:
Added 20140619 archived version.

--- /sources/public/2011/webrtc/editor/sources/getusermedia.css	2013/11/05 18:10:45	1.1
+++ /sources/public/2011/webrtc/editor/sources/getusermedia.css	2014/06/20 02:36:32	1.2
@@ -1,4 +1,3 @@
-
 @media screen {
   html { background: #eeeeee; }
   body { margin-bottom: 30%; border-bottom: thin solid #3c790a; }
--- /sources/public/2011/webrtc/editor/sources/getusermedia.html	2014/05/08 02:05:54	1.4
+++ /sources/public/2011/webrtc/editor/sources/getusermedia.html	2014/06/20 02:36:32	1.5
@@ -59,6 +59,12 @@
     contains.</p>
 
 
+    <p>Conformance requirements phrased as algorithms or specific steps may be
+    implemented in any manner, so long as the end result is equivalent. (In
+    particular, the algorithms defined in this specification are intended to be
+    easy to follow, and not intended to be performant.)</p>
+
+
     <p>Implementations that use ECMAScript to implement the APIs defined in
     this specification must implement them in a manner consistent with the
     ECMAScript Bindings defined in the Web IDL specification [[!WEBIDL]], as
@@ -175,7 +181,7 @@
 
         <p>Although settings are a property of the source, they are
         only exposed to the application through the tracks attached to
-        the source.  The <a>Constrainable</a> interface provides this
+        the source.  The <a>ConstrainablePattern</a> interface provides this
         exposure.</p>
 
 
@@ -220,7 +226,7 @@
         corresponding capability that describes whether it is
         supported by the source and if so, what the range of supported
         values are. As with settings, capabilities are exposed to the
-        application via the <a>Constrainable</a> interface.</p>
+        application via the <a>ConstrainablePattern</a> interface.</p>
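
As a rough sketch of how settings, capabilities and constraints surface on a
track (the method names and constraint syntax below follow the current
MediaStreamTrack interface and are assumptions here; they may not match the
ConstrainablePattern naming in this draft):

    // Assumed method names: getCapabilities(), getSettings(), applyConstraints().
    function inspectTrack(videoTrack) {
      console.log(videoTrack.getCapabilities()); // what the source supports
      console.log(videoTrack.getSettings());     // the source's current settings
      // Ask for a new setting; the returned promise rejects if the
      // request cannot be satisfied (e.g. conflicting constraints).
      return videoTrack.applyConstraints({ width: { ideal: 1280 } });
    }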
 
 
         <p>The values of the supported capabilities must be normalized to the
@@ -253,7 +259,7 @@
 
 
         <p>Constraints are exposed on tracks via
-        the <a>Constrainable</a> interface, which includes an API for
+        the <a>ConstrainablePattern</a> interface, which includes an API for
         dynamically changing constraints.  Note
         that <a>getUserMedia()</a> also permits an initial set of
         constraints to be applied when the track is first
@@ -261,7 +267,7 @@
 
 
         <p>It is possible for two tracks that share a unique source to
-        apply contradictory constraints. The <a>Constrainable</a>
+        apply contradictory constraints. The <a>ConstrainablePattern</a>
         interface supports the calling of an error handler when the
         conflicting constraint is requested.  After successful
         application of constraints on a track (and its associated
@@ -297,20 +303,20 @@
       <h2>Introduction</h2>
 
 
-      <p>The <code><a>MediaStream</a></code> interface is used to represent
-      streams of media data, typically (but not necessarily) of audio and/or
-      video content, e.g. from a local camera. The data from a
-      <code><a>MediaStream</a></code> object does not necessarily have a
-      canonical binary form; for example, it could just be "the video currently
-      coming from the user's video camera". This allows user agents to
-      manipulate media streams in whatever fashion is most suitable on the
-      user's platform.</p>
+      <p>The two main components in the MediaStream API are the <code>
+      <a>MediaStreamTrack</a></code> and the <code><a>MediaStream</a></code>
+      interfaces. The <code><a>MediaStreamTrack</a></code> object represents
+      media originating from a single media source in the user agent, e.g.
+      video from a web camera. A <code><a>MediaStream</a></code> is used to group
+      several <code><a>MediaStreamTrack</a></code> objects into one unit that
+      can be rendered in a media element or recorded.</p>
 
 
-      <p>Each <code><a>MediaStream</a></code> object can contain zero or more
-      tracks, in particular audio and video tracks. All tracks in a MediaStream
-      are intended to be synchronized when rendered. Different MediaStreams do
-      not need to be synchronized.</p>
+      <p>Each <code><a>MediaStream</a></code> can contain zero or more <code>
+      <a>MediaStreamTrack</a></code> objects. All tracks in a <code>
+      <a>MediaStream</a></code> are intended to be synchronized when
+      rendered. Different <code><a>MediaStream</a></code> objects do not need
+      to be synchronized.</p>
 
 
       <p class="note">While the intent is to synchronize tracks, it could be
@@ -322,44 +328,24 @@
       playback and the effect that these have on user perception.</p>
 
 
-      <p>Each track in a MediaStream object has a corresponding
-      <code><a>MediaStreamTrack</a></code> object.</p>
-
-
       <p>A <code><a>MediaStreamTrack</a></code> represents content comprising
       one or more channels, where the channels have a defined well known
-      relationship to each other (such as a stereo or 5.1 audio signal).</p>
+      relationship to each other (such as a stereo or 5.1 audio signal). A
+      channel is the smallest unit considered in this API specification.</p>
 
 
-      <p>A channel is the smallest unit considered in this API
-      specification.</p>
+      <p><img alt="A MediaStream" src="images/media-stream.png" width="418" />
+      </p>
 
 
-      <p>A <code><a>MediaStream</a></code> object has an input and an output.
-      The input depends on how the object was created: a
-      <code><a>MediaStream</a></code> object generated by a <code><a href=
-      "#dom-navigator-getusermedia">getUserMedia()</a></code> call (which is
-      described later in this document), for instance, might take its input
-      from the user's local camera. The output of the object controls how the
-      object is used, e.g., what is saved if the object is written to a file or
-      what is displayed if the object is used in a <code>video</code>
+      <p>A <code><a>MediaStream</a></code> object has an input and an output
+      that represent the combined input and output of all the object's
+      tracks. The output of the <code><a>MediaStream</a></code> controls how
+      the object is rendered, e.g., what is saved if the object is recorded to
+      a file or what is displayed if the object is used in a <code>video</code>
       element.</p>
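
For illustration, a minimal sketch of using a MediaStream as the source of a
video element; srcObject is the mechanism in current user agents and the
createObjectURL fallback reflects older ones (both are assumptions about the
runtime, not requirements of this draft):

    // Sketch: render a MediaStream in a <video> element.
    function renderStream(stream) {
      var video = document.querySelector('video');
      if ('srcObject' in video) {
        video.srcObject = stream;                // current user agents
      } else {
        video.src = URL.createObjectURL(stream); // older user agents
      }
      video.play();
    }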
 
 
-      <p>Each track in a <code><a>MediaStream</a></code> object can be
-      disabled, meaning that it is muted in the object's output. All tracks are
-      initially enabled.</p>
-
-
-      <p>A <code><a>MediaStream</a></code> can be <dfn><a>finished</a></dfn>,
-      indicating that its inputs have forever stopped providing data.</p>
-
-
-      <p>The output of a <code><a>MediaStream</a></code> object MUST correspond
-      to the tracks in its input. Muted audio tracks MUST be replaced with
-      silence. Muted video tracks MUST be replaced with blackness.</p>
-
-
       <p>A new <code><a>MediaStream</a></code> object can be created from
       accessible media sources (that does not require any additional
       permissions) using the <code><a href=
@@ -371,10 +357,6 @@
       possible to compose a stream from different source streams.</p>
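
A small sketch of composing a stream from different source streams, assuming a
MediaStream constructor that accepts a sequence of MediaStreamTrack objects, as
in current user agents:

    // Sketch: build a new MediaStream from tracks taken from two other streams.
    function mixStreams(cameraStream, micStream) {
      return new MediaStream([
        cameraStream.getVideoTracks()[0], // video from the first stream
        micStream.getAudioTracks()[0]     // audio from the second stream
      ]);
    }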
 
 
-      <p><img alt="A MediaStream" src="images/media-stream.png" width="418" />
-      </p>
-
-
       <p>Both <code><a>MediaStream</a></code> and
       <code><a>MediaStreamTrack</a></code> objects can be cloned. This allows
       for greater control since the separate instances can be manipulated and
@@ -477,13 +459,6 @@
       </ol>
 
 
-      <p>A <code><a>MediaStream</a></code> can have multiple audio and video
-      sources (e.g. because the user has multiple microphones, or because the
-      real source of the stream is a media resource with many media tracks).
-      The stream represented by a <code><a>MediaStream</a></code> thus has zero
-      or more tracks.</p>
-
-
       <p>The tracks of a <code><a>MediaStream</a></code> are stored in a
       <dfn id="track-set">track set</dfn>. The track set MUST contain the
       <code><a>MediaStreamTrack</a></code> objects that correspond to the
@@ -625,6 +600,25 @@
         </dd>
 
 
+        <dt>sequence&lt;MediaStreamTrack&gt; getTracks()</dt>
+
+
+        <dd>
+          <p>Returns a sequence of <code><a>MediaStreamTrack</a></code> objects
+          representing all the tracks in this stream.</p>
+
+
+          <p>The <dfn id=
+          "dom-mediastream-gettracks"><code>getTracks()</code></dfn>
+          method MUST return a sequence that represents a snapshot of all the
+          <code><a>MediaStreamTrack</a></code> objects in this stream's
+          <a href="#track-set">track set</a>, regardless of kind. The
+          conversion from the <a href="#track-set">track set</a> to the
+          sequence is user agent defined and the order does not have to be stable
+          between calls.</p>
+        </dd>
+
+
         <dt>MediaStreamTrack? getTrackById(DOMString trackId)</dt>
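
As an illustration of the getTracks() accessor added in this hunk (and of
getTrackById() above), a minimal sketch; the promise-based
navigator.mediaDevices.getUserMedia call is an assumption and may not match the
exact entry point defined in this draft:

    // Sketch: enumerate the tracks of a captured stream.
    navigator.mediaDevices.getUserMedia({ audio: true, video: true })
      .then(function (stream) {
        // getTracks() returns a snapshot of the track set; the order is
        // user-agent defined and need not be stable between calls.
        stream.getTracks().forEach(function (track) {
          console.log(track.kind + ': ' + track.label);
        });
        // getTrackById() looks up a single track by its id (or returns null).
        var first = stream.getTracks()[0];
        console.log(stream.getTrackById(first.id) === first); // true
      })
      .catch(function (err) {
        console.error('getUserMedia failed: ' + err.name);
      });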
 
 
@@ -799,10 +793,18 @@
       source in the user agent. Several <code><a>MediaStreamTrack</a></code>
       objects can represent the same media source, e.g., when the user chooses
       the same camera in the UI shown by two consecutive calls to
-      <code><a href="#dom-navigator-getusermedia">getUserMedia()</a></code>
+      <code><a href="#dom-mediadevices-getusermedia">getUserMedia()</a></code>
       .</p>
 
 
+      <p>The data from a
+      <code><a>MediaStreamTrack</a></code> object does not necessarily have a
+      canonical binary form; for example, it could just be "the video currently
+      coming from the user's video camera". This allows user agents to
+      manipulate media in whatever fashion is most suitable on the user's
+      platform.</p>
+
+
       <p>A script can indicate that a track no longer needs its source with the
       <code><a href=
       "#dom-mediastreamtrack-stop">MediaStreamTrack.stop()</a></code> method.
@@ -820,85 +822,62 @@
       <section>
         <h3>Life-cycle and Media Flow</h3>
 
-
+        <h4>Life-cycle</h4>
         <p>A <code><a>MediaStreamTrack</a></code> has three stages in its
-        lifecycle; <code>new</code>, <code>live</code> and <code>ended</code>.
-        A track begins as <code>new</code> prior to being connected to an
-        active source.</p>
-
+        life-cycle: <code>new</code>, <code>live</code> and <code>ended</code>.
+        A track begins as <code>new</code> prior to being connected to a
+        source. The current stage is reflected by the object's <code><a href=
+        "#dom-mediastreamtrack-readystate">readyState</a></code> attribute.</p>
+
+        <p class="note">This document describres no way to create a
+        <code><a>MediaStreamTrack</a></code> that is _not_ connected to
+        a source. <code>new</code> allows for future extensions. </p>
 
         <p>Once connected, the <code><a href=
         "#event-mediastreamtrack-started">started</a></code> event fires and
         the track becomes <code>live</code>. In the <code>live</code> state,
-        the track is active and media is available for rendering at a
-        <code><a>MediaStream</a></code> <a>consumer</a>.</p>
-
-
-        <p>A muted or disabled <code><a>MediaStreamTrack</a></code> renders
-        either silence (audio), black frames (video), or a
-        zero-information-content equivalent. For example, a video element
-        sourced by a muted or disabled <code><a>MediaStreamTrack</a></code>
-        (contained within a <code><a>MediaStream</a></code>), is playing but
-        the rendered content is the muted output.</p>
-
-
-        <p>The muted/unmuted state of a track reflects if the source provides
-        any media at this moment. The enabled/disabled state is under
-        application control and determines if the track outputs media (to its
-        consumers). Hence, media from the source only flows when a
-        <code><a>MediaStreamTrack</a></code> object is both unmuted and
-        enabled.</p>
-
-
-        <p>A <code><a>MediaStreamTrack</a></code> is <dfn id=
-        "track-muted">muted</dfn> when the source is temporarily unable to
-        provide the track with data. A track can be muted by a user. Often this
-        action is outside the control of the application. This could be as a
-        result of the user hitting a hardware switch, or toggling a control in
-        the operating system or browser chrome. A track can also be muted by
-        the user agent.</p>
-
-
-        <p>Applications are able to <dfn id="track-enabled">enable</dfn> or
-        disable a <code><a>MediaStreamTrack</a></code> to prevent it from
-        rendering media from the source. A muted track will however, regardless
-        of the enabled state, render silence and blackness. A disabled track is
-        logically equivalent to a muted track, from a consumer point of
-        view.</p>
-
-
-        <p>For a newly created <code><a>MediaStreamTrack</a></code> object, the
-        following applies. The track is always enabled unless stated otherwise
-        (for example when cloned) and the muted state reflects the state of the
-        source at the time the track is created.</p>
-
+        the track is active and media is available for use by consumers (but
+        may be replaced by zero-information-content if the
+        <code><a>MediaStreamTrack</a></code> is <a href=
+        "#track-muted">muted</a> or <a href="#track-enabled">enabled</a>, see
+        below).</p>
 
         <p>A <code><a>MediaStreamTrack</a></code> object is said to
         <em>end</em> when the source of the track is disconnected or
         exhausted.</p>
 
 
+        <p>A <code><a>MediaStreamTrack</a></code> can be <dfn id=
+        "track-detached">detached</dfn> from its source. It means that the
+        track is no longer dependent on the source for media data. If no other
+        <code><a>MediaStreamTrack</a></code> is using the same source, the
+        source will be <a href="#source-stopped">stopped</a>. <code>
+        <a>MediaStreamTrack</a></code> attributes such as <code><a href=
+        "#dom-mediastreamtrack-kind">kind</a></code> and <code><a href=
+        "#dom-mediastreamtrack-label">label</a></code> MUST not change values
+        when the source is detached.</p>
+
+
         <p>When a <code><a>MediaStreamTrack</a></code> object ends for any
         reason (e.g., because the user rescinds the permission for the page to
         use the local camera, or because the data comes from a finite file and
         the file's end has been reached and the user has not requested that it
-        be looped, or because the UA has instructed the track to end for any
-        reason, it is said to be ended. When track instance <var>track</var>
+        be looped, or because the application invoked the <code><a href=
+        "#dom-mediastreamtrack-stop">stop()</a></code> method on
+        the <code><a>MediaStreamTrack</a></code> object, or because the UA has
+        instructed the track to end for any reason) it is said to be ended.</p>
+
+        <p>When a <code><a>MediaStreamTrack</a></code> <var>track</var>
         ends for any reason other than the <code><a href=
-        "#dom-mediastreamtrack-stop">stop()</a></code> method being invoked on
-        the <code><a>MediaStreamTrack</a></code> object that represents
-        <var>track</var>, the user agent MUST queue a task that runs the
-        following steps:</p>
+        "#dom-mediastreamtrack-stop">stop()</a></code> method being invoked,
+        the user agent MUST queue a task that runs the following steps:</p>
 
 
         <ol>
           <li>
-            <p>If the track's <code><a href=
+            <p>If the <var>track's</var> <code><a href=
             "#dom-mediastreamtrack-readystate">readyState</a></code> attribute
-            has the value <code>ended</code> already, then abort these steps.
-            (The <code><a href="#dom-mediastreamtrack-stop">stop()</a></code>
-            method was probably called just before the track stopped for other
-            reasons.)</p>
+            has the value <code>ended</code> already, then abort these steps.</p>
           </li>
 
 
@@ -909,12 +888,8 @@
           </li>
 
           <li>
-            <p>Detach <var>track's</var> source.</p>
-
-
-            <p>If no other <code><a>MediaStreamTrack</a></code> is using
-            the same source, the source will be <a href=
-            "#source-stopped">stopped</a>.</p>
+            <p><a href="#track-detached">Detach</a> <var>track's</var>
+            source.</p>
           </li>
 
           <li>
@@ -927,6 +902,51 @@
         <p>If the end of the stream was reached due to a user request, the
         event source for this event is the user interaction event source.</p>
 
+
+        <h4>Media Flow</h4>
+
+        <p>There are two concepts related to the media flow for a
+        <code>live</code> <code><a>MediaStreamTrack</a></code>: muted or not,
+        and enabled or disabled.</p>
+
+        <p><dfn id="track-muted">Muted</dfn> refers to the input to the
+        <code><a>MediaStreamTrack</a></code>.
+        If live samples are not made available to the <code>
+        <a>MediaStreamTrack</a></code>, it is muted.</p>
+
+        <p>Muted is outside the control of the application, but can be
+        observed by the application by reading the <code>
+        <a href="#dom-mediastreamtrack-muted">muted</a></code>
+        attribute and listening to the associated events <code>
+        <a href="#event-mediastreamtrack-mute">mute</a></code> and <code>
+        <a href="#event-mediastreamtrack-unmute">unmute</a></code>. There can
+        be several reasons for a <code><a>MediaStreamTrack</a></code> to be
+        muted: the user pushing a physical mute button on the microphone, the
+        user toggling a control in the operating system, the user clicking a
+        mute button in the browser chrome, the UA muting the track on behalf
+        of the user, etc.</p>
+
+        <p><dfn id="track-enabled">Enabled/disabled</dfn> on the other hand is
+        available to application to control (and observe) via the <code>
+        <a href="#dom-mediastreamtrack-enabled">enabled</a></code> attribute.</p>
+
+        <p>The result for the consumer is the same, in the sense that
+        whenever a <code><a>MediaStreamTrack</a></code> is muted or disabled
+        (or both), the consumer gets zero-information-content, which means
+        silence for audio and black frames for video. In other words, media
+        from the source only flows when a <code><a>MediaStreamTrack</a></code>
+        object is both unmuted and enabled. For example, a video
+        element sourced by a muted or disabled <code>
+        <a>MediaStreamTrack</a></code> (contained in a <code>
+        <a>MediaStream</a></code>) is playing but rendering blackness.</p>
+
+        <p>For a newly created <code><a>MediaStreamTrack</a></code> object, the
+        following applies: the track is always enabled unless stated otherwise
+        (for example when cloned) and the muted state reflects the state of the
+        source at the time the track is created.</p>
+
+
+
       </section>
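
The life-cycle and media-flow model introduced above can be summarised in a
short sketch: enabled is under application control, while muted and the
mute/unmute events can only be observed (the onmute/onunmute/onended handler
attributes are assumed to be available as in the full interface definition):

    // Sketch: application-controlled 'enabled' vs. observable 'muted'.
    function watchTrack(track) {
      console.log(track.readyState); // "live" once the track has started

      track.enabled = false;         // consumers now render silence/blackness
      track.enabled = true;          // media flows again, unless the track is muted

      // Muting is outside application control but can be observed.
      track.onmute = function () { console.log('source stopped providing data'); };
      track.onunmute = function () { console.log('source provides data again'); };
      track.onended = function () { console.log('track ended'); };
    }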
 
 
@@ -939,7 +959,7 @@
 
         <p>Whether <code><a>Constraints</a></code> were provided at track

[1444 lines skipped]
--- /sources/public/2011/webrtc/editor/sources/getusermedia.js	2014/05/08 02:05:54	1.4
+++ /sources/public/2011/webrtc/editor/sources/getusermedia.js	2014/06/20 02:36:32	1.5
@@ -12,14 +12,14 @@
    // publishDate:  "2009-08-06",
 
    // new ability to override the copyright completely
-   overrideCopyright:  "<p class='copyright'>Initial Author of this Specification was Ian Hickson, Google Inc., with the following copyright statement:<br /> &#169; Copyright 2004-2011 Apple Computer, Inc., Mozilla Foundation, and Opera Software ASA. You are granted a license to use, reproduce and create derivative works of this document.<\/p> <p class='copyright'>All subsequent changes since 26 July 2011 done by the W3C WebRTC Working Group and the Device APIs Working Group are under the following <a href='http://www.w3.org/Consortium/Legal/ipr-notice#Copyright'>Copyright<\/a>:<br />&#169; 2011-2013 <a href='http://www.w3.org/'><acronym title='World Wide Web Consortium'>W3C<\/acronym><\/a><sup>&#174;<\/sup> (<a href='http://www.csail.mit.edu/'><acronym title='Massachusetts Institute of Technology'>MIT<\/acronym><\/a>, <a href='http://www.ercim.eu/'><acronym title='European Research Consortium for Informatics and Mathematics'>ERCIM<\/acronym><\/a>, <a href='http://www.keio.ac.jp/'>Keio<\/a>, <a href='http:/ev.buaa.edu.cn/'>Beihang<\/a>), All Rights Reserved. <a href='http://www.w3.org/Consortium/Legal/copyright-documents'>Document use<\/a>  rules apply.<\/p> <p class='copyright'>For the entire publication on the W3C site the <a href='http://www.w3.org/Consortium/Legal/ipr-notice#Legal_Disclaimer'>liability<\/a> and <a href='http://www.w3.org/Consortium/Legal/ipr-notice#W3C_Trademarks'>trademark<\/a> rules apply.<\/p>",
+   overrideCopyright:  "<p class='copyright'>Initial Author of this Specification was Ian Hickson, Google Inc., with the following copyright statement:<br /> &#169; Copyright 2004-2011 Apple Computer, Inc., Mozilla Foundation, and Opera Software ASA. You are granted a license to use, reproduce and create derivative works of this document.<\/p> <p class='copyright'>All subsequent changes since 26 July 2011 done by the W3C WebRTC Working Group and the Device APIs Working Group are under the following <a href='http://www.w3.org/Consortium/Legal/ipr-notice#Copyright'>Copyright<\/a>:<br />&#169; 2011-2014 <a href='http://www.w3.org/'><acronym title='World Wide Web Consortium'>W3C<\/acronym><\/a><sup>&#174;<\/sup> (<a href='http://www.csail.mit.edu/'><acronym title='Massachusetts Institute of Technology'>MIT<\/acronym><\/a>, <a href='http://www.ercim.eu/'><acronym title='European Research Consortium for Informatics and Mathematics'>ERCIM<\/acronym><\/a>, <a href='http://www.keio.ac.jp/'>Keio<\/a>, <a href='http:/ev.buaa.edu.cn/'>Beihang<\/a>), All Rights Reserved. <a href='http://www.w3.org/Consortium/Legal/copyright-documents'>Document use<\/a>  rules apply.<\/p> <p class='copyright'>For the entire publication on the W3C site the <a href='http://www.w3.org/Consortium/Legal/ipr-notice#Legal_Disclaimer'>liability<\/a> and <a href='http://www.w3.org/Consortium/Legal/ipr-notice#W3C_Trademarks'>trademark<\/a> rules apply.<\/p>",
 
    // if the specification's copyright date is a range of years, specify
    // the start date here:
    // copyrightStart: "2005",
 
    // if there is a previously published draft, uncomment this and set its YYYY-MM-DD
-   prevED: "http://dev.w3.org/2011/webrtc/editor/archives/20140321/getusermedia.html",
+   prevED: "http://dev.w3.org/2011/webrtc/editor/archives/20140507/getusermedia.html",
 
    // if there a publicly available Editor's Draft, this is the link
    edDraftURI:           "http://dev.w3.org/2011/webrtc/editor/getusermedia.html",

Received on Friday, 20 June 2014 02:36:35 UTC