- From: Harald Alvestrand <harald@alvestrand.no>
- Date: Thu, 01 Dec 2011 08:33:40 +0100
- To: Justin Uberti <juberti@google.com>
- CC: public-webrtc@w3.org
- Message-ID: <4ED72DD4.6080608@alvestrand.no>
On 12/01/2011 07:32 AM, Justin Uberti wrote:
>
>
> On Thu, Dec 1, 2011 at 1:00 AM, Harald Alvestrand <harald@alvestrand.no> wrote:
>
> On 11/30/2011 08:45 AM, Stefan Håkansson wrote:
>
> On 11/30/2011 05:25 AM, Justin Uberti wrote:
>
> Unlike Stefan, I think the API makes most sense if it's an API on a
> MediaStreamTrack object. If it is an API on a PeerConnection, it would
> have to be something like PeerConnection.DTMF(StreamID, TrackID, "12345"),
> which seems somewhat bizarre. It could easily be defined to generate
> failure if the other end of the MediaStreamTrack is not a playout into
> a PeerConnection.
>
> I had a simpler, but error-prone, proposal which meant you would not
> have to supply Stream or TrackID.
>
>
>
> This matches my viewpoint. We've created a nice object-oriented API, so
> I'd like to maintain that design as much as possible, even if it means
> a bit more implementation work.
>
> It seems that most people want to go this way (i.e. attach
> sendDTMF to a MediaStreamTrack), so this is probably what we
> should do.
>
> A follow-up question: if that track is recorded or played out locally
> or remotely, should there be any traces/signs of the inserted DTMF? You
> could imagine that the actual tones are played/recorded, but you could
> equally well imagine that there are no signs of the DTMF stuff when
> recording/playing out.
>
> My feeling is that this relates to the question of "what do we
> record when we record".
> If we're recording the RTP packet stream, the DTMF should be
> there; if we record a WAV file, it has to be converted somehow;
> the converter may well be aware of DTMF and record them as sound.
>
>
>
> Followup question: should we define a specific AudioMediaStreamTrack
> that inherits from MediaStreamTrack, and only expose this DTMF API on
> AudioMediaStreamTracks?
>
> Without thinking very much about it, defining separate
> AudioMediaStreamTrack and VideoMediaStreamTrack makes sense to me.
>
> Or should we expose it from all tracks, and have it throw an exception
> on tracks that don't support DTMF? And how should apps know if DTMF is
> supported?
>
> My suggestion would be to introduce AudioMediaStreamTrack, and have the
> DTMF API fail if the other side doesn't support telephone-event. Support
> for telephone-event can be determined from parsing the incoming SDP
> (with ROAP), or the PeerConnection.remoteDescription method (with JSEP).
>
> Sounds reasonable to me.
>
> JavaScript's standard mechanism for extensibility seems to be "if in
> doubt, call it and catch the failure". Defining an AudioMediaStreamTrack
> (or AudioMediaStreamTrackThatSupportsDTMF) seems like a logical way to
> describe this interface in IDL; what the implementations should do seems
> less clear to me.
>
>
> Sure, but the UI needs to know if DTMF is supported before we call the
> DTMF method, i.e. should we even display a DTMF dialpad in the UI.
>
I think that the JS

  if (!track.DTMF)
    dialpad.disable();

will do it (check if the track object has a DTMF property; functions are
properties too).
But the more common JS idiom seems to be

  try {
    track.DTMF("");
  } catch (e) {
    dialpad.disable();
  }

Apologies for errors in JS syntax.
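For illustration, a slightly fuller sketch combining the two checks
discussed above (feature detection on the track, plus looking for
telephone-event in the remote SDP). The names here, a DTMF method on the
track and a remoteDescription call on the PeerConnection returning the
remote SDP as a string, are placeholders for whatever the spec ends up
with, not agreed API:

  // Placeholder names: track.DTMF() and pc.remoteDescription() are
  // assumptions for illustration, not settled API.
  function dtmfSupported(pc, track) {
    // Feature detection on the track object; functions are properties too.
    if (typeof track.DTMF !== "function")
      return false;
    // With JSEP, check the remote SDP for telephone-event support.
    var sdp = pc.remoteDescription();
    return typeof sdp === "string" && sdp.indexOf("telephone-event") !== -1;
  }

  if (!dtmfSupported(pc, track))
    dialpad.disable();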
Received on Thursday, 1 December 2011 07:34:31 UTC