
Re: Where to attach a DTMF API

From: Justin Uberti <juberti@google.com>
Date: Thu, 1 Dec 2011 01:32:24 -0500
Message-ID: <CAOJ7v-0YqcBHEOcSGd=SDk1=C_ubAQjZDSYk38DaHHgpA1wZJw@mail.gmail.com>
To: Harald Alvestrand <harald@alvestrand.no>
Cc: public-webrtc@w3.org
On Thu, Dec 1, 2011 at 1:00 AM, Harald Alvestrand <harald@alvestrand.no> wrote:

> On 11/30/2011 08:45 AM, Stefan Håkansson wrote:
>> On 11/30/2011 05:25 AM, Justin Uberti wrote:
>>>    Unlike Stefan, I think the API makes most sense if it's an API on a
>>>    MediaStreamTrack object. If it is an API on a PeerConnection, it
>>>    would have to be something like PeerConnection.DTMF(StreamID,
>>>    TrackID, "12345"), which seems somewhat bizarre. It could easily be
>>>    defined to generate failure if the other end of the MediaStreamTrack
>>>    is not a playout into a PeerConnection.
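[To make the contrast concrete, here is an illustrative sketch of the two API shapes under discussion. Everything here is hypothetical -- the method names, the stub constructors, and the error behavior are made up for illustration and do not come from any draft.]

```javascript
// Illustrative stubs only -- neither shape existed in any draft at this point.

// "Flat" shape: DTMF hangs off PeerConnection and must locate the audio
// track from stream/track identifiers before it can queue tones.
function makeFlatPeerConnection(tracksById) {
  return {
    sendDTMF(streamId, trackId, digits) {
      const track = tracksById[streamId + "/" + trackId];
      if (!track || track.kind !== "audio") {
        throw new Error("no audio track at " + streamId + "/" + trackId);
      }
      return "queued " + digits + " on " + trackId;
    },
  };
}

// Object-oriented shape: the audio track carries its own sendDTMF method,
// so the caller never deals in stream/track identifiers at all.
function makeAudioTrack(id) {
  return {
    kind: "audio",
    id,
    sendDTMF(digits) {
      return "queued " + digits + " on " + id;
    },
  };
}
```

[With the flat shape, a call like `pc.sendDTMF("s1", "t2", "12345")` can only fail at runtime when the identifiers do not name an audio track; with the track shape, the method simply lives where the capability does.]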
>> I had a simpler, but error-prone, proposal which meant you would not have
>> to supply Stream or TrackID.
>>> This matches my viewpoint. We've created a nice object-oriented API, so
>>> I'd like to maintain that design as much as possible, even if means a
>>> bit more implementation work.
>> It seems that most people want to go this way (i.e. attach sendDTMF to a
>> MediaStreamTrack), so this is probably what we should do.
>> A follow up question: if that track is recorded or played out locally or
>> remotely, should there be any traces/signs of the DTMF inserted? You could
>> imagine that the actual tones are played/recorded, but you could equally
>> well imagine that there are no signs of the DTMF stuff when
>> recording/playing out.
> My feeling is that this relates to the question of "what do we record when
> we record".
> If we're recording the RTP packet stream, the DTMF should be there; if we
> record a WAV file, it has to be converted somehow; the converter may well
> be aware of DTMF and record them as sound.
>>> Followup question: should we define a specific AudioMediaStreamTrack
>>> that inherits from MediaStreamTrack, and only expose this DTMF API on
>>> AudioMediaStreamTracks?
>> Without thinking very much about it, defining separate
>> AudioMediaStreamTrack and VideoMediaStreamTrack makes sense to me.
>>> Or should we expose it from all tracks, and have
>>> it throw an exception on tracks that don't support DTMF? And how should
>>> apps know if DTMF is supported?
>>> My suggestion would be to introduce AudioMediaStreamTrack, and have the
>>> DTMF API fail if the other side doesn't support telephone-event. Support
>>> for telephone-event can be determined from parsing the incoming SDP
>>> (with ROAP), or the PeerConnection.remoteDescription method (with
>>> JSEP).
>> Sounds reasonable to me.
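[As a rough sketch of the detection Justin describes, assuming the app can get at the remote SDP string (e.g. via pc.remoteDescription.sdp under JSEP); the helper name and the sample SDP fragment are illustrative, not from any spec:]

```javascript
// Hypothetical helper: look for an RFC 4733 telephone-event rtpmap line
// in the remote SDP, e.g. "a=rtpmap:101 telephone-event/8000".
function remoteSupportsDTMF(sdp) {
  return /a=rtpmap:\d+\s+telephone-event\//i.test(sdp);
}

// A minimal SDP fragment whose audio m-section advertises telephone-event.
const sampleSdp = [
  "v=0",
  "m=audio 49170 RTP/AVP 0 101",
  "a=rtpmap:0 PCMU/8000",
  "a=rtpmap:101 telephone-event/8000",
].join("\r\n");
```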
> Javascript's standard mechanism for extensibility seems to be "if in
> doubt, call it and catch the failure". Defining an AudioMediaStreamTrack
> (or AudioMediaStreamTrackThatSupportsDTMF) seems like a logical way to
> describe this interface in IDL; what the implementations should do seems
> less clear to me.
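[The "call it and catch the failure" pattern Harald mentions looks roughly like this; the sendDTMF probe, and the assumption that calling it with an empty digit string is side-effect-free, are both hypothetical:]

```javascript
// "If in doubt, call it and catch the failure": probe a track by invoking
// the (hypothetical) sendDTMF method and treating any failure as unsupported.
function canSendDTMF(track) {
  try {
    track.sendDTMF(""); // empty digit string: probe without sending tones
    return true;
  } catch (e) {
    return false;
  }
}
```

[Whether an empty-string probe is actually harmless is itself an open question, which is part of why a runtime probe is an awkward basis for deciding what UI to show.]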

Sure, but the UI needs to know whether DTMF is supported before the DTMF
method is ever called, i.e. whether we should even display a DTMF dialpad in
the UI.
Received on Thursday, 1 December 2011 06:33:13 UTC
