Re: Where to attach a DTMF API

On Thu, Dec 1, 2011 at 2:33 AM, Harald Alvestrand <harald@alvestrand.no> wrote:

>
> On 12/01/2011 07:32 AM, Justin Uberti wrote:
>
>
>
> On Thu, Dec 1, 2011 at 1:00 AM, Harald Alvestrand <harald@alvestrand.no> wrote:
>
>> On 11/30/2011 08:45 AM, Stefan Håkansson wrote:
>>
>>> On 11/30/2011 05:25 AM, Justin Uberti wrote:
>>>
>>>>    Unlike Stefan, I think the API makes most sense if it's an API on a
>>>>    MediaStreamTrack object. If it is an API on a PeerConnection, it
>>>>    would have to be something like PeerConnection.DTMF(StreamID,
>>>>    TrackID, "12345"), which seems somewhat bizarre. It could easily be
>>>>    defined to generate failure if the other end of the MediaStreamTrack
>>>>    is not a playout into a PeerConnection.
>>>>
>>> I had a simpler, but error-prone, proposal which meant you would not
>>> have to supply a Stream or TrackID.
>>>
>>>>
>>>>
>>>> This matches my viewpoint. We've created a nice object-oriented API, so
>>>> I'd like to maintain that design as much as possible, even if it means a
>>>> bit more implementation work.
>>>>
>>> It seems that most people want to go this way (i.e. attach sendDTMF to a
>>> MediaStreamTrack), so this is probably what we should do.
>>>
>>> A follow up question: if that track is recorded or played out locally or
>>> remotely, should there be any traces/signs of the DTMF inserted? You could
>>> imagine that the actual tones are played/recorded, but you could equally
>>> well imagine that there are no signs of the DTMF stuff when
>>> recording/playing out.
>>>
>>  My feeling is that this relates to the question of "what do we record
>> when we record".
>> If we're recording the RTP packet stream, the DTMF should be there; if we
>> record a WAV file, it has to be converted somehow; the converter may well
>> be aware of DTMF and record them as sound.
>>
>>
>>>
>>>> Followup question: should we define a specific AudioMediaStreamTrack
>>>> that inherits from MediaStreamTrack, and only expose this DTMF API on
>>>> AudioMediaStreamTracks?
>>>>
>>> Without thinking very much about it, defining separate
>>> AudioMediaStreamTrack and VideoMediaStreamTrack makes sense to me.
>>>
>>>> Or should we expose it from all tracks, and have
>>>> it throw an exception on tracks that don't support DTMF? And how should
>>>> apps know if DTMF is supported?
>>>>
>>>> My suggestion would be to introduce AudioMediaStreamTrack, and have the
>>>> DTMF API fail if the other side doesn't support telephone-event. Support
>>>> for telephone-event can be determined from parsing the incoming SDP
>>>> (with ROAP), or the PeerConnection.remoteDescription method (with JSEP).
>>>>
>>> Sounds reasonable to me.
>>>
>>  JavaScript's standard mechanism for extensibility seems to be "if in
>> doubt, call it and catch the failure". Defining an AudioMediaStreamTrack
>> (or AudioMediaStreamTrackThatSupportsDTMF) seems like a logical way to
>> describe this interface in IDL; what the implementations should do seems
>> less clear to me.
>>
>
>  Sure, but the UI needs to know if DTMF is supported before we call the
> DTMF method, i.e. should we even display a DTMF dialpad in the UI.
>
>  I think that the JS
>
> if (!track.DTMF)
>    dialpad.disable()
>
> will do it (check if the track object has a DTMF property; functions are
> properties too).
> But the more common JS idiom seems to be
>
> try {
>    track.DTMF("");
> } catch (e) {
>    dialpad.disable();
> }
>
> Apologies for errors in JS syntax.
>
>
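For concreteness, the two detection idioms above could be written as follows. This is only a sketch: `sendDTMF` is a hypothetical method name, since the API surface has not been defined yet.

```javascript
// Sketch only: "sendDTMF" is a hypothetical method name for the
// not-yet-specified DTMF API on an audio track.

// Idiom 1: property check. Works before any call is made, so the UI
// can decide up front whether to show a dialpad at all.
function canSendDTMF(track) {
  return typeof track.sendDTMF === "function";
}

// Idiom 2: call-and-catch. Probe with an empty digit string and treat
// a thrown error as "not supported".
function probeDTMF(track) {
  try {
    track.sendDTMF("");
    return true;
  } catch (e) {
    return false;
  }
}
```

The property check has the advantage that it cannot have side effects; the call-and-catch probe additionally catches the case where the method exists but the far end rejects DTMF at call time.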
If I understand correctly, you are suggesting that we only expose this
function from an AudioMediaStreamTrack if the remote side supports DTMF.

I'm not opposed to this, but is there a way to express this in WebIDL?
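As an aside, if the remote SDP is available as a string (e.g. via PeerConnection.remoteDescription with JSEP, as suggested earlier in the thread), the telephone-event check itself could be a sketch like this:

```javascript
// Sketch: detect RFC 4733 telephone-event support in a remote SDP blob.
// Assumes the SDP is available as a plain string; with JSEP this would
// come from PeerConnection.remoteDescription.
function supportsTelephoneEvent(sdp) {
  // An rtpmap line such as "a=rtpmap:101 telephone-event/8000"
  // advertises DTMF support.
  return /^a=rtpmap:\d+\s+telephone-event\//m.test(sdp);
}
```

If the check fails, the application would disable its dialpad up front rather than letting a later DTMF call fail.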

Received on Thursday, 1 December 2011 18:16:43 UTC