- From: Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com>
- Date: Thu, 1 Feb 2018 12:41:20 +0100
- To: Lennart Grahl <lennart.grahl@gmail.com>, public-webrtc@w3.org
- Cc: Martin Thomson <martin.thomson@gmail.com>, Toshiya Nakakura <t.nakakura@kmd.keio.ac.jp>
AFAIK the problem is that the RTP timestamp is not known until after
the video frame is encoded, very deep in the video processing
pipeline, so it is not possible to retrieve it in JS at sending time.

Best regards
Sergio

On 01/02/2018 11:50, Lennart Grahl wrote:
> I really am not an expert when it comes to A/V, but there's a
> timestamp field in RTP... if the API allowed retrieving that value
> (.currentTimestamp) when sending and waiting for it when receiving
> (.waitForTimestamp(timestamp))... shouldn't that work?
>
> Cheers
> Lennart
>
>
> On 01.02.2018 08:22, Sergio Garcia Murillo wrote:
>> If you can send a metadata payload within RTP, you can sync with
>> external sources. Jus' sayin' ;)
>>
>> For example, if you create a JSON object with a UUID and send it
>> via datachannel, you could send that UUID in the RTP metadata.
>>
>> Then on the receiving side you could get an onmetadata event on the
>> MediaStreamTrack, retrieve the UUID sent via RTP, and sync it with
>> the data sent via datachannel.
>>
>> If the JSON is small, you could even send the full JSON in the RTP
>> metadata and not send it via datachannel at all.
>>
>> Best regards
>> Sergio
>>
>>
>> On 01/02/2018 0:04, Martin Thomson wrote:
>>> RTP synchronizes well with RTP. Jus' sayin'.
>>>
>>> On Thu, Feb 1, 2018 at 9:25 AM, Sergio Garcia Murillo
>>> <sergio.garcia.murillo@gmail.com> wrote:
>>>> +1
>>>>
>>>> Although I would prefer an option to attach small metadata to a
>>>> video frame at the RTP level, which would trigger an event on the
>>>> corresponding MediaStreamTrack.
>>>>
>>>> That metadata could either be the actual data needed by the app,
>>>> or a key for synchronizing the data sent/received via
>>>> datachannel.
>>>>
>>>> Best regards
>>>> Sergio
>>>>
>>>>
>>>> On 31/01/2018 1:12, Toshiya Nakakura wrote:
>>>>> Hi All,
>>>>>
>>>>> I'm sorry, I may be making a new thread because I haven't
>>>>> received the original mail.
>>>>> I'm a Ph.D. student researching immersive-style telepresence.
>>>>> I think this video gives a rough idea of what I research:
>>>>> <https://www.youtube.com/watch?v=KoC1iTOmYTg&t=38s>
>>>>>
>>>>> Recently, I have been trying to implement this system with web
>>>>> technology. I started with WebRTC and WebVR, but I faced several
>>>>> problems. I think that sharing this edge case and its issues
>>>>> will contribute to the discussion of WebRTC Next. This mail also
>>>>> includes parts which should be discussed in the IETF.
>>>>>
>>>>> The major one is synchronization of five-senses information.
>>>>> The current WebRTC MediaStream supports only video and audio,
>>>>> so users have to send tactile information over a DataChannel.
>>>>> A MediaStream synchronizes its tracks, but there isn't any
>>>>> option to synchronize between a MediaStream and a DataChannel.
>>>>> So I cannot implement synchronization between vision and touch,
>>>>> which creates a very weird sensation when the robot grabs
>>>>> something.
>>>>>
>>>>> I hope for an API to support this edge use case, along the lines
>>>>> of the following plans:
>>>>>
>>>>> Plan A. MediaStream support for other kinds of five-senses
>>>>> information
>>>>> Plan B. An API to synchronize data and media
>>>>> Plan C. An API for observing the events of a MediaStream, such
>>>>> as retrieving a video image from a ring buffer, to obtain the
>>>>> information necessary for the user to implement the
>>>>> synchronization engine themselves
>>>>>
>>>>> I would appreciate it if you took an interest in my use case.
>>>>>
>>>>> Regards,
>>>>> Toshiya Nakakura
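For concreteness, here is a rough sketch in JS of how Lennart's
timestamp idea could look. Every name in it is hypothetical:
.currentTimestamp and .waitForTimestamp() are only the API shapes
proposed in this thread, not anything that ships in WebRTC, and as
noted at the top of this mail, the RTP timestamp is not actually
available to JS at sending time, which is the main obstacle to this
variant.

    // Assumed context: videoSender is an RTCRtpSender, dataChannel an
    // RTCDataChannel, videoTrack the receiving MediaStreamTrack.
    // readTactileSensors() and applyTactileFeedback() stand in for
    // app-defined functions; nothing here is a real WebRTC API.

    // Sender: read the RTP timestamp of the outgoing frame and attach
    // it to the datachannel message carrying the tactile sample.
    const ts = videoSender.currentTimestamp;        // hypothetical
    dataChannel.send(JSON.stringify({ ts, grip: readTactileSensors() }));

    // Receiver: hold each message until playout reaches its
    // timestamp, then apply it in sync with the matching frame.
    dataChannel.onmessage = async (event) => {
      const msg = JSON.parse(event.data);
      await videoTrack.waitForTimestamp(msg.ts);    // hypothetical
      applyTactileFeedback(msg.grip);
    };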
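And a similar sketch of the UUID-in-RTP-metadata scheme, again purely
illustrative: sendMetadata() and the onmetadata event are the
hypothetical per-frame metadata hooks proposed above, and the UUID
only serves as a join key between the RTP stream and the datachannel.

    // Sender: tag the current video frame with a UUID at the RTP
    // level (hypothetical sendMetadata()), and ship the full payload
    // over the datachannel under the same UUID. generateUuid() is an
    // app-provided helper.
    const uuid = generateUuid();
    videoSender.sendMetadata(uuid);                 // hypothetical
    dataChannel.send(JSON.stringify({ uuid, grip: readTactileSensors() }));

    // Receiver: buffer datachannel messages by UUID and apply each
    // one only when the frame carrying the same UUID arrives.
    const pending = new Map();
    dataChannel.onmessage = (event) => {
      const msg = JSON.parse(event.data);
      pending.set(msg.uuid, msg);
    };
    videoTrack.onmetadata = (event) => {            // hypothetical event
      const msg = pending.get(event.metadata);
      if (msg) {
        applyTactileFeedback(msg.grip);             // in sync with frame
        pending.delete(msg.uuid);
      }
    };

If the tactile payload is small enough, the datachannel leg disappears
entirely and the JSON itself rides in the frame metadata, as suggested
earlier in the thread.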
Received on Thursday, 1 February 2018 11:41:47 UTC