Re: What would you like to see in WebRTC next? A low-level API?

The metadata idea would be nice!

Best Regards,
Toshiya Nakakura

 >RTP synchronizes well with RTP.  Jus' sayin'.
 >
 >On Thu, Feb 1, 2018 at 9:25 AM, Sergio Garcia Murillo
 ><sergio.garcia.murillo@gmail.com> wrote:
 >> +1
 >>
 >> Although I would prefer an option to attach small metadata to a video
 >> frame at the RTP level, which would trigger an event on the
 >> corresponding MediaStreamTrack.
 >>
 >> That metadata could either be the final data needed by the app, or a
 >> key for synchronizing data sent/received via DataChannel.
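 >>
 >> Purely as an illustration (none of this exists today; attachFrameMetadata
 >> and onframemetadata are invented names, and pc / stream /
 >> lookupTactileSample are assumed to be defined by the app), the shape
 >> could be something like:
 >>
 >>   // Sender: tag the next encoded frame with a small payload.
 >>   const [track] = stream.getVideoTracks();
 >>   const sender = pc.addTrack(track, stream);
 >>   sender.attachFrameMetadata(new TextEncoder().encode("grip#42")); // invented
 >>
 >>   // Receiver: the metadata surfaces as an event on the remote track,
 >>   // fired when the corresponding frame is about to be rendered.
 >>   pc.ontrack = ({ track }) => {
 >>     track.onframemetadata = (e) => {          // invented event
 >>       const key = new TextDecoder().decode(e.data);
 >>       lookupTactileSample(key);               // app-defined: match against
 >>     };                                        // data received via DataChannel
 >>   };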
 >>
 >> Best regards
 >> Sergio
 >>
 >>
 >> On 31/01/2018 1:12, Toshiya Nakakura wrote:
 >>>
 >>> Hi All,
 >>>
 >>> I'm sorry, I may be starting a new thread because I haven't received
 >>> the original mail.
 >>> I'm a Ph.D. student researching immersive-style telepresence.
 >>> This video should give you a rough idea of my research:
 >>> <https://www.youtube.com/watch?v=KoC1iTOmYTg&t=38s>
 >>>
 >>> Recently, I have been trying to implement this system with web
 >>> technology.
 >>> I started with WebRTC and WebVR, but I faced several problems.
 >>> I think that sharing this edge case and its issues will contribute to
 >>> the discussion of WebRTC Next.
 >>> This mail also includes parts which should be discussed in the IETF.
 >>>
 >>> The major one is synchronization between the different kinds of
 >>> sensory information.
 >>> The current WebRTC MediaStream supports only video and audio,
 >>> so users have to send tactile information over a DataChannel.
 >>> A MediaStream synchronizes its own tracks,
 >>> but there is no option to synchronize a MediaStream with a
 >>> DataChannel.
 >>> So I cannot synchronize vision and touch,
 >>> which produces a very weird sensation when the robot grabs something.
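 >>>
 >>> (For reference, the closest workaround today looks roughly like the
 >>> sketch below. The delay constant is pure guesswork, since the app has
 >>> no way to learn when the displayed frame was captured, and the two
 >>> peers do not even share a clock. applyForce() is a stand-in for the
 >>> actuator interface; everything else uses existing APIs.)
 >>>
 >>>   // Operator side: send each tactile sample with its local capture time.
 >>>   const pc = new RTCPeerConnection();
 >>>   const haptics = pc.createDataChannel("haptics", { ordered: true });
 >>>   function sendTactile(pressure) {
 >>>     haptics.send(JSON.stringify({ t: performance.now(), pressure }));
 >>>   }
 >>>
 >>>   // Viewer side: buffer samples and pick one per rendered video frame.
 >>>   const buffer = [];
 >>>   haptics.onmessage = (e) => buffer.push(JSON.parse(e.data));
 >>>   const GUESSED_MEDIA_DELAY_MS = 150;  // unknowable: this is the problem
 >>>   function onFrame() {
 >>>     const target = performance.now() - GUESSED_MEDIA_DELAY_MS;
 >>>     while (buffer.length > 1 && buffer[1].t <= target) buffer.shift();
 >>>     if (buffer.length) applyForce(buffer[0].pressure);  // stand-in
 >>>     requestAnimationFrame(onFrame);
 >>>   }
 >>>   requestAnimationFrame(onFrame);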
 >>>
 >>> I hope for an API to support this edge use case, along the lines of
 >>> the following plans.
 >>>
 >>> Plan A. MediaStream support for other kinds of sensory information
 >>> Plan B. An API to synchronize data and media
 >>> Plan C. An API for observing MediaStream events, such as retrieving a
 >>> video image from a ring buffer, so that the application can obtain the
 >>> information it needs to implement its own synchronization engine
 >>> (a rough sketch follows below)
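 >>>
 >>> As a very rough sketch of Plan C (every name here is invented;
 >>> onframerendered, captureTimestamp, and getFrameBuffer do not exist,
 >>> and lookupTactileSample / applyForce are app-defined stand-ins):
 >>>
 >>>   // Hypothetical: the remote track reports per-frame timing and keeps a
 >>>   // small ring buffer of recent frames, so the app can build its own
 >>>   // synchronization engine against data received via DataChannel.
 >>>   const [track] = remoteStream.getVideoTracks();
 >>>   track.onframerendered = (e) => {                    // invented event
 >>>     // e.captureTimestamp: capture time of the frame just rendered
 >>>     const tactile = lookupTactileSample(e.captureTimestamp);
 >>>     if (tactile) applyForce(tactile.pressure);
 >>>   };
 >>>   // Reach back a few frames if the tactile data arrives late:
 >>>   const recentFrames = track.getFrameBuffer();        // invented ring buffer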
 >>>
 >>> I would appreciate it if you took an interest in my use case.
 >>>
 >>> Regards,
 >>> Toshiya Nakakura
 >>>
 >>>
 >>>
 >>
 >>

Received on Thursday, 1 February 2018 02:01:36 UTC