
Re: What would you like to see in WebRTC next? A low-level API?

From: Toshiya Nakakura <t.nakakura@kmd.keio.ac.jp>
Date: Wed, 31 Jan 2018 09:12:58 +0900
To: public-webrtc@w3.org
Message-ID: <bc96d5ed-e63c-030b-94af-c95d8e10a93d@kmd.keio.ac.jp>
Hi All,

I'm sorry if this opens a new thread; I haven't received the
original mail.
I'm a Ph.D. student researching immersive telepresence.
This video gives a rough idea of my research:
<https://www.youtube.com/watch?v=KoC1iTOmYTg&t=38s>

Recently, I have been trying to implement this system with Web technology.
I started with WebRTC and WebVR, but I ran into several problems.
I think that sharing these edge cases and issues will contribute
to the discussion of WebRTC Next.
This mail also includes a part which should be discussed in the IETF.

The major one is synchronization between different kinds of sensory information.
The current WebRTC MediaStream supports only video and audio,
so users have to send tactile information over a DataChannel.
A MediaStream synchronizes its own tracks,
but there is no option to synchronize a MediaStream with a
DataChannel,
so I cannot synchronize vision with touch.
This produces a very strange sensation when the robot grabs something.
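To make the gap concrete: because a DataChannel has no media clock, tactile samples can only carry an application-level timestamp that the receiver must interpret itself. A minimal sketch of such a message (the field names and the `pressure` payload are my own illustrative assumptions, not part of any spec):

```typescript
// Illustrative only: a tactile sample sent over an RTCDataChannel has
// to carry its own sender-side timestamp, since the channel provides
// no synchronization with MediaStream tracks.

interface TactileSample {
  timestampMs: number; // sender clock, e.g. performance.now()
  pressure: number;    // hypothetical normalized grip pressure, 0..1
}

// Serialize a sample for RTCDataChannel.send().
function encodeSample(s: TactileSample): string {
  return JSON.stringify(s);
}

// Parse a sample received in a DataChannel "message" event.
function decodeSample(msg: string): TactileSample {
  return JSON.parse(msg) as TactileSample;
}
```

The receiver then has a timestamp for each tactile sample, but no standard way to relate that clock to the presentation time of the video frames, which is exactly the missing piece.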

I hope for an API to support this edge use case, along the lines of the
following plans:

Plan A. Extend MediaStream to carry other kinds of sensory information
Plan B. An API to synchronize data and media
Plan C. An API for observing MediaStream events, such as
retrieving video images from a ring buffer, so that users can obtain
the information needed to implement their own synchronization engine
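To illustrate what Plan C would enable, here is a rough sketch of a user-implemented synchronization engine: a small ring buffer of recently observed frame timestamps, against which each incoming tactile sample is matched. The `FrameMeta` and `SyncBuffer` names are my own, purely hypothetical, and this assumes the browser exposed per-frame timestamps in the first place:

```typescript
// Hypothetical Plan C sketch: buffer the most recent frame timestamps
// and pair each tactile sample with the nearest buffered frame.

interface FrameMeta {
  timestampMs: number; // capture/presentation time of the frame
  frameId: number;
}

class SyncBuffer {
  private frames: FrameMeta[] = [];
  constructor(private capacity: number) {}

  // Keep only the most recent `capacity` frames (ring-buffer behavior).
  push(frame: FrameMeta): void {
    this.frames.push(frame);
    if (this.frames.length > this.capacity) this.frames.shift();
  }

  // Return the buffered frame whose timestamp is closest to the
  // tactile sample's timestamp, or null if the buffer is empty.
  nearest(timestampMs: number): FrameMeta | null {
    let best: FrameMeta | null = null;
    let bestDelta = Infinity;
    for (const f of this.frames) {
      const delta = Math.abs(f.timestampMs - timestampMs);
      if (delta < bestDelta) {
        bestDelta = delta;
        best = f;
      }
    }
    return best;
  }
}
```

With something like this, the application could delay rendering either the frame or the tactile feedback until the matched pair is ready, which is what Plans A and B would instead provide natively.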

I would appreciate it if you took an interest in my use case.

Regards,
Toshiya Nakakura
Received on Wednesday, 31 January 2018 13:59:54 UTC
