Re: [whatwg] Proposal for HTML5: Motion sensing input device (Kinect, SoftKinetic, Asus Xtion)

From: Tab Atkins Jr. <jackalmage@gmail.com>
Date: Mon, 25 Jun 2012 13:22:46 -0700
Message-ID: <CAAWBYDBqp6F9QJ2GmC0rzPU8ZU-qS0rUiDGwoKYUeQmfgz2XKw@mail.gmail.com>
To: Jesús Ruiz García <jesusruiz2007@gmail.com>
Cc: whatwg@whatwg.org
On Mon, Jun 25, 2012 at 9:10 AM, Jesús Ruiz García
<jesusruiz2007@gmail.com> wrote:
> Let me start by saying that this message may be considered useless; I
> apologize for that.
>
> A few weeks ago I was in the #WHATWG chat and asked how to send an email
> to the list with a proposal for HTML5 (JavaScript).
> I have taken a few days before sending this email, because I have been
> investigating whether a similar project was already in production, and I
> found one.
>
> My proposal for HTML5 is to make it work with Kinect, SoftKinetic,
> Asus Xtion, and similar devices, so they can be used to interact with
> the web. Since Kinect is the most commonly used device, it would be the
> ideal target for this proposal.
>
> The Kinect patent must be owned by Microsoft. I understand that there
> have been discussions about patent issues in HTML5, so this aspect could
> possibly be some kind of problem.
> From my point of view, it would sell more devices of this type. These
> more powerful devices may even replace webcams in the future and come as
> standard on all computers.
>
> Also, users would gain an advance on the web. I do not mean just adding
> support for browsing the web with gestures, but more useful things.
> I have some functions in mind for the web that I see are not being
> developed:
>
> *- Online shopping or online retailing:* You want to buy clothes, but do
> not know what size you actually wear. Online stores could offer an
> option to run Kinect and scan your body to tell you the correct size for
> that article. We could even see whether that shirt looks good on you or
> not.
>
> *- Makeup/hair salon sites:* With face recognition, users could try out
> different makeup products on the market. These products would be tested
> virtually, and could then be purchased.
>
> *- Fitness/rehabilitation sites:* While this could be considered a video
> game, I see it more as an application. It would check whether the person
> is performing the exercise well, so as not to cause any injury, as well
> as the rhythm of the physical exercise and progress in their mobility.
>
> *- Possible support for Canvas:* Interact with Canvas via Kinect,
> although this can also be done with multitouch technology.
>
> There are many ideas, but these are four simple possibilities that
> occurred to me while I was writing this text.
>
> *MIT* is currently developing a JavaScript library called *DepthJS*,
> which allows any web page to interact with the Microsoft Kinect using
> JavaScript:
> https://github.com/doug/depthjs
> It has not been updated for a few months, and so far it only allows web
> browsing via gestures. I suppose that, for the moment, it does not
> perform a body scan to display on screen in the browser.
>
> According to some reports, Microsoft is also developing a version of
> Internet Explorer for the Xbox 360 with Kinect support.
>
> Well, with this information, you can get a picture of my proposal.
>
> As I said at the beginning of this text, I apologize if this proposal is
> absurd or does not fit the HTML5 philosophy, and if it would be better to
> have a separate library like *DepthJS* for this.
>
> I hope to know your opinion and read your comments.

The ability to capture sound and video from the user's devices and
manipulate it in the page is already exposed by the getUserMedia
function.  Theoretically, a Kinect could provide this information.
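As a minimal sketch of the getUserMedia approach described above (using the
later promise-based form of the API; the `buildConstraints` helper and the
`preview` video element are assumptions for illustration, and a Kinect would
only appear here if the platform exposes it as an ordinary video source):

```javascript
// Build the constraints object passed to getUserMedia.
// Kept as a separate helper purely for illustration.
function buildConstraints(wantAudio) {
  return { video: true, audio: Boolean(wantAudio) };
}

// Request the user's camera and attach the resulting stream
// to a <video id="preview"> element in the page.
async function startPreview() {
  const stream = await navigator.mediaDevices.getUserMedia(
    buildConstraints(false)
  );
  const video = document.getElementById('preview');
  video.srcObject = stream;
  await video.play();
}
```

The page only ever sees an ordinary video stream here; depth data, skeleton
tracking, and the like are exactly the parts this API does not cover.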

More advanced functionality like Kinect's depth information probably
needs more study and experience before we start thinking about adding
it to the language itself.

~TJ
Received on Monday, 25 June 2012 20:23:41 UTC