- From: Rob Manson <roBman@mob-labs.com>
- Date: Fri, 30 Jul 2010 01:17:06 +1000
- To: cperey@perey.com
- Cc: public-poiwg@w3.org
Hi all,

+1 for seeing this AR WG charter move forward.

> I wish I could express this mathematically, or using code.
>
> - an event or a trigger (as we described it earlier, the detection of a
> set of conditions in the user's environment via the use of sensors), when
>
> (a) it matches a trigger (the set of conditions specified by a
> publisher) in a database, and
> (b) the AR support (software and in some cases hardware) is
> available in the user's device,

To me this maps very closely to sensory perception at a digital level,
and I think that's exactly what AR represents.

> should return (send via an IP network if it is in the cloud, retrieve if
> it is stored locally) to the user's device that digital data with which
> the trigger was previously associated (by a publisher, in the broadest
> sense of the word).

Also "or access as a stream from a local sensor (e.g. camera or even a
VOIP connection)."

> The data would be "displayed" (visualized, but possibly also including an
> audio file playing) in accordance with
>
> (a) the user preferences,
> (b) conditions of use (e.g., did the user pay a subscription to the
> "history" database, the "tourism" database, the "commerce" database,
> or all three or more? this involves some rights management),
> (c) the user's device capabilities (can render/view the 3D object live,
> can only support 2D view, etc.),
> (d) the user's environment (time of day, noisy, quiet, light, dark, etc.).

Also "User preferences may be allowed to be overridden to artificially
adjust a user's context (e.g. exploring the past or other locations, etc.)."

> The details of the "display," or representation step, are the "user
> experience" and should (in my opinion) be defined entirely by the
> developer of the browser or application.

I think it's really important that both layers are modeled and discussed,
e.g. browser developer and application developer (as well as publisher
and author, of course). A rough sketch of the overall flow follows below.

> I believe we need to develop the standard(s) such that when a trigger is
> associated with a set of data (using the open AR Data format in a
> database) by a publisher, any device with AR support and with which the
> user has the rights to receive said publisher's data (see "b" above),
> can "display" the data in the real world context.

I think you should drop the "in the real world context" part. The
interaction between "real world" and "remote/online" is critical too and
should not be excluded.

> This would increase the content publisher's appetite for publishing once
> to many audiences/viewers under conditions specified by the publisher
> (e.g., show my UGC AR post only to my friends). I believe this is in the
> spirit of the W3C and the Web.

Strong +1

Great to see such a good discussion so far 8)

--

Rob Manson
Managing Director

MOB - start something!
The Mobile & Online Business innovation lab
http://mob-labs.com

l: http://www.linkedin.com/in/robertmanson
t: http://twitter.com/nambor
s: http://slideshare.net/robman

----------------------------------------------------------------------------------
"The Pervasive Experience" research project review is now online:
http://slideshare.net/robman/the-pervasive-experience-project-review-july-2010
----------------------------------------------------------------------------------
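
Since the quoted message wished the workflow could be expressed in code: the
following is only an illustrative sketch of the trigger-match / rights-check /
display flow described above. Every type and function name here is hypothetical
(nothing below comes from a proposed format or API); it is just one way the
separation between trigger matching and the browser/application-defined "display"
step could be modeled.

    // Hypothetical sketch only -- all names are invented for illustration.

    interface SensorConditions {            // what the device's sensors detected
      location?: { lat: number; lon: number };
      imageFeatures?: Float32Array;
      timeOfDay?: string;
    }

    interface Trigger {                     // conditions specified by a publisher
      contentId: string;
      matches(conditions: SensorConditions): boolean;
    }

    interface UserContext {
      preferences: { audio: boolean; maxDetail: "2d" | "3d" };
      subscriptions: Set<string>;           // e.g. "history", "tourism", "commerce"
      device: { canRender3d: boolean };
      environment: { noisy: boolean; dark: boolean };
    }

    interface ArContent {
      requiredSubscription?: string;        // rights management, condition (b)
      render(user: UserContext): void;      // the UX itself is left to the browser/app
    }

    // (a) + (b): a detected event matches a published trigger and AR support
    // is available; the associated data is fetched (from the cloud, a local
    // store, or a local sensor stream).
    function onSensorEvent(
      conditions: SensorConditions,
      triggers: Trigger[],
      fetchContent: (id: string) => Promise<ArContent>
    ): Promise<ArContent[]> {
      const matched = triggers.filter(t => t.matches(conditions));
      return Promise.all(matched.map(t => fetchContent(t.contentId)));
    }

    // (a)-(d): display according to preferences, rights, device capabilities
    // and environment; how it is rendered stays entirely up to the browser
    // or application developer.
    function display(contents: ArContent[], user: UserContext): void {
      for (const c of contents) {
        if (c.requiredSubscription &&
            !user.subscriptions.has(c.requiredSubscription)) {
          continue;                         // user has no rights to this publisher's data
        }
        c.render(user);
      }
    }

The point of sketching it this way is that trigger matching and the
rights/rendering decisions sit in separable layers, which lines up with
modeling both the browser-developer and application-developer layers.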
Received on Friday, 30 July 2010 19:45:42 UTC