Re: Physical web project

Miguel and Scott,

Thanks for sharing your thinking.

I would add the requirement to interact with the management of the
broadcasting devices (beacons) also at the operating level (changing
parameters, executing commands) via the proposed interface.
Is that in the plan?
If so, I expect quite a lot of modelling needs to be done, with different
classes of objects (devices), each with different properties, commands, etc.
Is that right?
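
To make the modelling I have in mind a bit more concrete, here is a very
rough Python sketch (the device classes, parameters and commands are all
invented, purely for illustration):

  from dataclasses import dataclass, field
  from typing import Callable, Dict

  @dataclass
  class DeviceClass:
      """One class of broadcasting devices, with its own parameters/commands."""
      name: str
      parameters: Dict[str, type] = field(default_factory=dict)
      commands: Dict[str, Callable] = field(default_factory=dict)

  # Two hypothetical device classes with different properties and commands.
  ble_uri_beacon = DeviceClass(
      name="BLE URI beacon",
      parameters={"broadcast_url": str, "tx_power_dbm": int, "interval_ms": int},
      commands={"set_url": lambda device, url: None,        # placeholder bodies
                "set_interval": lambda device, ms: None},
  )

  display_beacon = DeviceClass(
      name="Beacon with e-ink display",
      parameters={"broadcast_url": str, "display_text": str},
      commands={"set_url": lambda device, url: None,
                "refresh_display": lambda device: None},
  )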

On Tue, Oct 21, 2014 at 9:44 PM, Miguel <miguel.ceriani@gmail.com> wrote:

> Dear Scott and all,
> IMHO there is already something "semantic webby" in your approach.
>
> What I understood of your project is that a physical object broadcasts
> a URL through which some related information can be gathered.
> In a "Semantic Web of Physical Objects" view, that URL (or URI, for
> that matter) could also actually identify that physical object. That
> would allow gathering information related to that object from
> different sources and not just from that single URL (e.g. independent
> information on a product in a supermarket). Moreover, it would allow
> the user to produce information related to the object (e.g. using an
> annotation service).
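>
> To illustrate (a rough sketch only; the URIs are made up and the use of
> the rdflib library is just an assumption for the example), the same
> beacon URI could act as an identifier under which statements from
> independent sources get merged:
>
>   from rdflib import Graph, URIRef
>
>   # Hypothetical URI broadcast by a beacon, also used to identify the
>   # physical product itself.
>   product = URIRef("http://example.org/products/1234")
>
>   # Statements published by the shop (illustrative Turtle).
>   shop_data = """
>   @prefix schema: <http://schema.org/> .
>   <http://example.org/products/1234> schema:name "Organic milk, 1 l" .
>   """
>
>   # Statements contributed by users via an annotation service.
>   annotations = """
>   @prefix ex: <http://example.org/vocab/> .
>   <http://example.org/products/1234> ex:userComment "Great with coffee" .
>   """
>
>   g = Graph()
>   g.parse(data=shop_data, format="turtle")
>   g.parse(data=annotations, format="turtle")
>
>   # Everything known about the object, regardless of where it came from.
>   for _, predicate, obj in g.triples((product, None, None)):
>       print(predicate, obj)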
>
> I think this is a very good reason for us to keep in touch with your work.
> What I sketched is basically a possible "semantic interpretation" over
> the "Physical Web" idea, not something that would necessarily add any
> technical requirements to your project.
>
> To be concrete, there are potentially simple ways to make use of the
> semantics of physical objects.
> For example, if the URL broadcast by an object points to an HTML
> page, RDFa can be used to embed metadata in the HTML code.
> Some of that metadata could be gathered directly by the app and shown
> to the user in some way (an icon for the type/category of the object, a
> color for the time to expiration of a perishable item, ...).
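>
> A minimal sketch of the kind of RDFa mark-up such a page could embed
> (the vocabulary and the expiration property are illustrative choices,
> not a proposal):
>
>   <div vocab="http://schema.org/" typeof="Product"
>        resource="http://example.org/products/1234">
>     <span property="name">Organic milk, 1 l</span>
>     <!-- "bestBefore" is a made-up property, only for illustration -->
>     <meta property="bestBefore" content="2014-11-01" />
>   </div>
>
> An app could then pick out the type and the date to choose an icon and
> a color, without any change to the page as seen in a browser.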
>
> Best,
> Miguel Ceriani
>
> On Tue, Oct 21, 2014 at 4:13 PM, Scott Jenson <scott@jenson.org> wrote:
> > On Tue, Oct 21, 2014 at 2:13 AM, Paola Di Maio <paola.dimaio@gmail.com>
> > wrote:
> >>
> >> What would help us here is some idea of what your system looks like
> >> (in design terms), so that we could, in principle, include any
> >> requirements you may have in our work.
> >
> > Not sure I understand, but I'll give it a shot: our system is a series
> > of hardware beacons that are broadcasting URIs using the BLE advertising
> > packet. These URIs are expected to be URLs, but we are exploring other
> > encodings (e.g. URNs, though that is a bit more speculative). This
> > creates the 'senders' of our system. The 'receivers' (at this time) are
> > phones running an app. However, that is just for prototyping purposes.
> > We expect this to be built into the OS for most systems. The goal of
> > these receivers is to collect the nearby beacons, display them to the
> > user WHEN THEY ASK (no proactive beeping!), rank them in some way, and,
> > if the user taps on one, take them to that web page. The receivers, much
> > like browsers today, can vary quite a bit (and even be proprietary). We
> > don't expect to 'control' the receivers and hope there is a wide range
> > of experiments here. What we do need to standardize, however, is the
> > broadcasting packet, so everything sends out a URI the same way.
> >
> >>
> >>
> >> A question in return: is the Physical Web already thinking about
> >> what kind of interface it is going to have, and would you benefit from
> >> input from this community (bearing in mind that we are a collection of
> >> individuals with different views on things)?
> >
> > Of course, that is why we released early: to get hard questions and
> > experiments. The primary goal is simple: show beacons to the user as
> > simply and easily as possible. Our current prototype uses the Android
> > notification manager (with no sounds or vibrations), so seeing nearby
> > beacons is just two taps. We expect other platforms to try different
> > things.
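> >
> > As a very rough illustration of that receiver flow (not our actual
> > code; the scan callback and ranking are invented for the example),
> > the idea is: collect quietly, rank, and only show when asked:
> >
> >   # Nothing is surfaced proactively; the scanner just records beacons.
> >   nearby = {}   # uri -> strongest signal seen (RSSI in dBm)
> >
> >   def on_advertisement(uri, rssi):
> >       """Called by the BLE scanner for each advertising packet heard."""
> >       nearby[uri] = max(rssi, nearby.get(uri, -999))
> >
> >   def show_nearby():
> >       """Runs only when the user explicitly asks to see what is around."""
> >       for uri, rssi in sorted(nearby.items(), key=lambda kv: -kv[1]):
> >           print("%4d dBm  %s" % (rssi, uri))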
> >
> > Scott
>

Received on Friday, 24 October 2014 05:11:03 UTC