Re: DAP rechartering discussion

Dear Mark,
Dear Colleagues,

On 2011-04-07, at 22:27, Mark Watson wrote:

> [...]
>> So IMHO there should be this baseline, and there should be pre-defined slots for proprietary extensions. To make them proprietary as opposed to closed, the extensions should be identified in a MIME-type-like way. UPnP did the same for DVB metadata. They defined a generic container labeled "foreign metadata", and one of the container attributes is the "metadata format".
> 
> So, I'm not saying there should be no baseline metadata for cross-service search.

So we're on the same page.
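
Just to make the container idea above a bit more concrete, here is a very rough TypeScript-style sketch. All names and the example format string are invented for illustration; they are not taken from UPnP or from any DAP text:

// Illustrative only: baseline metadata plus a generic extension slot.
interface BaselineMetadata {
  title: string;
  genre?: string;
}

interface ForeignMetadata {
  // MIME-type-like identifier of the proprietary format,
  // e.g. "application/x-example-epg+xml" (made-up example).
  format: string;
  // Opaque payload; only clients that recognise `format` interpret it.
  payload: string;
}

interface ContentItem {
  baseline: BaselineMetadata;
  extensions: ForeignMetadata[];
}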

> But I think we should try to find ways that services themselves can participate more directly in any cross-service content discovery.
> 
> For example, rather than every service on the device providing metadata and the device performing a search on that, maybe the device should give the search term and any available context to each service and ask each one to return its results, then merge the results ?

Yes, that's one example. For this to work, there would need to be at least a common data model, if not also a common encoding. With such a common data model, we could require services to provide at least, say, title and genre, so users could search on those. That was what I was thinking of when I spoke about a "minimum user experience" (not UI).
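
To sketch what I mean (purely illustrative TypeScript; ServiceSearchProvider, searchAll and the field names are my inventions, not anything from a draft), the device could fan the query out to each service and merge whatever comes back:

// Each service exposes a search entry point returning items in the
// minimal common data model.
interface SearchResult {
  title: string;
  genre?: string;
  serviceId: string;   // which service produced the hit
  contentUri: string;  // opaque handle the service can resolve later
}

interface ServiceSearchProvider {
  id: string;
  search(term: string, context?: Record<string, string>): Promise<SearchResult[]>;
}

// The device passes the search term and any context to every registered
// service, then merges the results; failing services are simply skipped.
async function searchAll(
  providers: ServiceSearchProvider[],
  term: string,
  context?: Record<string, string>
): Promise<SearchResult[]> {
  const settled = await Promise.allSettled(
    providers.map(p => p.search(term, context))
  );
  return settled.flatMap(r => (r.status === "fulfilled" ? r.value : []));
}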

> Maybe the display of each service's results should be delegated to the service ?
> 
> The display of detailed metadata for a content item should certainly be delegated to that service.

Yes, and with a fallback to something built into the device if the service doesn't provide anything. I know this may seem redundant, but I think it will be an important point for getting broadcasters on board, as it significantly lowers the entry barrier for service providers. Maybe we should think of this as "client-side fallback CSS"?
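
A rough sketch of that fallback, again in made-up TypeScript (renderDetails, showDetails and the item shape are invented names; it just assumes the service hands back some markup or nothing):

interface BaselineItem {
  title: string;
  genre?: string;
  contentUri: string;
}

interface ServiceDisplayProvider {
  // Returns markup (or a URL to a service-hosted page) for the detailed view,
  // or null when the service provides no presentation of its own.
  renderDetails(contentUri: string): Promise<string | null>;
}

async function showDetails(
  service: ServiceDisplayProvider,
  item: BaselineItem,
  builtInRenderer: (item: BaselineItem) => string
): Promise<string> {
  const serviceView = await service.renderDetails(item.contentUri);
  // Device-side fallback: render the baseline metadata with the built-in
  // template when the service supplies nothing (the "fallback CSS" analogy).
  return serviceView ?? builtInRenderer(item);
}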

> Just thinking aloud, but what I mean to say is that there are better metaphors for device/service interaction than thinking of each service as a simple database of common-format metadata.

But it could be a starting point? If we allow service-specific encodings, we should at least have a minimum common data model so that users can launch searches with a reasonable hope of sensible results.

>>> I think we should be enabling similar models for TV services and for multi-device services as well. 
>> 
>> I'm not quite sure what Mark refers to here. Home network?
>> 
> 
> By multi-device services I mean examples like searching on a tablet and having the content rendered on the TV. Or being able to interact with ads on the TV through a tablet. Interaction may or may not be directly between the devices over the home network.

I see. To me that does indeed sound like a home network issue. A device locates content and delegates the rendering to another device on the home network.

In your first example, "searching on a tablet and having the content rendered on the TV", the tablet obtains a URI or URL for the content and tells the TV to play it. The tablet apparently figured out, based on what it knows about the TV and the content, that the TV would be the best choice for rendering.

In your second example, "being able to interact with ads on the TV through a tablet", the TV figures out that related content is available for what's on, and tells the tablet to render it. The TV apparently figured out, based on what it knows about the tablet and the content, that the tablet would be the best choice for rendering.

Of course the service can and should also have a say in this, by being able to tag something as "render on secondary display", i.e. don't disturb what's playing. If there is no tablet, a fallback for the TV could be to switch to some picture-in-picture configuration.
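
To illustrate the kind of messages I have in mind for these two cases (purely hypothetical shapes in TypeScript, not an existing protocol):

// Hypothetical message shapes for the two examples above.
type RenderHint = "primary" | "secondary"; // e.g. service tags an ad "render on secondary display"

interface PlayOnDevice {
  kind: "play";
  contentUri: string;     // URI the tablet discovered
  targetDeviceId: string; // the TV chosen as the best renderer
}

interface ShowRelatedContent {
  kind: "showRelated";
  relatedUri: string;       // e.g. interactive ad content
  hint: RenderHint;         // "secondary": do not disturb what is playing
  targetDeviceId?: string;  // the tablet if one is present
}

type HomeNetworkMessage = PlayOnDevice | ShowRelatedContent;

// If no secondary device is available, the TV could map a "secondary" hint
// onto a picture-in-picture presentation instead.
function chooseTarget(msg: ShowRelatedContent, hasTablet: boolean): string {
  if (hasTablet && msg.targetDeviceId) {
    return msg.targetDeviceId;
  }
  return "tv-pip";
}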


So I guess we're basically asking for very similar things - if not the same thing. My position on a standardized data model may actually be a little stronger than Mark's.


Thanks a lot and cheers,

  --alex
