Video Streaming Properties [WAS RE: Meeting Summary - 15 October 2007]

Thank you, Runar, for your comments and your detailed information.

 

The main reason the group decided that the video streaming property should
not be in the DDR Core Vocabulary is the nature of the vocabulary itself.
The big challenge facing the mobile aspect of the Web is simply getting a
functional presentation of ordinary Web content in such a diverse context.
The core vocabulary is there to capture the absolutely essential properties
that will enable a typical Web page to be adapted to mobile contexts.

 

For more complex content, or more complex adaptation strategies, the
group envisages a set of additional vocabularies to accompany the DDR
Core Vocabulary. To ensure compatibility across these vocabularies, it
is expected that all will reference a common ontology, which provides
the semantics for all of the concepts and data represented in the
vocabularies. The W3C UWA (Ubiquitous Web Applications) WG is the group
responsible for maintaining the ontology.
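
To make the idea concrete, here is a minimal sketch in Python; the
identifiers and IRIs are invented for the example and are not actual
vocabulary or ontology terms:

    # A minimal sketch (invented names): two vocabularies whose properties
    # both reference the same concept in a shared ontology, so data recorded
    # against either property carries the same meaning.
    ONTOLOGY = "http://example.org/shared-ontology#"   # placeholder IRI

    core_vocabulary = {
        "displayWidth": {"concept": ONTOLOGY + "DisplayWidth", "unit": "pixels"},
    }

    specialist_vocabulary = {
        "screen_w": {"concept": ONTOLOGY + "DisplayWidth", "unit": "pixels"},
    }

    # Because both entries point at the same ontology concept, an adaptation
    # system can treat the two properties as equivalent when combining data.
    assert (core_vocabulary["displayWidth"]["concept"]
            == specialist_vocabulary["screen_w"]["concept"])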

 

Any properties that the DDWG believes are not core to the objectives
set out in our charter will not be abandoned. Instead, two further
steps are taken:

 

First, we determine if the proposed property represents something not
already captured in the UWA ontology, and if so, we submit it to the UWA
for consideration. This means that although the property would not be
represented in our core, it would be possible to have an additional
vocabulary that includes the property and also references the common
ontology.

 

Second, we make contact with the proposer and other (specialist) groups
to see if they would be interested in creating a vocabulary that
references the ontology entry, so that more advanced adaptation solutions
can make use of it. For example, we are already working closely with the
OMA, which will have a vocabulary of its own for its particular needs,
and we have made contact with the OpenAjax Alliance to propose that they,
too, create a vocabulary for Ajax-related properties.

 

In time, adaptation systems will be able to make use of Device
Description Repositories that contain data belonging to several
vocabularies. We also expect that advanced adaptation solutions will
have custom vocabularies of their own, possibly with their own
ontologies, though our preference is that the UWA's ontology should be
sufficient for any vocabulary.
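
A rough sketch of what such a repository could look like; the lookup
interface, vocabulary URIs and device name below are invented for the
example, as no such API has been agreed:

    # Hypothetical sketch: a single repository holding data from two
    # vocabularies.  Each property is identified by (vocabulary, name), so
    # the same short name could appear in both without ambiguity.
    CORE = "http://example.org/ddr-core-vocabulary"    # placeholder URIs
    VIDEO = "http://example.org/video-vocabulary"

    repository = {
        ("AcmePhone 100", CORE, "supportedImageFormats"): ["gif89a", "jpeg"],
        ("AcmePhone 100", VIDEO, "h264BaselineLevel"): "1.0",
    }

    def get_property(device, vocabulary, name):
        """Return the stored value, or None when the repository has no data."""
        return repository.get((device, vocabulary, name))

    print(get_property("AcmePhone 100", CORE, "supportedImageFormats"))
    print(get_property("AcmePhone 100", VIDEO, "h264BaselineLevel"))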

 

Supported Image Formats was considered an important property because
images have been part of Web content for many years, almost every Web
page contains at least one image, and almost all mobile devices support
at least one image format, which makes it possible for them to render
most of the Web if the content is appropriately adapted.

 

One might think that having a property in the core that says "this
device is capable of video stream playback" would be a good property to
include. However, to make use of that fact you would then need
additional properties to determine which particular formats, bit rates,
etc. are applicable. It sounds like a whole new vocabulary for video is
required. For this more specialised case, an adaptation system would
connect to a DDR that had a Video Vocabulary; it would first ask "does
this device support video streaming?" and could subsequently ask other
questions relating to video, assuming the right properties were in the
vocabulary (and the repository contained data to go with those
properties).
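
Purely as an illustration of that two-step enquiry, and without
anticipating what an eventual Video Vocabulary would actually contain, a
sketch might look like this (all property names and values are invented):

    # Hypothetical sketch of the two-step enquiry: a coarse capability check
    # first, then more detailed questions, tolerating missing data.
    def choose_video_strategy(props):
        if not props.get("videoStreamPlayback"):
            return "serve non-video content"          # capability absent or unknown
        formats = props.get("supportedVideoFormats", [])
        max_bitrate = props.get("maxVideoBitRate")     # kbit/s; may be absent
        if "h264-bp-1.0" in formats and (max_bitrate is None or max_bitrate >= 64):
            return "serve an H.264 Baseline Level 1.0 clip"
        return "serve the lowest-bitrate format the device lists"

    # Example answers an adaptation system might receive from a DDR exposing
    # a (hypothetical) Video Vocabulary for a fictitious device.
    device_answers = {
        "videoStreamPlayback": True,
        "supportedVideoFormats": ["h264-bp-1.0", "3gp"],
        "maxVideoBitRate": 128,
    }
    print(choose_video_strategy(device_answers))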

 

The DDWG will consider which groups should be contacted in order to
propose the creation of a video vocabulary. Meanwhile, the information
you have provided will be forwarded to the UWA for consideration, by
copying this response to the UWA's public list.

 

Thanks again.

 

---Rotan (chair).


From: Runar J. Solberg [mailto:runar.solberg@adactus.no] 
Sent: 19 October 2007 08:48
To: Rotan Hanrahan
Cc: public-ddwg@w3.org
Subject: RE: Meeting Summary - 15 October 2007

 

Hi,

 

I have a comment regarding the H.264 property. I was the one who
proposed it, and I appreciate that the group has taken the time to
consider it. Based on the group's feedback, I realize my proposal was
badly formulated.

 

My intention with the proposal was rather to indicate whether the device
can play back H.264 video, regardless of whether the delivery method is
streaming or download. In this context the property should have been
called "Video of H264 Baseline Profile Level 1.0", not "Streaming video
of H264 Baseline Profile Level 1.0". I apologize for the poor naming.

 

Nevertheless, the bit rates stipulated are characteristics of a video
conformant with the corresponding profile, and have little to do with
the delivery method of the bits; it is actually the bit rate of the
H.264 video itself. How the bits get to the device (streaming, download)
is, I now realize, outside the scope of the Core Vocabulary. But are
playback capabilities also outside the scope? I see you have approved
gif87 and gif89a for GIF images, so I am curious to know the group's
stand on video capabilities.

 

I am happy to submit a new proposal illustrating this, if there is any
chance of it being accepted. Given the emphasis on delivery bit rates
(streaming), I understand the group's decision to mark it as "Not Core".
However, I hope you will reconsider the new proposals below:

 

Name: Video h264 Baseline Level 1.0
Description: Device is capable of playing a video of H264 Baseline Profile (BP) Level 1.0.
Measurement: Measured by attempting to play back a conformant H264 video at Baseline Profile Level 1.0 on the device. Playback should be successful.
Justification: Primarily for lower-cost applications with limited computing resources, this profile is used widely in videoconferencing and mobile applications.

Name: Video h264 Baseline Level 1b
Description: Device is capable of playing a video of H264 Baseline Profile (BP) Level 1b.
Measurement: Measured by attempting to play back a conformant H264 video at Baseline Profile Level 1b on the device. Playback should be successful.
Justification: Primarily for lower-cost applications with limited computing resources, this profile is used widely in videoconferencing and mobile applications.

Name: Video h264 Baseline Level 1.1
Description: Device is capable of playing a video of H264 Baseline Profile (BP) Level 1.1.
Measurement: Measured by attempting to play back a conformant H264 video at Baseline Profile Level 1.1 on the device. Playback should be successful.
Justification: Primarily for lower-cost applications with limited computing resources, this profile is used widely in videoconferencing and mobile applications.

Name: Video h264 Baseline Level 1.2
Description: Device is capable of playing a video of H264 Baseline Profile (BP) Level 1.2.
Measurement: Measured by attempting to play back a conformant H264 video at Baseline Profile Level 1.2 on the device. Playback should be successful.
Justification: Primarily for lower-cost applications with limited computing resources, this profile is used widely in videoconferencing and mobile applications.


Regards

 

Runar J. Solberg


[...]

Received on Friday, 19 October 2007 09:58:33 UTC