Re: Definition of 'a profile'

From: Karen Coyle <kcoyle@kcoyle.net>
Date: Mon, 9 Jul 2018 10:02:32 -0700
To: public-dxwg-wg@w3.org
Message-ID: <2ef73d02-9a82-28fa-7b59-28955945e8bf@kcoyle.net>
Thanks, Phil. To my mind this is the kind of thing we need for an
introduction to the profile guidance document.

I'm a bit less optimistic about "directly interoperable" than you seem
to be, although I think that would be a best-case scenario, or a goal to
aim for.

In terms of validation, our requirements include the possibility of
profiles providing the information needed for validation in a
machine-actionable form, although I don't get the sense that WG members
feel that is a MUST; maybe a SHOULD. One thing we've talked about is
couching the requirements in terms of functionality, e.g. if you want
automated validation, then you need A, B, C. If you don't have A, B, C,
then don't expect easily automated validation. Kind of a "you get what
you pay for" approach. This would mean that profiles can range from a
minimal feature set, with few supported functions, to a more robust
feature set, and still be profiles.
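To make the idea concrete, here is a minimal sketch (in Python, purely
illustrative) of what "machine-actionable validation against a profile"
could look like, where a profile tightens a base vocabulary's "SHOULD be
from a managed code list" into "MUST be from *this* code list". The
property name and code-list values below are hypothetical, not taken
from DCAT or any WG deliverable:

```python
# Hypothetical sketch: a profile narrows a base vocabulary's SHOULD
# ("value SHOULD come from a managed code list") into a MUST
# ("value MUST come from *this* specific code list"), which is what
# makes automated validation possible.

# Assumed example code list; not from any real specification.
THEME_CODE_LIST = {"environment", "transport", "health"}


def validate_against_profile(record: dict) -> list:
    """Return a list of violation messages; an empty list means conformance."""
    errors = []
    theme = record.get("theme")
    if theme is None:
        errors.append("missing required property: theme")
    elif theme not in THEME_CODE_LIST:
        errors.append("theme %r is not in the profile's code list" % theme)
    return errors


print(validate_against_profile({"theme": "transport"}))  # conforms: []
print(validate_against_profile({"theme": "astrology"}))  # one violation
```

The point of the sketch is the "you get what you pay for" trade-off: a
profile that names its code list explicitly supports this kind of check;
a profile that doesn't, can't.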


On 7/9/18 2:06 AM, Phil Archer wrote:
> On last week's call I was asked to offer a definition of 'a profile.' I
> hope it's OK just to offer some text here as I'm a little out of touch
> with how this group uses GH and, ahem, I admit, I've not touched a GH
> client for a year.
> === Begins ===
> 1. Introduction
> Builders of vocabularies and ontologies are encouraged to make their
> work as broadly applicable as possible so as to maximize future
> adoption. As a result, vocabularies and ontologies typically define a
> data model using minimal semantics. For example, DCAT defines the
> concept of a dataset as an abstract entity with distributions and data
> services as means of accessing that data. It is silent on whether a
> distribution should be in a particular serialization, or set of
> serializations; it is silent on how data services should be configured;
> while it states that the value of dcat:theme should be a SKOS concept,
> it does not specify a particular SKOS concept scheme, and so on. Other
> vocabularies such as Dublin Core (@@@ add more here?? @@@) are equally
> parsimonious in their prescriptions of how they should be used.
> This is good practice: it means that data models and methods of working
> can be applied in different circumstances than those in which the original
> definition work was carried out, and in that sense promotes broad
> interoperability.
> However, any individual system will be designed to meet a specific set
> of needs, that is, it will operate in a specific context. It is that
> context, and the individual choices made by the engineers working within
> it, that will determine how a vocabulary or set of vocabularies will be
> used. For example, a system ingesting data may require that a specific
> subset of properties from a range of vocabularies is used and that only
> terms from a defined code list are used as values for specified
> properties. In other words, where the 'base vocabulary' might say "the
> value of this property SHOULD be a value from a managed code list", the
> profile will say "the value of this property MUST be from *this*
> specific code list."
> This discussion leads to the following definition of a profile, also
> known as an "application profile" or "metadata application profile":
> A profile is a named set of constraints on one or more identified base
> specifications. Such constraints should ensure that software systems
> that implement the profile are directly interoperable.
> === Ends ===
> Is that the kind of thing that's needed? I was tempted to talk about
> validation, i.e. that it should be possible to validate a given dataset
> against a given profile - but I'm not sure the WG agrees with that??
> Phil
> For tracker: Action-141

Karen Coyle
kcoyle@kcoyle.net http://kcoyle.net
m: 1-510-435-8234 (Signal)
skype: kcoylenet/+1-510-984-3600
Received on Monday, 9 July 2018 17:02:58 UTC
