
More detailed requirements

From: Linda van den Brink <l.vandenbrink@geonovum.nl>
Date: Wed, 15 Apr 2015 10:00:21 +0000
To: "SDW WG (public-sdw-wg@w3.org)" <public-sdw-wg@w3.org>
Message-ID: <13F9BF0BE056DA42BFE5AA6E476CDEFE72530F50@GNMSRV01.gnm.local>
Hi all,

As I mentioned in a separate email, we (Geonovum) are currently reviewing the requirements, trying to make sense of them and adding more detail to them. This is a little ahead of the work of the SDWWG, but I think it could be relevant to the question of how to structure the UCR document. As an example, please find below one requirement with our questions and more detailed sub-requirements.

1.   It should be easy to find spatial data on the web
1.1  Non-experts should be able to find spatial data on the web through free-text searches
1.2  Spatial data on the web - datasets as well as single objects - should be findable through numerous ontologies/synonyms
1.3  Popular search engines should be able to find spatial data on the web (see requirement #5)
1.4  It should be possible to add semantics and structure to spatial data on the web (through existing mechanisms, e.g. schema.org, Location Core)
1.5  Human/computer search mechanisms must be able to search for spatial entities/objects within datasets
1.6  There should be standardized search filters (e.g. within, nearby) for spatial data that are supported by popular search engines
1.7  Spatial objects should be findable through a (set of) basic/primitive spatial properties, e.g. centroid (see requirement #2)
1.8  Metadata should be inseparable from the data it describes
1.9  It should be possible for users to annotate (add tags and links to other datasets) the (meta)information they find, so that search is improved
1.10 APIs and "smart" geo-services (e.g. WPS, WFS) should be (made) crawlable
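To make sub-requirements 1.4, 1.6 and 1.7 a bit more concrete, here is a minimal sketch (ours, not an agreed approach): a dataset annotated with schema.org terms (Dataset, Place and GeoCoordinates are real schema.org types; the dataset names and coordinates are invented), with the centroid as the basic spatial property, and a "nearby" filter over centroids using the haversine formula as one possible standardized spatial filter.

```python
# Hypothetical sketch of sub-requirements 1.4, 1.6 and 1.7: annotate a
# dataset with schema.org terms, keep the centroid as the basic spatial
# property, and filter "nearby" datasets by great-circle distance.
import json
import math


def dataset_jsonld(name, lat, lon):
    """Req. 1.4: a schema.org JSON-LD annotation for a dataset, with the
    centroid (req. 1.7) as its basic spatial property."""
    return {
        "@context": "https://schema.org",
        "@type": "Dataset",
        "name": name,
        "spatialCoverage": {
            "@type": "Place",
            "geo": {"@type": "GeoCoordinates",
                    "latitude": lat, "longitude": lon},
        },
    }


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def nearby(datasets, lat, lon, radius_km):
    """Req. 1.6: a 'nearby' filter over dataset centroids."""
    def centroid(d):
        g = d["spatialCoverage"]["geo"]
        return g["latitude"], g["longitude"]
    return [d for d in datasets
            if haversine_km(lat, lon, *centroid(d)) <= radius_km]


if __name__ == "__main__":
    catalogue = [
        dataset_jsonld("Amersfoort buildings", 52.155, 5.387),
        dataset_jsonld("Amsterdam trees", 52.370, 4.895),
        dataset_jsonld("Paris metro", 48.857, 2.352),
    ]
    # Datasets within 60 km of Amersfoort: the Paris one drops out.
    hits = nearby(catalogue, 52.155, 5.387, radius_km=60)
    print(json.dumps([d["name"] for d in hits]))
```

The JSON-LD half could be embedded in a dataset landing page for search engines (1.3); the filter half is only an illustration of what a standardized "nearby" operator would have to compute.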

Questions about this requirement:

- What are the problems with current data-finding mechanisms? Is it a lack of suitable technologies or a lack of robust governance structures (i.e. high-quality publishing process, robust maintenance process, etc.)? One problem could be that spatial data is published through mechanisms that are not in themselves crawlable, e.g. WFS services or Atom feeds (in non-standard formats). See also the last bullet.

- Through what mechanism is the entity (human/machine) searching for data? Centralized registers? Decentralized (but linked) registers? Free-text searches? (In our opinion, at least free-text search.)

- Which data/information properties are queried: keywords, extent, timespan, free-text fields, crowdsourced tags, location (using spatial operators), zoom level/resolution/level of detail, coordinate system, place name, license?

- How should APIs and "smart" geo-services (e.g. WFS) be indexed?
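On that last question, one possible answer (a sketch of ours, not a settled design) is that a crawler could fetch a WFS GetCapabilities document and index, per feature type, the name, title and WGS84 bounding box - enough to answer "what data lives where". The XML below is a hand-made minimal example (the app:buildings layer is invented); a real crawler would retrieve it from the service's GetCapabilities URL.

```python
# Hypothetical sketch: turn a WFS 2.0 GetCapabilities document into
# crawlable index records (feature-type name, title, WGS84 bounding box).
import xml.etree.ElementTree as ET

NS = {
    "wfs": "http://www.opengis.net/wfs/2.0",
    "ows": "http://www.opengis.net/ows/1.1",
}

# Hand-made minimal capabilities document standing in for a live service.
CAPABILITIES = """<?xml version="1.0"?>
<wfs:WFS_Capabilities xmlns:wfs="http://www.opengis.net/wfs/2.0"
                      xmlns:ows="http://www.opengis.net/ows/1.1"
                      version="2.0.0">
  <wfs:FeatureTypeList>
    <wfs:FeatureType>
      <wfs:Name>app:buildings</wfs:Name>
      <wfs:Title>Buildings</wfs:Title>
      <ows:WGS84BoundingBox>
        <ows:LowerCorner>5.35 52.13</ows:LowerCorner>
        <ows:UpperCorner>5.42 52.19</ows:UpperCorner>
      </ows:WGS84BoundingBox>
    </wfs:FeatureType>
  </wfs:FeatureTypeList>
</wfs:WFS_Capabilities>"""


def index_capabilities(xml_text):
    """Return one index record per feature type in a capabilities doc."""
    root = ET.fromstring(xml_text)
    records = []
    for ft in root.findall(".//wfs:FeatureType", NS):
        bbox = ft.find("ows:WGS84BoundingBox", NS)
        lower = bbox.find("ows:LowerCorner", NS).text.split()
        upper = bbox.find("ows:UpperCorner", NS).text.split()
        records.append({
            "name": ft.find("wfs:Name", NS).text,
            "title": ft.find("wfs:Title", NS).text,
            # [min_lon, min_lat, max_lon, max_lat]
            "bbox": [float(lower[0]), float(lower[1]),
                     float(upper[0]), float(upper[1])],
        })
    return records


if __name__ == "__main__":
    for rec in index_capabilities(CAPABILITIES):
        print(rec)
```

Records like these could then feed the same spatial filters (within, nearby) discussed under requirement 1.6.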

We have similar sub-requirements (and questions) about several of the other requirements, but have not quite gotten through the list. Should we put all this somewhere on the wiki?

Linda
______________________________________
Geonovum
Linda van den Brink
Adviseur Geo-standaarden

a: Barchman Wuytierslaan 10, 3818 LH Amersfoort
p: Postbus 508, 3800 AM Amersfoort
t:  + 31 (0)33 46041 00
m: + 31 (0)6 1355 57 92
e:  l.vandenbrink@geonovum.nl
i:  www.geonovum.nl

________________________________
Received on Wednesday, 15 April 2015 10:01:52 UTC
