- From: Kevin Smathers <kevin.smathers@hp.com>
- Date: Tue, 08 Jul 2003 08:11:52 -0700
- To: "John S. Erickson" <john.erickson@hp.com>
- Cc: www-rdf-dspace@w3.org
John S. Erickson wrote:

> Hi Folks!
>
> PaulS suggests:
>
>> As to a suggestion for the text, how about something like:
>>
>> At present we have no evidence that those who actually
>> manage archives require the ability to track changes to
>> the *descriptive* metadata over time. In traditional library/
>> information management systems, logs are kept around to track
>> metadata changes temporarily, but it's just not considered
>> important to the core mission of managing the \emph{content}
>> over time. Schemas change, contexts change, resources get
>> described in myriad ways (all at the same time), people make
>> mistakes, fix them, we add stuff, we remove stuff, and libraries
>> do not track all this. It is therefore difficult to predict the
>> potential value(s) and uses of such data and functionality in
>> practice.
>>
>> However, there are some newer kinds of metadata for which the
>> community does seem to want to track provenance; namely,
>> preservation metadata, i.e. capturing and preserving provenance
>> metadata related to preservation activities -- i.e. what was done
>> to the digital object over time in order to preserve it.
>
> JSE: I respect the fact that there will be metadata elements for which
> provenance needs to be maintained, and other elements for which such
> "care and feeding" is not necessarily required. BUT WHO DECIDES (and
> when)? Certainly NOT the system designer. This should be, minimally, a
> curatorial decision, probably driven by institutional policy (however
> one wishes to define "institutional").

I think traditionally it has been the system designer (with feedback from
the user base) who has decided. The features of the system define what data
is collected and what is tossed. Customarily a system will allow gross
adjustments to the detail level of recorded data, but increasing
customizability goes in step with increasing configuration, leading to
issues of maintenance and startup cost.
Compare the usability of Oracle with that of MySQL -- MySQL sets up in ten
minutes in a configuration that is typical for small to medium web
applications. Oracle can be configured for similar uses, but it would take
days of reading to actually determine whether the configuration you have is
right for the application you are writing.

> These are but two examples of the "lifecycle management of objects"
> concept that Mick introduced about two weeks ago: the ability to
> associate policies with digital objects for a variety of purposes,
> including (and esp.) to facilitate their management. Just as RDF-based
> technologies promise to transform the discovery and retrieval of
> resources, they can also transform their management, by providing a
> means of specifying the scope of policies based upon the characteristics
> and relationships of objects rather than simply their position in
> hierarchies.
>
> John

I agree; there is definitely an active element here in addition to the
passive sense of provenance. Whether that is coded into policies or written
in hard code, there must at least be a way of associating specific
instances with enough context to enact appropriate policy. I think that
indicates a specific need for provenance data (i.e., metadata about the
metadata).

-- 
========================================================
Kevin Smathers                    kevin.smathers@hp.com
Hewlett-Packard                   kevin@ank.com
Palo Alto Research Lab
1501 Page Mill Rd.                650-857-4477 work
M/S 1135                          650-852-8186 fax
Palo Alto, CA 94304               510-247-1031 home
========================================================
use "Standard::Disclaimer";
carp("This message was printed on 100% recycled bits.");
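[Editor's note: the closing point above -- that enacting policy requires provenance data, i.e. metadata about the metadata -- can be illustrated with a minimal sketch. The class names, handle identifier, and agent string below are hypothetical, not drawn from DSpace or the thread; the sketch only shows the idea of recording each descriptive-metadata change as its own provenance record.]

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class ProvenanceEntry:
    """Metadata about a metadata change: which field, who changed it, when."""
    field_name: str
    old_value: str
    new_value: str
    agent: str
    timestamp: str

@dataclass
class DescribedObject:
    """A digital object whose descriptive metadata carries its own history."""
    identifier: str
    metadata: dict
    history: List[ProvenanceEntry] = field(default_factory=list)

    def set_metadata(self, name: str, value: str, agent: str) -> None:
        # Record the change itself before applying it, so a policy engine
        # can later ask "who described this object, and how, over time?"
        self.history.append(ProvenanceEntry(
            field_name=name,
            old_value=self.metadata.get(name, ""),
            new_value=value,
            agent=agent,
            timestamp=datetime.now(timezone.utc).isoformat(),
        ))
        self.metadata[name] = value

# Hypothetical usage: a curator revises a title; the old value survives
# in the provenance trail rather than being silently overwritten.
obj = DescribedObject("hdl:1721.1/1234", {"title": "Draft title"})
obj.set_metadata("title", "Final title", agent="curator:jsmith")
print(obj.metadata["title"])        # Final title
print(obj.history[0].old_value)     # Draft title
```

Whether such a trail is worth its storage and maintenance cost is exactly the curatorial/policy question debated in this thread; the sketch only shows that the mechanism itself is cheap to express.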
Received on Tuesday, 8 July 2003 11:13:43 UTC