- From: Lars G. Svensson via GitHub <sysbot+gh@w3.org>
- Date: Tue, 09 Oct 2018 12:16:20 +0000
- To: public-dxwg-wg@w3.org
@kcoyle scripsit:

> I don't think I would limit it necessarily to software, but to any automated process.

Thinking about this again, it might be better to say "any formal process", since validation can be a manual task, too.

> Bibliographic records that I have worked with were not "profiled" to address that algorithm; the algorithm had to work with metadata that was not aware of the needs of the algorithm.

Yes, a sort of Postel's law. OTOH you could say that MARC is a profile of ISO 2709: you can use a generic parser to pick out fields and subfields, but you need to know more about the semantics of those fields and subfields to do the de-duplication. So the data was profiled up front (by being MARC), and the consuming application was written to handle data in that profile. It's often that way around:

1. Data is available in a specific syntax/profile combination.
2. Software applications are written that can handle that data.
3. Those applications become popular, so people produce more data for them.

> I would consider the logical algorithm to be something that defines a set of rules regardless of the software that implements it.

Considering that your algorithm would probably work regardless of whether the data is in MARC 21 (ISO 2709) or MARC-XML, I'd definitely agree.

-- 
GitHub Notification of comment by larsgsvensson
Please view or discuss this issue at https://github.com/w3c/dxwg/issues/448#issuecomment-428169372 using your GitHub account
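To make the ISO 2709 point above concrete, here is a minimal sketch (not from the thread; the 245 field and its subfields are an illustrative MARC example, and the leader is simplified to just the two length fields a generic parser needs). The parser recovers tags and subfields purely from the record syntax; knowing that 245 $a is the title used for de-duplication is MARC-level semantics the syntax alone does not supply:

```python
# Minimal, illustrative ISO 2709 round-trip: a record is a 24-char leader,
# a directory of fixed-width entries, and delimited data fields.
FT, RT, SF = "\x1e", "\x1d", "\x1f"  # field, record, subfield delimiters

def build_record(fields):
    """Assemble a simplified ISO 2709 record from (tag, data) pairs.
    Leader positions not needed for parsing are blanked out here."""
    data, directory = "", ""
    for tag, value in fields:
        field = value + FT
        directory += f"{tag}{len(field):04d}{len(data):05d}"  # tag/len/start
        data += field
    directory += FT
    base = 24 + len(directory)                # base address of data
    total = base + len(data) + 1              # +1 for the record terminator
    leader = f"{total:05d}" + " " * 7 + f"{base:05d}" + " " * 7
    return leader + directory + data + RT

def parse_record(record):
    """Generic, semantics-free parse: return {tag: [list of subfield strings]}.
    Nothing here knows what any tag or subfield *means*."""
    base = int(record[12:17])                 # base address from the leader
    directory = record[24:base - 1]           # entries, minus the terminator
    fields = {}
    for i in range(0, len(directory), 12):
        tag = directory[i:i + 3]
        length = int(directory[i + 3:i + 7])
        start = int(directory[i + 7:i + 12])
        raw = record[base + start:base + start + length - 1]  # drop FT
        fields.setdefault(tag, []).append(raw.split(SF))
    return fields

rec = build_record([("245", f"10{SF}aSemantic Web Primer{SF}cSome Author.")])
parsed = parse_record(rec)
# parsed["245"] holds ['10', 'aSemantic Web Primer', 'cSome Author.'] --
# the syntax is fully recovered, but only MARC semantics tell a consumer
# that subfield $a of 245 is the title to de-duplicate on.
```

The same generic parser would work unchanged on any ISO 2709 payload; the de-duplication logic is what binds the application to the MARC profile.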
Received on Tuesday, 9 October 2018 12:16:21 UTC