- From: C H <craighubleyca@yahoo.com>
- Date: Tue, 25 Nov 2008 16:36:07 -0800 (PST)
- To: Renato Iannella <renato@nicta.com.au>, public-xg-eiif <public-xg-eiif@w3.org>
My recommended wording to express the issue is something like this:

"Some concise date/time representations omit time zones and assume "some local" time zone, notably ISO 8601 and XML dateTime. Such dates must be corrected on input to refer unambiguously to a specific time in a specified time zone. Any such corrections, even obvious-seeming ones, will be recorded as exceptions in a log for later inspection or reversion. The W3C's OWL-Time terminology, which specifies intervals, durations and other scheduling concepts, provides the time model most appropriate and robust for emergency management purposes. Output from any EIIF-compliant system must specify the time zone, e.g. by adding a "Z" suffix to signify UTC as in ISO 8601. Subsystems with no temporal processing may use ISO 8601 notation directly to specify points in time, but must adopt full OWL-Time terminology for intervals/durations. That is, no application-specific ways of specifying spans or periods of time that combine ISO 8601 dates will be allowed."

The question is differentiating how input will be handled from how output will be handled. Obviously all the time "standards" map onto each other somehow. We shouldn't be specifying how the correction is handled on input of an (ambiguous) ISO 8601 date. We should only be saying that you can use the simpler notation if you are dealing only in points in time in one module, but as soon as you have to do time math or anything to do with scheduling you must adopt the more robust W3C approach.

That seems fair to everyone, and leaves ambiguity only insofar as the date/time might be misinterpreted on input somehow. Specifying that dates are assumed to be UTC, but that any such correction/assumption gets logged/traced, seems like a logical precaution to me.
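To make the input/output split concrete, here is a minimal Python sketch of the policy described above: a timestamp lacking a zone is assumed UTC and that assumption is logged as an exception for later inspection, while all output carries an explicit "Z" suffix. The function names (`normalize_iso8601`, `to_eiif_output`) and the UTC default are illustrative assumptions, not anything specified by the group.

```python
from datetime import datetime, timezone
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("eiif.time")  # hypothetical logger name

def normalize_iso8601(ts: str) -> datetime:
    """Parse an ISO 8601 timestamp. If the time zone is missing,
    assume UTC and record the correction in a log, per the proposed
    wording (the assumption is traceable and revertible later)."""
    # Accept a trailing 'Z' on older Pythons, which fromisoformat rejects.
    dt = datetime.fromisoformat(ts.replace("Z", "+00:00"))
    if dt.tzinfo is None:
        log.warning("Ambiguous input %r lacked a time zone; assumed UTC", ts)
        dt = dt.replace(tzinfo=timezone.utc)
    return dt.astimezone(timezone.utc)

def to_eiif_output(dt: datetime) -> str:
    """Emit output with an explicit 'Z' suffix signifying UTC."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
```

For example, a zone-less `"2008-11-25T16:36:07"` would be logged and emitted as `"2008-11-25T16:36:07Z"`, while an explicit `"2008-11-25T16:36:07-08:00"` passes through silently as `"2008-11-26T00:36:07Z"`. Time math and scheduling (intervals, durations) are deliberately out of scope here, matching the point that such modules must adopt OWL-Time rather than ad hoc combinations of ISO 8601 dates.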
Received on Wednesday, 26 November 2008 00:36:48 UTC