Re: comments on BLD draft part II

> My final batch of comments:
> 
> 
> 18 - section 2.1.4, truth valuation for formulas: the notion of a
> "well-formed formula" is not defined for the presentation syntax

The presentation syntax is the same as the formal syntax at this level.

> 19 - section 2.1.4, interpretation of data types: the interpretation of
> symbols with datatypes is not included in the definition of an
> interpretation (see also [1]).
> For example:
> Given a set of supported datatypes DT, IC is a mapping from Const to
> elements of D, such that for every (literal, label) in Const it holds
> that whenever label is an identifier of a datatype d in DT, IC((literal,
> label))=L2V^d^(literal), where L2V^d^ is the literal-to-value mapping of d.

There were certain omissions, but I have now fixed them.

To reduce the amount of controversy I decided to adopt your earlier
suggestion that non-datatype symbol spaces should not have value spaces.
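To illustrate the constraint quoted above, here is a minimal, purely illustrative sketch of how IC could be required to respect the literal-to-value mappings of the supported datatypes (the datatype names and the Python encoding are my own assumptions, not the normative definition):

```python
# Hypothetical sketch of the datatype constraint on IC from comment 19.
# L2V^d is modeled as one Python callable per datatype d in DT;
# the datatype identifiers below are assumed examples.
L2V = {
    "xs:integer": int,
    "xs:string": str,
    "xs:boolean": lambda s: s == "true",
}

def IC(literal: str, label: str):
    """Interpretation of a constant (literal, label) in Const."""
    if label in L2V:
        # label identifies a datatype d in DT: the value is fixed
        # by the datatype's literal-to-value mapping L2V^d.
        return L2V[label](literal)
    # Non-datatype symbol space: the mapping is left open to the
    # interpretation (here a placeholder identity pairing).
    return (literal, label)
```

So IC("42", "xs:integer") is forced to be the integer 42 in every interpretation, while constants in non-datatype symbol spaces carry no such constraint.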

> Furthermore, the discussion about value spaces should be moved to where
> data types are introduced, since value spaces are part of a datatype.

Value spaces and lexical-to-value mappings are part of the semantics of data
types, and they interact with the overall semantics. I think they fit better
in the semantics part.

> 20 - section 2.2.1:
>   a- it seems unnecessary to introduce the formula "true"
>   b- the syntax of the return value of Unnest is unclear.  I would
> propose to use the presentation syntax instead.  In this way, it is easy
> to avoid using the formula "true":
> If f is a term, then Unnest(f)=""
> Otherwise, if f is a frame of the form o[a->v] then
> Unnest(f) = And(Unnest(o) Obj(o)[a->Obj(v)] Unnest(v))

Why would that be better?  Why do you think that introducing "true" -
something everyone understands - is a problem? I think it makes the
transformation easier to understand.
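For concreteness, a small sketch of the unnesting transformation being debated (an illustrative encoding with made-up constructor names, not the draft's normative definition). Returning an empty conjunction for a plain term plays the role that "true" plays in the draft:

```python
# Hypothetical sketch of Unnest on frame terms o[a -> v].
from dataclasses import dataclass
from typing import Union, List

@dataclass(frozen=True)
class Const:
    name: str

@dataclass(frozen=True)
class Frame:          # o[a -> v]
    obj: "Term"
    attr: "Term"
    val: "Term"

Term = Union[Const, Frame]

def Obj(t: Term) -> Term:
    # Obj(t): a frame term stands for its subject object.
    return Obj(t.obj) if isinstance(t, Frame) else t

def unnest(t: Term) -> List[Frame]:
    # Returns the list of flat frames whose conjunction replaces t.
    # For a plain term the list is empty -- the role of the formula "true".
    if isinstance(t, Const):
        return []
    return unnest(t.obj) + [Frame(Obj(t.obj), t.attr, Obj(t.val))] + unnest(t.val)
```

For example, unnesting a[b->c][p->v] yields the two flat frames a[b->c] and a[p->v].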


> 21 - section 3.1.1.2: the notion of a "well-formed condition formulas"
> is used here, but was never defined.

Thanks. It is now defined.

> 22 - There is no way to identify rules in the current syntax; this is
> probably related to the metadata discussion (which has a low priority on
> the face-to-face, unfortunately).
> 
> 23 - Section 3.1.2.1, last paragraph: it is not clear what is meant with
> extensions being required to be "compatible" with a given definition.
> It seems that in many cases this is not possible, especially if the
> semantics does not have a model theory. Perhaps compatibility should be
> defined with respect to entailment?

Done (with respect to intended models, which is simpler).

> 24 - section 2.1.1: The notions of free variable occurrences and closed
> formulas need to be defined here. A statement "We adopt the usual
> scoping rules for quantifiers from first-order logic." is not sufficient
> for a language specification.

Right. But for the draft it is not a burning issue.


	--michael  


> 
> best, Jos
> 
> -- 
> Jos de Bruijn            debruijn@inf.unibz.it
> +390471016224         http://www.debruijn.net/
> ----------------------------------------------
> The third-rate mind is only happy when it is
> thinking with the majority. The second-rate
> mind is only happy when it is thinking with
> the minority. The first-rate mind is only
> happy when it is thinking.
>   - AA Milne

Received on Wednesday, 26 September 2007 06:04:34 UTC