
Re: semantics



Ron Whitney wrote:

> The only semantics you
> make reference to is a low-level semantics (e.g. Mathematica
> semantics) whereas OpenMath is attempting to be rather more
> mathematically oriented ("system" independent).

Robert S. Sutor/Watson/IBM Research wrote:

> For OpenMath this just means that "contexts" will
> be locally developed and used until a particular community
> decides that a field is mature enough to have contexts that
> are in the official registry.

> Neil has some very good points about sharing semantics between
> CA systems. OpenMath is attempting to solve this problem, so, in some
> sense, we don't have to do so here.

Having 'system independent' or 'official' semantics has the benefit that
at any given point in time a community of users can more effectively
communicate and utilize whatever tools take advantage of these
semantics.

The simplicity and appeal of this statement is obvious.  Less obvious is
whether mathematics and society would allow, or benefit from, such a
classification.

Certainly Mathematica introduces semantics for a wide class of
notations.  And Mathematica exploits the notion of contexts
(Begin["V4`"]) to allow different semantics for a notation.  Where a
notation is ambiguous in a particular context, we also attach an
interpretation to the notation via an InterpretationBox.  Type
information is added via pattern constraints (x_Integer).
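As a rough sketch of the two mechanisms just mentioned (the context
name V4` is taken from above, but the symbol square and its definition
are invented for illustration), contexts and pattern constraints look
like this in Mathematica:

```mathematica
(* Enter a hypothetical context; new symbols defined here
   are created in V4` rather than Global` *)
Begin["V4`"]

(* The pattern constraint x_Integer restricts this definition
   to explicit integer arguments *)
square[x_Integer] := x^2

End[]

V4`square[3]    (* -> 9 *)
V4`square[3.5]  (* left unevaluated: 3.5 is Real, not Integer *)
```

A second context could give square a different definition, so the same
notation carries different semantics depending on which context is in
use.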

Mathematica designers are free of any official rules.  Although we are
strongly influenced by existing conventions, we also try to correct or
clarify existing practice.  Examples are the use of [] for function
brackets and the use of a DifferentialD for integration.

What if we were subject to some mediated semantics?  How would this
affect our development and innovation?

HTML provides an example of the effect of introducing a universal
language design for a class of documents.  The short-term benefit was
incredible: it created a critical mass of Web users that fuelled the
Web revolution.  But what were the costs, and what will be the
longer-term benefit?  First, many documents are no longer authored in
rich environments like word processors; publishers like Britannica
Online instead aim for the lowest common denominator - HTML that is
widely supported by browsers.  Secondly, language design and standards
have lagged behind the innovation of the browser vendors.  Thirdly,
with the coming rise of network component architectures (e.g. ActiveX
and ONE), HTML may die and be replaced by compound documents, with
very few vendors providing solutions.

Let's say some group could deliver officially sanctioned semantics
for math; what would the effect be?  First, any such standard would
likely not be as rich as some of the existing environments and is
unlikely to have a critical mass of users - as such, there would be no
reason for authors to degrade their work.  Secondly, the standards
group will not drive the adoption of semantics, but will be in tension
with the math vendors (both Netscape and Microsoft claim to support
open standards, but they try to lead the pack, not follow).  Thirdly, a
new 'solution' to the problem of semantics will likely evolve
independently - in particular, we would continue to innovate and
provide a richer and more flexible system than the 'official'
semantics.

I should note that notational math will likely be a component which
survives the transition to network component architectures.  This math
component should be able to specify its semantic domain, which could
point to Mathematica, OpenMath, or some other math system.  Even
within Mathematica there will be notations that are ambiguous, and we
would look for the equivalent of an InterpretationBox for the intended
meaning.  If an OpenMath document were to be interpreted by
Mathematica, we would attempt to faithfully translate the OpenMath
semantics.  What is important from our point of view is that the
notation design introduces sufficient structure to give us a chance of
creating accurate syntax - e.g. inferring an author's implied grouping.

In spoken languages there are syntax guides for representing a
language, but no 'official' semantics.  To provide a syntax guide that
allows some meaning to be inferred by math engines would be a great
accomplishment.  Such a design need not seek to standardize the
semantics.  Indeed, it would be folly to even try, both from a
practical point of view and because of intrinsic properties of
mathematics within society.

Researchers and professionals who dream of sending one notation
automatically to an array of math engines can do so if each vendor
provides and maintains an appropriate translation utility.  This will
be possible most of the time if the notational information is rich
enough.

Educators who would like a simple uniform notation and semantics to be
recommended may do their students a better service by helping them use
one system while at the same time making them aware of the diversity
of other approaches.  (That said, most of the semantics are, by
convention, somewhat consistent for popular math.  But it is a
treacherous step to actually demand consistency, or to suggest someone
is 'non-standard' if they don't adopt the recommended semantics.)

