
Re: SW, meaning and reference

From: Seth Russell <seth@robustai.net>
Date: Fri, 31 Aug 2001 15:56:13 -0700
Message-ID: <01c901c13270$239b70c0$b17ba8c0@c1457248a.sttls1.wa.home.com>
To: "pat hayes" <phayes@ai.uwf.edu>
Cc: <www-rdf-logic@w3.org>
From: "pat hayes" <phayes@ai.uwf.edu>

> >The more agents can meet their goals using a theory of reference, the
> >more "correct" that theory is.
> Fair point. But when some of those agents live only as software, but
> others (us) live in the real world where some of the referents are
> located also, it gets nontrivial both to say what the goals are, and
> how to tell if they have been met. What does it even mean to say that
> a software agent refers to something concrete, for example?

It means that the software agent has taken an action that tries to place
itself in some relation to that concrete thing ... of course that relation
might only be that it has corresponded with another agent about it.  I
don't see why a 'software agent' is any different from a biological agent in
that regard.  Perhaps a disembodied text-based agent with no stepper motors
would have a similar relation to concrete objects as we have to alleged
spiritual objects ... and would need blind-faith axioms instead of
perceptions grounded in causal events.

> > > So I'm not spreading doom and gloom; just
> > > claiming that reference is the weak spot when we come to formalise the
> > > 'semantic' web.
> >
> >Why formalize it at all?  Why not just start making mechanisms that work,
> >and may the best theory win.
> Making mechanisms is one way to do the formalizing. Programs are
> formal, just like axioms. In the case of the SW, 'formal' often means
> 'useable by software', in fact.

I'll buy that.

> >Logic is great, survival better :)
> But poor reasoners get eaten more easily than good ones, which is
> probably why evolution gave us big-headed apes such a comparatively
> good time.

Have you got any references for that claim?

Life evolved for many an eon with no reasoning ability that remotely
resembles anything like what passes for logic today. I think the jury is
still out on whether such logic will help us survive in the long run or not.
Certainly agents who rely only on that logic, blindly assuming that their
'resource set' (IR) always corresponds to reality, are not going to be
playing with a full deck.

Seth Russell
Received on Friday, 31 August 2001 18:57:10 UTC
