
Re: standard usage of O(x^3) can be confusing

From: David Eppstein <eppstein@ics.uci.edu>
Date: Wed, 12 Apr 2000 10:33:47 -0700
To: www-math@w3.org
cc: roconnor@uwaterloo.ca, jsdevitt@radicalflow.com
Message-ID: <12958114.3164524427@cx344290-c.irvn1.occa.home.com>
"Russell Steven Shawn O'Connor" <roconnor@uwaterloo.ca> writes:
> but note that in both your case and in my case we had to translate the
> statement into something else.  I rearranged stuff and used set
> inclusion.  You rearranged a bit less, but added two quantifiers.
> Either way the point is that if the statement is going to get used by a
> computer, it is going to have to get translated into a first-order
> statement.  In that sense (1) is meaningless because it is not a
> first-order statement.

Bogus argument.  The whole point of any notation is as a shorthand for 
something else.  Plenty of other notations need translation to become 
first-order, e.g.

lim_{x->0} (sin x / x) = 1

"really means"

Exists(L) all(epsilon>0) exists(delta>0) all(x: 0<|x|<=delta)
|(sin x / x) - L| <= epsilon,
and furthermore L=1.

People don't usually do this sort of explicit translation whenever they 
manipulate limits or O's, so why should a computer have to?  For 
instance, in Mathematica or other symbolic algebra systems, it would be 
perfectly straightforward to add a simplification rule that O(x^i)+p(x) 
simplifies to O(x^i) whenever p is a polynomial with degree at most i.  
Such a rule doesn't require any deeper understanding of what O "really 
means".
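The rewrite rule described above can be sketched in a few lines.  This is a
minimal illustration in Python, not Mathematica code; the `BigO` class and
coefficient-list polynomial representation are assumptions introduced here
for the sake of the example.

```python
# Sketch (hypothetical, not Mathematica) of the purely syntactic rule:
#   O(x^i) + p(x)  ->  O(x^i)   whenever deg(p) <= i.
# Polynomials are coefficient lists [c0, c1, ...]; BigO(i) stands for O(x^i).

from dataclasses import dataclass

@dataclass(frozen=True)
class BigO:
    """Symbolic O(x^i) term, identified by its exponent i."""
    exponent: int

def degree(poly):
    """Degree of a coefficient-list polynomial; -1 for the zero polynomial."""
    for d in range(len(poly) - 1, -1, -1):
        if poly[d] != 0:
            return d
    return -1

def simplify_sum(big_o, poly):
    """Apply the rule: O(x^i) + p(x) simplifies to O(x^i) if deg(p) <= i.

    Otherwise return the pair unchanged, i.e. no simplification applies.
    The rule is pattern matching on the expression, with no appeal to the
    quantifier-laden first-order meaning of O.
    """
    if degree(poly) <= big_o.exponent:
        return big_o
    return (big_o, poly)

# For example, O(x^3) + (1 + 2x + 5x^3) collapses to O(x^3),
# while O(x^2) + 7x^3 is left alone.
print(simplify_sum(BigO(3), [1, 2, 0, 5]))
print(simplify_sum(BigO(2), [0, 0, 0, 7]))
```

The point the example makes concrete is that the simplifier only compares an
exponent against a degree; it never unfolds what O(x^i) "really means".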

-- 
David Eppstein       UC Irvine Dept. of Information & Computer Science
eppstein@ics.uci.edu http://www.ics.uci.edu/~eppstein/
Received on Wednesday, 12 April 2000 13:33:53 GMT
