Re: citing Shannon on Information Theory

Just to be clear:  I am not an expert in information theory, and it's possible 
that Shannon's is not quite the right work to reference.  Whatever the 
reference may be, the concept I have in mind is not statistical.  One bit 
can convey a choice between two states; two bits, a choice between four 
states.  Given some context, I can associate some of these states 
with real world choices such as the choice between possible sequences of 
Unicode characters, countable numbers, etc.  My probably over-simplified 
assumption is that Shannon's work allows one to ask:  given a channel with 
certain characteristics, what is the statistical probability that I can 
convey with full fidelity a message of n bits?  Perhaps I have 
misunderstood.  In any case, as I've said before, I am NOT suggesting that 
we refer to the statistical aspects of Shannon's theory, which are in fact 
its essence.  I am suggesting that we refer to what I had presumed was a 
precondition for applying his theory, i.e. that one has in hand a set of 
perfectly determined bits that we might wish to call information.  This is 
surely closely associated with the "entropy" referenced in the Wiki entry, 
but I too have insufficient depth in this area to fill in the blanks. 
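To make the non-statistical point above concrete, here is a minimal sketch (my own illustration, not anything from Shannon's paper) showing that n perfectly determined bits distinguish among 2^n states, and that the Shannon entropy of a uniform choice among those states comes out to exactly n bits.  The function name `entropy_bits` is just a label for this example.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# n determined bits can convey a choice among 2**n states; a uniform
# choice among those states carries exactly n bits of entropy.
for n in (1, 2, 8):
    states = 2 ** n
    h = entropy_bits([1 / states] * states)
    print(f"{n} bit(s) -> {states} states, entropy = {h} bits")
```

This is only the counting precondition; the statistical heart of the theory (channel capacity, noise, coding) is a separate matter, as noted above.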

If I have misused Shannon's terminology or misunderstood the theory please 
accept my apologies for any confusion caused.  Perhaps someone more 
knowledgeable in this area can help us out.

--------------------------------------
Noah Mendelsohn 
IBM Corporation
One Rogers Street
Cambridge, MA 02142
1-617-693-4036
--------------------------------------

Dan Connolly <connolly@w3.org>
Sent by: www-tag-request@w3.org
10/15/04 06:47 PM

        To:     www-tag@w3.org
        cc:     (bcc: Noah Mendelsohn/Cambridge/IBM)
        Subject:        citing Shannon on Information Theory


Further to...

"NM suggested that Claude Shannon's work on information theory might
provide a suitable definition"
  -- http://www.w3.org/2001/tag/2004/10/05-07-tag#infores2

I'm still not to the point where I understand well enough
to cite Shannon's work, but I think I know what the canonical
work is now:

"It is generally believed that the modern discipline of information
theory began with the publication of Shannon's article "The Mathematical
Theory of Communication" in the Bell System Technical Journal in July
and October of 1948."
  -- http://en.wikipedia.org/wiki/Information_theory

I'm slowly swapping in that part of Wikipedia.


-- 
Dan Connolly, W3C http://www.w3.org/People/Connolly/

Received on Saturday, 16 October 2004 20:07:06 UTC