Story Of Information

¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤

Arisbeans, Resourcerors, ...

It occurs to me that this reflection of Tor Arnesen's lights
may serve a useful function in a very slightly wider ellipse.

¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤

Re:

| "This paper is based upon the theory already established, that the function
| of conceptions is to reduce the manifold of sensuous impressions to unity"
| (Peirce, "On a New List of Categories").  But, cognition does not seem
| to be given a vital "life support" function, so to speak.

What do you think happens to a living creature who cannot
"reduce the manifold of sensuous impressions to a unity"?

Do you think such a creature is long for this world?

<...>

What I have for a long spell now found of interest
is the relationship between this formative concept
of concept-formation, whose refrain Peirce took up
from its harmonious pre-establishment in the tones
of Kant, and what our contemporary hue and cry now
heralds as the harbingers of the recent revolution,
that is, the notion of Information, Our Hero, that
is to follow, and the legend of its secular strife
with Uncertainty, Our Nemesis, whose story is here:

¤~~~~~~~~~¤~~~~~~~~~¤~~STORY~~¤~~~~~~~~~¤~~~~~~~~~¤

I do not know how others may comprehend the name "Information",
but for me it gets its meaning in a particular kind of context.

In the setting where this notion of information makes sense to me,
we have, as a part of the overall background setting, a measure of
a quantity called "entropy" or "uncertainty".  This is a measure on
distributions ("frequency distributions" or "probability densities").

The distribution F : X -> R is a function from a sample space X
to the real numbers R, such that the real value F(x) in R can be
interpreted as the probability that x will happen.  (More carefully,
this interpretation only works for discrete frequency distributions,
but that is more or less the general idea.)  (Also, it is traditional
to use a capital omega for the sample space, for "outcomes" or perhaps
for "occurrences", I think, but here I will have to sign it with an "X".)

The entropy or uncertainty measure M is thus a function of the
type M : (X -> R) -> R, and it satisfies some additional axioms
that make it a decent formalization of our intuitive notions of
doubt or uncertainty in the situations that are described by the
distributions of type (X -> R).
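The standard instance of such a measure M is Shannon entropy.  As a
minimal sketch (the distribution F below is a made-up example, not one
from the text, and I assume the discrete case throughout):

```python
import math

# A discrete distribution F : X -> R, rendered as a dict that sends
# each outcome in the sample space X to its probability in R.
# (Hypothetical example data.)
F = {"a": 0.5, "b": 0.25, "c": 0.25}

def entropy(dist):
    """Shannon entropy, a measure M : (X -> R) -> R, in bits.

    H(F) = -sum over x of F(x) * log2 F(x), skipping zero-probability
    outcomes.  This M satisfies the usual axioms (e.g. it is maximal on
    the uniform distribution and zero on a point mass), which is what
    qualifies it as a formalization of "doubt or uncertainty".
    """
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# entropy(F) -> 1.5 bits for the distribution above.
```

A point mass like `{"a": 1.0}` gives entropy 0, matching the intuition
that a situation with only one possible outcome holds no uncertainty.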

In this setting, if we can say what our measure of uncertainty
would be both before and after receiving a particular sign, say,
by way of making an observation or by way of some other courier,
then we can define a quantity called the "information capacity"
of the set of signs at issue.  This information capacity is the
"Average Uncertainty Reduction On Receiving Each Sign" (AURORES).
Thus comes the dawn of information theory.
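That "average uncertainty reduction on receiving each sign" can be
computed directly: take the entropy of the prior distribution over
outcomes, subtract the entropy of the posterior given each sign,
and average over the signs.  A minimal sketch, using a hypothetical
joint distribution over (outcome, sign) pairs of my own devising:

```python
import math
from collections import defaultdict

def H(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution over (outcome, sign) pairs:
# a noiseless binary channel in which each sign-token, once received,
# identifies its outcome exactly.
joint = {("x0", "s0"): 0.5, ("x1", "s1"): 0.5}

def info_capacity(joint):
    """Average uncertainty reduction on receiving each sign:

    I = H(prior over outcomes)
        - sum over signs s of p(s) * H(posterior over outcomes given s)
    """
    px = defaultdict(float)   # prior (marginal) over outcomes
    ps = defaultdict(float)   # marginal over signs
    for (x, s), p in joint.items():
        px[x] += p
        ps[s] += p
    prior_H = H(px.values())
    post_H = 0.0
    for s, p_s in ps.items():
        posterior = [p / p_s for (x, t), p in joint.items() if t == s]
        post_H += p_s * H(posterior)
    return prior_H - post_H

# For the noiseless channel above, the prior uncertainty is 1 bit and
# each sign removes all of it, so info_capacity(joint) -> 1.0 bit.
```

For a noisy channel the posterior entropies stay above zero, and the
quantity drops accordingly, down to zero when the signs tell us nothing
about the outcomes.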

The way I understand it, sign-tokens are physical things --
they are actually another set of "outcomes" that "occur"
in the real world, and so they have to have some sort of
physical basis, but sign-types are classes of sign-tokens,
what statisticians call "events", that is, subsets
of some sample space, and so they have an abstract quality
to them.  We "typically" intend the token as a representative
of its class, so there is almost always a slight ambiguity here.

So if I hear folks talking about a category of being that
they call "information", I have to stop and say to myself:

Okay, they mean a sign that is given in a setting where it
possesses and potentially conveys a quantity of information.

¤~~~~~~~~~¤~~~~~~~~~¤~~YROTS~~¤~~~~~~~~~¤~~~~~~~~~¤

Exercise for the Reader -- Compare and Contrast:

| The function of a conception is to reduce the
| manifold of sensuous impressions to integrity.

| The function of a sign or signal is to reduce the
| measure of uncertainty in significant indications.

In this setting, "significant" = "vital".

Eccentric Regards,

Jon Awbrey

¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤~~~~~~~~~¤

Received on Wednesday, 24 January 2001 23:30:44 UTC