Re: [ontolog-forum] One new English word every 98 minutes

Adrian, it's really a funny service.

I concur with John that monitoring "the number of words in the English 
Language" is an otiose and confusing job. For it is no more than a 
statistical count of new meaning situations: "each word was analyzed to 
determine which depth (number of citations) and breadth (geographic extent 
of word usage), as well as number of appearances in the global print and 
electronic media, the Internet, the blogosphere, and social media." 
http://www.languagemonitor.com/no-of-words.

Unlike other creatures (aliens excluded), humans are able to create meanings, 
new meanings and senses, by devising novel things: substances, materials, and 
systems; states, situations, and conditions; events, changes, and processes; 
and relationships. Such newfound aspects of the world are labeled and 
registered as language constructions: terms, words, phrases, compounds, and 
sentences, mostly through the standard semantic techniques of meaning change: 
specialization and generalization, radiation, transference or projection, 
degrading and upgrading, and tropes such as hyperbole, figurative extension, 
etc.

This "really funny service" has recently registered 'financial tsunami' as 
1m and more word. Actually, this "new meaning situation" is a compound of 
two things, financing and calamity (disaster, cataclysm, tragedy, or 
catastrophe), "an event resulting in great loss and misfortune". Then, 
depending on your imagination, there may be many subclasses here: "financial 
meltdown", "financial apocalypse", "financial plague", "financial 
visitation", etc. It is plain that such senses could be mechanically 
generated by any effective semantic applications, which would define it 
simply as "global financial crime" asking for the proper punishment.

Still, this nonsense of wordy games is thriving, bringing a lot of money to 
domain name registrars by, for example, selling empty names. The whole 
concept of domain name sales is false, for the number of domain names is 
virtually infinite; they should be distributed free. Instead, it is the 
meaningful URI names, which are hard to construct, that should be 
distributed to the public, or better, sold.

Presently, a domain holder (a domain owner) can sell a virtually infinite 
number of subdomains under its domain name.  For example, the owner of 
example.com could provide such subdomains as foo.example.com, 
foo.bar.example.com, and so on, which is a flat commercial and semantic 
absurdity to be corrected for our social benefit. GoDaddy (my web sites' 
host) alone has registered about 30,000,000 domain names. Now consider the 
new Internet protocol version, IPv6, with 3.4×10^38 unique addresses.
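
The arithmetic behind both claims is easy to check; a minimal sketch in 
Python follows (the nesting depth of three is an arbitrary illustration):

    # IPv6 addresses are 128 bits wide, so the address space is 2**128.
    print(f"{2 ** 128:.1e}")  # -> 3.4e+38, i.e. about 3.4 x 10^38

    # Subdomain labels nest under a registered domain without further
    # registration, so the subdomain space is effectively unbounded.
    def nest(label, parent, depth):
        """Build a chain of subdomains, e.g. foo.foo.foo.example.com."""
        name = parent
        for _ in range(depth):
            name = f"{label}.{name}"
        return name

    print(nest("foo", "example.com", 3))  # -> foo.foo.foo.example.com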

We have to advance a new, meaningful Internet and WWW, founded on a 
knowledge infrastructure: ontological top-level URIs and domain URIs, 
generic top-level ontologies and domain ontologies, semantic metadata, and 
effective reasoning mechanisms, thus building a meaningful global 
cyberspace with an intelligent web.

See STRATEGIC GOALS on http://www.semanticwww.com:

1. Assigning globally unique and permanent URIs to all entities: 
information resources (network-retrievable entities, such as web resources) 
and non-information resources (network-irretrievable entities, such as 
individuals, collections, and abstractions), using the existing DNS 
(see the sketch after these goals).

2. Developing a universal name space, such as .world or .entity or .thing, 
which is to systematically order all existing namespaces, such as .com, 
.net, .org, .biz, .info, .name, and .pro, and newer sponsored ones, such as 
.gov, .edu, .int, .jobs, .tel, .mil, .travel, .mobi, .cat, etc.
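
A minimal sketch of how such URIs might be minted under the proposed 
universal name space; the .world ending comes from goal 2 above, and the 
mint_uri() helper and the finance.world domain are hypothetical, not any 
deployed scheme:

    from urllib.parse import quote

    # Hypothetical: mint a globally unique, permanent URI for an entity
    # under the proposed universal '.world' name space.
    def mint_uri(domain_ontology, entity_name, tld="world"):
        """Mint a permanent URI for an entity under a domain ontology."""
        slug = quote(entity_name.lower().replace(" ", "-"))
        return f"http://{domain_ontology}.{tld}/{slug}"

    print(mint_uri("finance", "financial tsunami"))
    # -> http://finance.world/financial-tsunami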


Azamat Abdoullaev

http://www.standardontology.org





----- Original Message ----- 
From: "John F. Sowa" <sowa@bestweb.net>
To: "[ontolog-forum]" <ontolog-forum@ontolog.cim3.net>
Cc: "SW-forum" <semantic-web@w3.org>
Sent: Thursday, June 11, 2009 12:41 AM
Subject: Re: [ontolog-forum] One new English word every 98 minutes


Adrian, Rich, Joel, and Martin,

The goal would be more impressive if the result weren't so silly:

 > The Global Language Monitor today announced that 'Web 2.0' has
 > bested 'Jai Ho', 'n00b', and 'slumdog' as the 1,000,000th English
 > word or phrase added to the codex of the fourteen hundred-year-old
 > language.

Source:

Trying to define the boundary of what is or is not an English word is
as difficult or ridiculous as defining a boundary that distinguishes
a river from all the wetlands, streams, swamps, and flood plains it
travels through, near, around, or over.

Some people say that a language consists of the totality of all
its dialects.  But that raises the question of how one can define
a dialect or distinguish two similar dialects.

I like Wittgenstein's definition of a language as the totality of all
language games that are played with a given vocabulary.  But that
raises the question of how one defines a vocabulary or a language game.

AW> So, does this strengthen the case for controlled vocabularies,
 > or indicate that the task of controlling the English language is
 > hopeless?

RC> No, it means that the "control" mechanism must be dynamic and
 > adaptable.

It's clearly hopeless to control any natural language, and any attempt
to stop the unbridled growth is counterproductive.  The attempts by
l'Académie Française were as effective as building a picket fence
to stop a tidal wave.

But controlled natural languages are a very natural development
for creating new language games for special purposes.

Aristotle's use of Greek for syllogisms is a well-known example
that was adopted and extended by Euclid and other mathematicians.
Other examples include the special language games created by text
messagers, baseball umpires, religious preachers, diplomats,
tobacco auctioneers, and parents who talk to their babies.

JW> ... I just end up sounding like a long winded n00b.

If I took the Language Monitor gang seriously, I would point
out that 'Web 2.0' made it into the language, but 'n00b' didn't.

MH> I did some quantitative studies (preliminary) trying to assess
 > what typical delays for consensus in standardization of terminology
 > or structures mean for the coverage of the current vocabulary in
 > the artifact specifying the vocabulary.

Such studies are important for understanding how terminologies grow.
We can't stop the growth, but if we get out front, we might be able
to guide some of the rivulets into promising directions.

John Sowa
