From: Keith Henson (email@example.com)
Date: Sun 02 Feb 2003 - 18:32:44 GMT
At 08:27 AM 02/02/03 -0800, you wrote:
>The information contained in a statue or painting or much of anything else
>is not measured in the number of atoms it contains.
Sure it can be. Not that it is all that useful, but you could create an
exact atomic copy if you knew the kind and placement of every atom. (And
had a nanofabrication device.)
>You can't say a big statue contains more information than a small
>statue. Especially if one is a copy of the other.
It certainly does. A small copy of a large statue could be reconstructed
atom for atom from a smaller amount of information.
>Or perhaps we should first agree on what you mean by information. I see
>it as what is being transferred from one human being to another.
Bits. Information is *measured* in bits. Transfer has nothing to do with
measurement. Humans, however, can only store a few bits per second, so
*many* gram-moles (Avogadro's number) of atomic information get *massively*
compressed. We would consider a copy of a statue of a human accurate to
0.01 percent to be identical to another one. They aren't, of course, but at
the compression we use, you store an essentially identical memory by
viewing either.
How much memory do I retain from seeing Rodin's _Burghers of Calais_ at
Stanford University a few years ago? (By bright moonlight I might
add.) Enough to remember they were strung out across a plaza and not in a
group the way they are here:
I also remember the hands and feet being exaggerated (even for the large
size of the sculpture). I remember enough to agree with this snippet from
... The posture expresses an attitude of utter grief and despair. Rodin
could have selected many scenes from the legend of "The Burghers of
Calais". ...
rubens.anu.edu.au/student.projects/garden/bofc/boc.html
(Well worth reading!)
>You may see it as anything that can be analyzed and some information
>discovered about it. But most of that information would not be memetic
>because it was not transferred from one mind to another. Of course, if you
>write a paper about that information, then it would become memetic.
I am sorry, but you just *can't* use words like Humpty Dumpty did in
_Through the Looking-Glass_!
At least not in a discussion where there are people who are into
scientific/engineering culture. There is a *deep* understanding of
"information" in mathematical and physical terms.
bit (b) 
the basic unit of information. Each bit records one of the two possible
answers to a single question: "0" or "1," "yes" or "no," "on" or "off."
Logically, this is the smallest quantity of information that can exist. The
word "bit", coined by the American statistician and computer scientist John
Tukey (1915-2000) in 1946, is a contraction of "binary digit".
bit (b) 
a logarithmic unit of storage capacity, equal to the base-2 logarithm of
the number of possible states of the storage device or location. If data is
stored as binary digits, this reduces to the previous definition: an 8-bit
storage location, for example, has 2^8 = 256 possible states, so its
capacity is log2(2^8) = 8 bits. If, however, a storage location stores one
letter, then it has 26 possible states, and its storage capacity is
log2(26) = 4.7004 bits.
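This storage-capacity definition is easy to check in a few lines of Python (the function name here is mine, for illustration):

```python
import math

# Storage capacity = base-2 logarithm of the number of possible states.
def capacity_bits(num_states: int) -> float:
    return math.log2(num_states)

print(capacity_bits(256))  # an 8-bit location: 2^8 = 256 states -> 8.0 bits
print(capacity_bits(26))   # one letter: 26 states -> about 4.7004 bits
```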
bit (b) 
a unit of information content, now properly known as the shannon: a unit
used in information and communications
theory. The definition is based on the idea that less-likely messages are
more informative than more-likely ones (for example, if a volcano rarely
erupts, then a message that it is erupting is more informative than a
message it is not erupting). If a message has probability p of being
received, then its information content is -log2(p) shannons. For example,
if the message consists of 10 letters, and all strings of 10 letters are
equally likely, then the probability of a particular message is 1/26^10 and
the information content of the message is 10 * log2(26) = 47.004 shannons.
This unit was originally called the bit , because when the message is a
bit string and all strings are equally likely, then the information content
turns out to equal the number of bits. One shannon equals log10(2) =
0.301030 hartley or ln(2) = 0.693147 nat. The unit is named for the American
mathematician Claude Shannon (1916-2001), the founder of information theory.
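The -log2(p) definition above can be sketched the same way (again, the function name is mine):

```python
import math

# Information content, in shannons, of a message received with probability p.
def shannons(p: float) -> float:
    return -math.log2(p)

# All 26**10 ten-letter strings equally likely:
p = 1 / 26 ** 10
print(shannons(p))    # about 47.004 shannons, i.e. 10 * log2(26)
print(shannons(0.5))  # a fair coin flip: exactly 1 shannon
```

Note how the rare-event intuition falls out directly: the smaller p is, the larger -log2(p) becomes.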
Lucent - Information Theory
... In 1948, Bell Labs scientist Claude Shannon developed Information
Theory, and the world of communications technology has never been the same.
... Description: A basic introduction and history of information theory
from Bell Labs.
www.lucent.com/minds/infotheory/
A Mathematical Theory of Communication
... was reproduced in the collection Key Papers in the Development of
Information Theory. The paper also appears in Claude Elwood Shannon:
Collected Papers [3 ... Description: Claude Shannon's seminal paper, made
available by Bell Labs in PostScript and PDF.
cm.bell-labs.com/cm/ms/what/shannonday/paper.html
If you want to redefine "square root" in the context of a discussion on
mathematics you have to expect people to object.
I think what you are trying to do is discuss perception of something like a
painting or statue in cultural (meme pool) terms. As you may be able to
tell, this is something I really appreciate. But please be careful about
using words that have specific and long established technical definitions.
This was distributed via the memetics list associated with the
Journal of Memetics - Evolutionary Models of Information Transmission
For information about the journal and the list (e.g. unsubscribing)
This archive was generated by hypermail 2.1.5 : Sun 02 Feb 2003 - 18:56:41 GMT