Fwd: Re: What are memes made of?

From: Robin Faichney (robin@faichney.demon.co.uk)
Date: Mon Jan 31 2000 - 18:57:50 GMT

  • Next message: Kenneth Van Oost: "Re:memetics-digest V1# 118"

    Received: by alpheratz.cpm.aca.mmu.ac.uk id TAA23629 (8.6.9/5.3[ref pg@gmsl.co.uk] for cpm.aca.mmu.ac.uk from fmb-majordomo@mmu.ac.uk); Mon, 31 Jan 2000 19:06:59 GMT
    From: Robin Faichney <robin@faichney.demon.co.uk>
    Organization: Reborn Technology
    To: memetics@mmu.ac.uk
    Subject: Fwd: Re: What are memes made of?
    Date: Mon, 31 Jan 2000 18:57:50 +0000
    X-Mailer: KMail [version 1.0.21]
    Content-Type: text/plain
    Message-Id: <00013118594402.00393@faichney>
    Content-Transfer-Encoding: quoted-printable
    Sender: fmb-majordomo@mmu.ac.uk
    Precedence: bulk
    Reply-To: memetics@mmu.ac.uk
    

    Forwarded to the list with Mark's permission. My response follows.

    ---------- Forwarded Message ----------
    Subject: Re: What are memes made of?
    Date: Sun, 30 Jan 00 21:58:48 -0000
    From: "Mark M. Mills" <mmills@htcomp.net>

    Robin,

    Thanks for posting your essay. I hope you continue with it.

    My primary criticism is the lack of clear foundations, particularly for
    the key term 'information.'

    The essay starts with a description of information as 'simply the form,
    or structure, of matter.' Though not stated, I'm assuming this reflects
    Frieden's use of the term. In the second sentence, you say: "Physical
    information is inversely proportional to entropy..."

    I'm assuming this is Frieden's sense of 'information.' I'm more familiar
    with Shannon information theory, so this confuses me. Shannon shows
    information capacity to be proportional to entropy (the higher the
    entropy, the higher the information capacity).

    C = H/(T/N - lambda)  (from Shannon's 1948 paper 'A Mathematical Theory
    of Communication')

    C = information capacity
    H = entropy
    T = time
    N = number of bits per signal
    lambda = noise term (I'm not entirely confident on this)
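
    To make that proportionality concrete, here is a minimal Python sketch,
    assuming the usual per-symbol entropy H = -sum(p_i * log2(p_i)); the
    probabilities are arbitrary, chosen only for illustration:

        # Shannon entropy in bits per symbol: the more uniform (higher-entropy)
        # the source, the more information each symbol can carry.
        from math import log2

        def entropy(probs):
            return -sum(p * log2(p) for p in probs if p > 0)

        print(entropy([1.0, 0.0]))   # 0.0   -- fully predictable, carries nothing
        print(entropy([0.9, 0.1]))   # ~0.47 bits per symbol
        print(entropy([0.5, 0.5]))   # 1.0   -- maximum entropy, maximum capacity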

    In Shannon's terms, the presence of noise (high entropy) increases the
    potential information content. For example, radio started with amplitude
    modulation, and frequency variation was treated as noise. FM technology
    turned that noise into signal, increasing radio's information capacity
    above that of AM technology.

    Since I'm familiar with Shannon information theory, making high entropy
    equal low information confuses me. I have to ask how fiber optics, which
    uses high-entropy light frequencies, ends up labeled low information.
    Alternatively, prior to the big bang, the universe was a massively compact
    speck of matter. Entropy was almost zero. In that state, there was no
    information contained in the universe (no history). How can this be
    described as a 'high information' state?

    Information, form, identity... all these terms are very difficult to
    define, but they are critical to your argument. I could probably get in
    step with your use of these terms; I just need definitions and maybe an
    example or two.

    Here are two words I found myself wishing you would use: isomorphism and
    'bit.' Isomorphism is central to any logical differentiation of one
    'identity' from another. When talking about parent and child 'carrying'
    a single gene, one has to be using some sense of isomorphism.

    Once isomorphism comes up, the idea of a 'bit' seems to follow. A bit is
    simply the smallest unit of isomorphic form.
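
    For what it's worth, here is a small Python sketch of that idea (the bit
    pattern and the two 'substrates' are invented, purely for illustration):
    the same form held in physically different substances compares equal by
    form alone, which is the only sense in which parent and child can carry
    one gene.

        # One bit pattern, two different substrates.
        pattern_in_bytes = bytes([0b1011010])   # stored as a single byte
        pattern_in_bools = [True, False, True, True, False, True, False]

        # Reduce both to pure form (a tuple of bits) and compare.
        form_a = tuple((pattern_in_bytes[0] >> i) & 1 for i in range(6, -1, -1))
        form_b = tuple(int(b) for b in pattern_in_bools)
        print(form_a == form_b)   # True: identical form, different substance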

    Back to the essay...

    It seems that one of your main points is the differentiation between
    substance isomorphism and formal isomorphism. Your comment that
    evolutionary biologists use a 'hierarchical, recursive definition, so that
    one higher level gene is composed of two or more lower level ones' was
    very, very interesting. Unfortunately, this theme was not explored
    explicitly. I would have loved to read more about it. I'm assuming it
    suggests that evolutionary biologists use formal isomorphism while
    molecular biologists use substance isomorphism.
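
    If I'm reading that right, the hierarchical view might look something like
    this rough Python sketch (the names and sequences are invented, purely for
    illustration): a higher-level 'gene' is just the composed form of
    lower-level ones, and identity is judged by that form rather than by any
    particular stretch of DNA.

        # A higher-level unit defined recursively as the composition of
        # lower-level ones; its identity is the composed form itself.
        def compose(*parts):
            return "".join(parts)

        low_a = "ATG"
        low_b = "GCC"
        higher = compose(low_a, low_b)    # one higher-level gene from two lower-level ones

        # Two physically distinct copies count as the same gene if the forms match.
        another_copy = compose("ATG", "GCC")
        print(higher == another_copy)     # True: same form, hence the same gene formally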

    Instead of elaborating on evolutionary biologists using multilevel genes,
    the essay goes into a discussion of form, identity and coding. It seems
    your goal is a proof that genes and memes are information, not physical
    substance. I'm not sure I'm reading you right, but that's the best I can
    make of it. You end the essay saying 'Memes, like genes, are encoded
    physical information, but exist at a higher level of organization.' This
    seems to be a restatement of the evolutionary biologists' view (the
    multilevel notion of the term 'gene').

    Thus, as best I can tell, the essay is really an attack on the
    conventional belief that chunks of DNA are literally genes.

    Well, that's all. Just some comments. I find the topics very important,
    so I like your paper. Keep working on it!

    Mark

    PS If you want me to post this reply to the list, let me know.

    --
    Robin Faichney
    

    ===============================================================
    This was distributed via the memetics list associated with the
    Journal of Memetics - Evolutionary Models of Information Transmission.
    For information about the journal and the list (e.g. unsubscribing) see:
    http://www.cpm.mmu.ac.uk/jom-emit



    This archive was generated by hypermail 2b29 : Mon Jan 31 2000 - 19:06:59 GMT