Date: Wed, 9 May 2001 10:16:45 +0100
From: Robin Faichney <email@example.com>
To: firstname.lastname@example.org
Subject: Re: Information
In-Reply-To: <3AF5ECA7.4F801F8A@wehi.edu.au>; from wilkins@wehi.EDU.AU on Mon, May 07, 2001 at 10:30:30AM +1000
On Mon, May 07, 2001 at 10:30:30AM +1000, wilkins wrote:
> Robin Faichney wrote:
> > Fisher information, as the New Scientist article puts it, captures how
> > much information you can squeeze out of a physical system. What Frieden
> > has shown is that using Fisher information (I) and the information
> > inherent in the system (J), the laws of physics can be derived. So I
> > think "the Fisher Information account of measurement that Frieden
> > proposes" is misleading. J is information as the concept occurs in
> > thermodynamics, i.e. a measure of the structure of matter, and that is,
> > in one sense at least, more fundamental than Fisher information.
> A friend is doing his PhD on Frieden, which is why I know his work. The
> NS article was very vague and according to my friend, the critical
> measure *is* I, because J is inaccessible (a bit like Chaitin's Omega,
> but for distinct philosophical reasons). I is what is known in
> statistics through the Cramer-Rao bound: the limit on the variance,
> and hence the accuracy, of any unbiased estimate.
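For what it's worth, that Cramer-Rao relationship can be sketched numerically. For estimating the mean of a Gaussian with known spread sigma, the Fisher information per sample is 1/sigma^2, and the variance of the sample mean over n samples attains the bound sigma^2/n. This is just the textbook illustration, not Frieden's derivation:

```python
import random
import statistics

# Cramer-Rao illustration: estimating the mean mu of a Gaussian with
# known sigma.  Fisher information per sample is I = 1/sigma^2, so any
# unbiased estimator built from n samples has variance >= sigma^2 / n.
# The sample mean attains this bound.
random.seed(0)
mu, sigma, n, trials = 0.0, 2.0, 50, 2000

estimates = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(statistics.fmean(sample))

observed_var = statistics.pvariance(estimates)
cramer_rao_bound = sigma ** 2 / n          # = 1 / (n * I)
print(observed_var, cramer_rao_bound)      # observed_var ~ bound
```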
I said J is "more fundamental than Fisher information". You say I is
"the critical measure". Neither phrase is very precise -- they could,
conceivably, both be true -- and I'm happy to drop any suggestion of
I/J competition. What is important to me is that, using *both* of
these, and publishing in peer-reviewed physics journals, Frieden and
colleagues have *successfully* derived the laws of physics.
> > > 2. The Shannon-Weaver account that makes information of a sequence its
> > > (prior) probability of being encountered
> > There are close connections between information in communication theory
> > and in thermodynamics:
> > Information transmission has been defined as the change in
> > probability of an event in one ensemble resulting from the
> > occurrence of an event in another. Interactions of this sort
> > typically involve a transformation of energy. Any transformation
> > of energy, conversely, is accompanied by changes in probability
> > among associated events. Thus thermodynamics, the study of energy
> > transformations, might be expected to bear a close relationship
> > to communication theory.
> > Even a summary discussion of this relationship would involve more
> > mathematical resources than presently at our disposal. The upshot
> > of such a discussion, however, would be that communication theory
> > provides a basis upon which thermodynamics can be systematically
> > developed. Relying upon earlier work by Maxwell and Boltzmann, on
> > how to compute the properties of gases by statistical methods, and
> > upon subsequent work by Gibbs and Planck showing how the results
> > of classical thermodynamics can be got from quantum theory through
> > these statistical methods, Jaynes was able to show in the late
> > 1950s how these same results could be got more perspicuously on
> > the basis of communication theory. Given Shannon's formulae it is
> > now possible, as Tribus puts it, "to begin by considering the
> > properties of the smallest particles and by simple mathematical
> > methods to deduce the properties" of macroscopic systems.
> > A clue to the relationship between thermodynamics and
> > communication theory is that both employ "entropy" as a technical
> > term, with definitions that bear a close formal resemblance.
> > [Cybernetics and the Philosophy of Mind, Kenneth Sayre, pp36-7]
> > Citation available on request.
> > The quote at the top is also from Sayre (p22).
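The "close formal resemblance" Sayre mentions is easy to exhibit: Shannon's H = -sum(p log p) and Gibbs's S = -k_B sum(p ln p) differ only by a constant factor and the base of the logarithm. A minimal sketch, using an arbitrary made-up distribution:

```python
import math

# Formal resemblance of the two "entropies": over the same probability
# distribution, Gibbs entropy S = -k_B * sum(p * ln p) is just Shannon
# entropy H = -sum(p * log2 p) rescaled by k_B * ln 2.
K_B = 1.380649e-23  # Boltzmann's constant, J/K

p = [0.5, 0.25, 0.125, 0.125]  # any distribution over microstates

H = -sum(pi * math.log2(pi) for pi in p)       # Shannon entropy, bits
S = -K_B * sum(pi * math.log(pi) for pi in p)  # Gibbs entropy, J/K

print(H)                        # 1.75 bits
print(S / (K_B * math.log(2)))  # same 1.75: identical up to a constant
```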
> Not to put too fine a point on it, much crap has been written about the
> relation between S-W info and thermo. The relationship is purely one of
> mathematical analogy.
Sayre makes some quite specific claims in the quote above, and I'm
not clear to what extent and in what way you think these are wrong.
For instance, do you believe it is *not* possible, using Shannon's
formulae, "to begin by considering the properties of the smallest
particles and by simple mathematical methods to deduce the properties"
of macroscopic systems?
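As I understand the Jaynes programme, the move is this: maximise Shannon entropy subject to a constraint on mean energy, and the Boltzmann distribution p_i proportional to exp(-E_i/kT) falls out, from which macroscopic quantities follow. A toy sketch, using three made-up energy levels rather than any particular system:

```python
import math

# Jaynes's maximum-entropy route to the Boltzmann distribution:
# among all distributions over energy levels E_i with a given mean
# energy, the one maximising Shannon entropy is p_i ~ exp(-beta*E_i).
E = [0.0, 1.0, 2.0]   # toy energy levels (arbitrary units)
beta = 1.0            # 1/kT in the same units

Z = sum(math.exp(-beta * e) for e in E)      # partition function
p = [math.exp(-beta * e) / Z for e in E]     # Boltzmann distribution

def entropy(dist):
    return -sum(x * math.log(x) for x in dist if x > 0)

mean_energy = sum(pi * e for pi, e in zip(p, E))

# Any other distribution with the SAME mean energy has lower entropy.
# With E = [0, 1, 2], adding eps*[1, -2, 1] leaves the mean unchanged.
eps = 0.01
q = [p[0] + eps, p[1] - 2 * eps, p[2] + eps]

print(mean_energy)
print(entropy(p), entropy(q))  # entropy(p) > entropy(q)
```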
> Brillouin wrote an excellent book back in the 60s,
> I believe, in which he effectively destroyed the idea that they are
> commensurate or covariant measures. He used a lovely argument about
> Maxwell's Demon that I think you ought to look at - in sum, for the
> Demon to work, he needs to employ energy to get what is effectively the
> Fisher Info, but not the SW info.
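Brillouin's accounting can at least be put in numbers. His negentropy principle (and Landauer's later refinement of it) prices each bit the Demon acquires at no less than kT ln 2 of free energy, which is what keeps the second law intact. A back-of-the-envelope figure at room temperature:

```python
import math

# Brillouin / Landauer accounting: acquiring (or erasing) one bit of
# information costs at least k_B * T * ln(2) in free energy, which is
# what stops Maxwell's Demon beating the second law.
K_B = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0            # room temperature, K

cost_per_bit = K_B * T * math.log(2)   # joules per bit
print(cost_per_bit)                    # ~2.87e-21 J
```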
I quite admit that this stuff is pushing my boundaries, but I've seen
nothing yet to make me doubt the one point upon which my philosophy
depends, that it is entirely valid, and can be very fruitful, to treat
the structure of matter as information -- where that word, obviously,
does *not* imply semantic content. I use Frieden and thermodynamics
as illustrations of this, and though I do not directly rely on either
for my own work, I obviously need to do some research to confirm that
they're good examples.
> > > 4. the semiotic or intentional account of the Peircians, Meinongians and
> > > other representationalists.
> > >
> > > So far as memetics is concerned, only the first three are relevant (it
> > > matters not a whit if the information being transmitted is true,
> > > coherent or in any way of significance to any audience, so long as it
> > > spreads through a population).
> > >
> > > If something is a measurement of some state distinct from the observer,
> > > then that information (ie, the error implicit in the measurement) is a
> > > physical mapping of what's in the head to what's in the world. However,
> > > it fails to be memetic information until it is transmitted, and then
> > > senses 2 and 3 come into play, so we can ignore the two extremes:
> > > "objective" information in the sense of accuracy of measurement, and
> > > "subjective" information in terms of what something means within the
> > > head of a semantically or semiotically capable system (ie, some person)
> > > and concentrate instead on the dynamics of information transmission and
> > > the evolution of the signals so transmitted.
> > I think that's all true, regarding memetics itself. What I'm
> > more interested in though is the philosophy of memetics, and the
> > relationship(s) between these different concepts of information, including
> > 4. How, exactly, is semantic information "encoded" in Shannon-Weaver and
> > thermodynamic information? In memetic terms this means how, exactly,
> > do memes as behavioural patterns relate to memes as ideas and such?
> > More generally, this is about the relationship between mind and matter.
> > I think I'm making real progress on that, but I know not everyone agrees!
> There is a philosophical conundrum known as the "linguistic prison" - in
> order to discuss things we must cast them in linguistic terms, but we are
> discussing in the areas of truth and mind matters that cannot be so cast
> without prejudicing the argument - in short, whereof one cannot speak,
> thereof one must be silent (and as some wag had it, "and you can't
> whistle it either").
I don't think throwing one's hands up is a very useful tactic. Whereof
one cannot speak in perfectly explicit, objective, literal terms,
one can use analogy, metaphor, intersubjectivity, etc. That's what
they're for. To use a metaphor about metaphor: in effect, you *can*,
sometimes, whistle it! We're not all confined to the straitjacket
of strictly scientific methodology. We can use the tools of culture,
even as we talk about it. And we do!!!
> How meaning is encoded is very much context
> relative - see Dretske's book. The encoding protocol used determines the
> meaning of a message at receiver.
Of course it's context relative. Everything about meaning is! But my
interest is at a more abstract level -- admittedly, 'How, exactly,
is semantic information "encoded"' is misleading -- what I should have
said there was "what exactly does 'encoded' mean here?"
> Brillouin, Leon. Science and Information Theory. New York: Academic
> Press, 1956.
Sayre cites this title frequently. The year is given as 1962, which is
presumably a later edition, and not the 60s work you mention above.
> Dretske, Fred I. Knowledge and the Flow of Information. Cambridge,
> Mass.: MIT Press, 1981.
I have read that, and given the title's close association with my
interests, I found it very disappointing.
> The disanalogy of thermo and SW info is discussed in
> Pierce, John Robinson. An Introduction to Information Theory : Symbols,
> Signals and Noise. 2nd, rev. ed. New York: Dover Publications, 1980.
I will definitely seek this one out. Thanks a lot.
If anyone is interested, in the light of what's come up in this thread,
I'd be very keen to see any comments on my use of information -- see
--
Robin Faichney
Get your Meta-Information from http://www.ii01.org
(CAUTION: contains philosophy, may cause heads to spin)
===============================================================
This was distributed via the memetics list associated with the
Journal of Memetics - Evolutionary Models of Information Transmission
For information about the journal and the list (e.g. unsubscribing) see:
http://www.cpm.mmu.ac.uk/jom-emit