From: Dace (edace@earthlink.net)
Date: Fri 06 Feb 2004 - 19:21:53 GMT
> From: Keith Henson <hkhenson@rogers.com>
>
> > > But unless you are going to argue for disembodied
> > > spirits, minds are utterly dependent on brains.
> >
> >Take a look at what you're saying here. You've got mind, and you've got
> >brain. That makes two, right?
>
> Talk about apples and oranges. No. You certainly would not say that about
> a computer and the OS running on the computer.
Sure you would. The OS is one thing, and the hardware is a second thing,
the first thing being dependent on the second.
> > > The situation is identical
> > > to the OS of a computer. It absolutely has to be running on hardware for
> > > you to interact with it.
> >
> >If we're going to talk in metaphors, a more accurate one would be a coin.
> >That we distinguish "heads" from "tails" doesn't mean we've got two coins.
> >It's one coin viewed from two angles.
>
> This makes no sense as a metaphor. Different levels. Minds are at a
> different level from brains.
In other words, minds are different from brains.
> Would it help you understand my viewpoint if
> I say the underlying hardware could be changed to a silicon simulation of
> the brain circuits and you could still interact with the same mind? We
> can't do this with human minds yet,
Sure we can. We're doing it right now. There's a continual turnover of
components in the brain. One protein dissolves and is replaced by another.
The brain keeps getting replaced, all the way down to the atomic level, yet
we're still here. That's because the point of an organic system is the
whole, not the parts; the mind, not the matter.
> >The important
> >thing is not what's in the brain but what we find when we "flip it over" and
> >view it, from within, as mind. While every brain shows something different,
> >every mind reveals the same meme. Unfortunately, there's no objective way
> >of viewing a mind.
>
> That's like saying there is no way to objectively view an OS, something
> that is done every day.
Which might suggest that an OS isn't the same as a mind.
> Minds are also judged objectively for being sane
> by medical people and judges and for being smart, deluded and any number of
> other characteristics.
While psychiatry is about as exact a science of mental illness as you'll
ever get, there's always an element of subjectivity.
> You apply statistics and other ways to measure signals in noise. If the
> results are still subjective and imprecise, then you are not dealing with
> science.
And this goes for the human sciences as well, but the underlying phenomena
represented by the statistics are inherently subjective.
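For what it's worth, the statistical trick you mention, measuring a signal
buried in noise, can be sketched in a few lines of Python. Everything here
is illustrative (the sine "signal", the noise level, the helper names are
all made up for the example); the point is only that averaging many noisy
trials recovers a signal that no single trial reveals.

```python
import random, math

random.seed(1)

# A weak, fixed "signal": amplitude 1, buried under much larger noise.
signal = [math.sin(2 * math.pi * t / 20) for t in range(100)]
noise_sd = 10.0  # noise standard deviation dwarfs the signal

def noisy_trial():
    # One noisy observation of the signal.
    return [s + random.gauss(0, noise_sd) for s in signal]

def rms_error(est):
    # Root-mean-square error between an estimate and the true signal.
    return math.sqrt(sum((e - s) ** 2 for e, s in zip(est, signal)) / len(signal))

one = noisy_trial()
trials = [noisy_trial() for _ in range(400)]
avg = [sum(t[i] for t in trials) / len(trials) for i in range(len(signal))]

# Averaging N independent trials shrinks the noise by roughly sqrt(N),
# so 400 trials should cut the error about 20-fold.
print(round(rms_error(one), 2), round(rms_error(avg), 2))
```

Whether the thing being measured this way is itself objective is, of
course, exactly the point in dispute.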
> > > >This would entail knowing the language
> > > >your brain uses, so you would have to know not only English but brainese.
> > >
> > > No because the brain's deep hardware does this far below the level of our
> > > conscious awareness.
> >
> >While this is purely fanciful, even if it were true, it would only push the
> >question back a step. How, then, do you communicate with your brain's deep
> >hardware? Now you have to know deep brainese.
>
> You really should read _Society of Mind_ by Minsky and _The Social Brain_
> by Gazzaniga. You don't even have to know you *have* a brain to
> communicate with brain hardware.
That's the problem right there. Aristotle thought the seat of intelligence
was the heart. This is standard in traditional societies the world over.
How do we communicate effectively with something we have no direct awareness
of? Is this something that evolved deep in our phylogenetic history? Are
there worms that are aware of their brains and communicate with them
directly, while for us it's just unconscious habit?
> >The point is that there's only one of you.
>
> That's not actually true. It *seems* like there is only one of you, but
> that "one" has very little idea of all the activity contributing to what
> seems like a unitary experience.
E pluribus unum.
> Reading Gazzaniga and Sacks on various
> brain injuries and experiments will give you an idea of what actually goes
> on.
I'm familiar with Sacks, though I find Ramachandran more intriguing.
Fascinating stuff, none of which disproves the existence of a unitary whole
that brain activity is geared to maintain.
> > > I am a hard line materialist.
> >
> >And yet you're an evolutionist, both natural and cultural. Consider the
> >following passage from Alfred North Whitehead:
> >
> >"A thoroughgoing evolutionary philosophy is inconsistent with
> >materialism. The aboriginal stuff, or material, from which a materialistic
> >philosophy starts is incapable of evolution... Evolution, on the
> >materialistic theory, is reduced to the role of being another word for the
> >description of changes of the external relations between portions of matter.
> >There is nothing to evolve, because one set of external relations is as good
> >as any other set... There can merely be change, purposeless and
> >unprogressive. But the whole point of the modern doctrine is the evolution
> >of the complex organisms from antecedent states of less complex organisms.
> >The doctrine cries aloud for a conception of organism as fundamental for
> >nature." (*Science and the Modern World,* ch. 6).
>
> Just because an author is famous does not prevent him from being seriously
> confused, wrong or just FOS.
>
> >It's not matter that evolves but form.
>
> That's true.
Which is exactly what Mr. FOS was saying.
> In more detail, it is the information contained in the
> form.
Again, you're needlessly complexifying the issue. Information *is* form,
specifically form that's nonrandom in a given context.
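One crude way to picture "form that's nonrandom in a given context" is
compressibility. This little Python sketch (the byte strings are just
arbitrary examples I made up) uses zlib: patterned form compresses well
because it carries redundant structure; random bytes barely compress at
all.

```python
import zlib, random

random.seed(3)

patterned = b"ABCD" * 250  # 1000 bytes of highly ordered form
noise = bytes(random.randrange(256) for _ in range(1000))  # 1000 formless bytes

# Compressed size is a rough proxy for how much nonrandom structure
# is present: order compresses well, randomness stays near full size.
print(len(zlib.compress(patterned)))  # small
print(len(zlib.compress(noise)))      # close to 1000
```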
> >Instead of reducing form to matter,
> >as in the case of machines, we must recognize that living matter is
> >subservient to its form.
>
> I can't buy that there is any fundamental difference between machines and
> "living matter." If you take a fine enough look at living things, they
> *are* molecular machines.
If you take a fine look at living things, you find total disorder, utter
randomness. Just like a gas cloud. According to Boltzmann, the physicist
who provided the first account of molecular randomness over a century ago,
whatever order we do find in a gas is exactly what we'd expect to find given
that some nonrandomness is bound to randomly arise.
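Boltzmann's point can be illustrated with a toy model, with fair coin
flips standing in for molecular states (my analogy, not his): pockets of
apparent order turn up in pure randomness at just about the rate you'd
predict, so finding them is no surprise at all.

```python
import random, math

random.seed(7)

# 100,000 fair coin flips: a maximally "disordered" sequence.
flips = [random.random() < 0.5 for _ in range(100_000)]

# Find the longest run of identical outcomes -- a pocket of apparent order.
longest = run = 1
for prev, cur in zip(flips, flips[1:]):
    run = run + 1 if cur == prev else 1
    longest = max(longest, run)

# For n fair flips the longest run is typically close to log2(n),
# here about 17, so runs of that length are expected, not remarkable.
print(longest, round(math.log2(len(flips)), 1))
```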
Any causal chain we're able to trace out in an organism ultimately yields to
disorder. There are no long-distance causal chains connecting genes to
organ-level structures. You get into a cell, and you find a few rafts of
causal coherence on an ocean of randomness. The difference between a
machine and a living system is that a machine is founded on molecular
stability while a life-form maintains order at the highest levels despite
total disorder at the lowest levels. Of course, organisms contain all kinds
of machine-like elements. But these function within the context of holistic
organization. In machines, form follows matter. In organisms, matter
follows form.
> >What the matter of your body does is to maintain
> >its form.
>
> You need to expand on this.
The first structure to emerge in an embryo is a general body plan, followed
by a rough outline of large-scale structures like circulatory and organ
systems. Next the organs themselves begin to emerge. Only after this do we
see differentiation into discrete tissues. Lastly, we get fully specialized
cells. Order appears top-down, not bottom-up as with the assembly of a
machine. Rather than the overall form emerging from the activities of
specialized cells, the cells specialize to conform to the order imposed on
them from above.
> >Ultimately, evolution is a holistic concept.
>
> And this.
Just as it's the form that matters in ontogenesis, it's the form that
evolves over time, not the matter.
Ted
===============================================================
This was distributed via the memetics list associated with the
Journal of Memetics - Evolutionary Models of Information Transmission
For information about the journal and the list (e.g. unsubscribing)
see: http://www.cpm.mmu.ac.uk/jom-emit
This archive was generated by hypermail 2.1.5 : Fri 06 Feb 2004 - 19:33:09 GMT