From: "Chris Lofting" <ddiamond@ozemail.com.au>
To: <memetics@mmu.ac.uk>
Subject: Re: Memetic/Ontological correspondence?
Date: Sat, 3 Jul 1999 14:07:28 +1000
-----Original Message-----
From: Aaron Agassi <agassi@erols.com>
To: memetics@mmu.ac.uk <memetics@mmu.ac.uk>;
Critical-Cafe@mjmail.eeng.dcu.ie <Critical-Cafe@mjmail.eeng.dcu.ie>
Date: Saturday, 3 July 1999 3:14
Subject: RE: Memetic/Ontological correspondence?
<snip>
>>before on BOTH of these lists,
>> these patterns are part of the METHOD of analysis
>That is highly debatable! As we agree, pattern recognition begins with
>conjectural pattern generation. Then there must be a check for Ontological
>correspondence. The need for this test does not, in and of itself, rule out
>all correspondence with reality! Even created meaning can actually be
>literally true, though not always.
>
>>and do not necessarily
>> reflect 'out there';
>Indeed, only possibly. But doubt is nothing new, nor does it constitute
>refutation. You must still demonstrate that wave patterns are no more than
>an artifact of Methodology.
>
See my previous post. Note in that post that the implied wave interference
pattern gets stronger as you do more trials but never changes. If you look
at the graph there are 19 peaks out of a total of 27 states, where eight
states are seen as 'troughs' but in fact contain only one pattern.
When you analyse something like Aristotle's syllogisms, there are 256
possible forms, of which only 19 have some 'value'. This reductionism comes
about where something is either 'meaningless' or an equivalence to another.
Both of these states lead to a reduction, and this creates the implied wave
interference pattern.
The problem with statistics is that it imposes these patterns rather than
necessarily detecting them. If you exclude indeterminacy or equivalence
then you get a normal distribution curve.
Thus behind all of these apparently different concepts is ONE method --
dichotomous analysis.
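
As an aside, here is a minimal sketch in Python of what I mean by the
statistics 'summing' a dichotomy (my own illustration, not taken from the
article below; the run_trials helper and its parameters are just
assumptions for the example). If every left/right trial resolves cleanly,
with no indeterminacy or equivalence allowed, the summed counts settle into
the familiar binomial (approximately normal) bell curve:

import random
from collections import Counter

def run_trials(n_trials=50, n_runs=10000, seed=1):
    # Each run sums n_trials clean left(0)/right(1) decisions; no
    # indeterminate or equivalent outcomes are permitted.
    rng = random.Random(seed)
    return Counter(sum(rng.randint(0, 1) for _ in range(n_trials))
                   for _ in range(n_runs))

if __name__ == "__main__":
    counts = run_trials()
    # Crude text histogram: the bell-shaped profile that appears when the
    # dichotomy is summed over many runs.
    for total in sorted(counts):
        print(f"{total:3d} {'#' * (counts[total] // 50)}")

The smooth profile lives in the summing method, not in any individual
trial, which is the point being made here.
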
>>these experiments emulate dichotomous thinking (e.g. left/right
>> determinations taken over a number of trials; the statistics act to
>> 'sum' the dichotomies and as such apply the original dichotomy
>> recursively).
>Waves are revealed, plain as day, projected upon the laboratory wall, by
>visible interference patterns with light, much as with sound. Hello?
Any experiment that puts together two objects, and so makes a dichotomy,
will lead to the detection of wave interference patterns. The 'problem' is
that this is a property of the method and so need not necessarily be 'out
there'; it does not necessarily reflect characteristics of the objects so
much as of the relationship we establish between them.
In experiments like those of Newton or Young etc. the dichotomy is the
barrier that 'splits' the light. In this sense 'white' light is 'the one'
and the rainbow that can emerge is 'the many', i.e. harmonics of 'the one'
(Newton and prisms). In the slit experiments you impose the 'cut' and so
get a pattern dictated by the method (it could be seen as statistical but
very quick in operation, and so apparently instantaneous; QM just slows it
all down so that it builds up over time).
>That one, at least, is not a statistical exercise. And that already
>remains sufficient for the wave part of the wave/particle duality paradox,
>which still demands explanation.
>
There is no paradox; you do not 'see' waves and particles at the same time.
The manner of detection, YOUR INTENT, determines what you see: go
particular and you see particles, go general (stats) and you see waves. The
latter comes about since the fundamental unit of a statistical analysis is
a PAIR, and this builds in indeterminacy at the root: if you try to
distinguish which element of the PAIR comes 'first' (or goes 'left' or
'right') you get a 50/50 chance, and this introduces dichotomisations where
indeterminacy gives you wave patterns.
When you set up experiments to emulate this you get the patterns. Note that
on the photographic plates the wave patterns build up over time: you see
the dots from the electrons/photons, and then the wave pattern emerges.
This implies the particles are still there, but their distribution is
dictated by patterns in the method, and that method is dichotomy-based with
indeterminacy/equivalence 'built in'.
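
To make the 'build-up' point concrete, here is another minimal sketch in
Python (my own; it assumes an idealised two-slit fringe intensity
proportional to cos^2, ignores the single-slit envelope, and the
sample_dot/accumulate helpers are just illustrative). Each detection is a
single 'dot' sampled one at a time; the fringe pattern only appears in the
accumulated totals, just as it does on the photographic plate:

import math
import random

def sample_dot(rng, width=10.0):
    # Rejection-sample one detection position x in [-width/2, width/2]
    # from an intensity proportional to cos^2(pi * x).
    while True:
        x = rng.uniform(-width / 2, width / 2)
        if rng.random() < math.cos(math.pi * x) ** 2:
            return x

def accumulate(n_dots=20000, bins=60, width=10.0, seed=1):
    # Record dots one at a time; the fringes only show up in the totals.
    rng = random.Random(seed)
    hist = [0] * bins
    for _ in range(n_dots):
        x = sample_dot(rng, width)
        hist[min(bins - 1, int((x + width / 2) / width * bins))] += 1
    return hist

if __name__ == "__main__":
    for count in accumulate():
        print("#" * (count // 20))

With only a few hundred dots the histogram looks like scattered noise; by
20,000 dots the fringes are unmistakable, which is the build-up over time
referred to above.
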
>>
>> We see these 'waves' in ANY statistical analysis of dichotomies (see my
>> article on wave patterns in the stock markets
>>
>> http://www.ozemail.com.au/~ddiamond/patterns.html )
>>
>>
>> The use of probabilities in logic, i.e. in fuzzy logic, comes from the
>> realisation that single-context, rigid EITHER/OR processes are too
>> 'bulky', too inflexible when used to deal with reality; we need to use
>> multi-variable forms, and this is where the excluded middle is found to
>> be rich in that it is the source of all that could be. That said, we
>> also need to realise that there are patterns in this area that come
>> from the method of analysis, and so we need to be more discerning about
>> how we go about making our maps.
>>
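
To make the EITHER/OR versus BOTH/AND point concrete, here is one more
minimal sketch in Python (my own; the 'warm' membership function is an
assumed illustration, while the min/max operators are the standard Zadeh
ones). In rigid EITHER/OR logic a temperature is either warm or not; in the
fuzzy form both 'warm' and 'not warm' can hold to a degree, so the excluded
middle becomes a populated region rather than a forbidden one:

def warm(t_celsius):
    # Assumed membership function: 0 below 10 C, 1 above 25 C,
    # linear in between.
    if t_celsius <= 10:
        return 0.0
    if t_celsius >= 25:
        return 1.0
    return (t_celsius - 10) / 15.0

def fuzzy_not(a):
    return 1.0 - a

def fuzzy_and(a, b):
    return min(a, b)  # standard (Zadeh) t-norm

if __name__ == "__main__":
    for t in (5, 15, 20, 30):
        w = warm(t)
        # In rigid EITHER/OR logic 'warm AND not-warm' is always false;
        # here it is non-zero across the middle region.
        print(f"{t:2d}C  warm={w:.2f}  not-warm={fuzzy_not(w):.2f}  "
              f"both={fuzzy_and(w, fuzzy_not(w)):.2f}")
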
>> >The question, then:
>> >Might there be regions and pathways of the brain better suited to
>> >relate to such Chimera as Quantum Mechanics than left-brained either/or
>> >pathways? Specifically, can human ambivalence find correspondence to
>> >Quantum Mechanics? I doubt it, because Heisenberg Uncertainty is
>> >probabilistic, while human ambivalence is agenda-driven.
>> >
>> >
>>
>> The perception of the Heisenberg Uncertainty Principle (HUP) comes from
>> the method of analysis; it is part of dichotomous processing, where you
>> find it has a partner in what we could call the Equivalence Principle,
>> whereby our brain-minds collapse concepts like A + B = C and B + A = C
>> into the same 'space', using only one of the equations to represent both
>> modes of expression.
>>
>> In this sense, indeterminacy occupies the same 'space' as equivalence;
>> the difference is qualitative: with indeterminacy we cannot tell A + B
>> from B + A, while with equivalence we can but do not treat them as
>> different.
>>
>> This 'same space' occupation can be captured by using the concept of a
>> superposition, making this an example of BOTH/AND-ness.
>>
>> In dichotomous analysis over time (as we find in the double slit etc.)
>> the indeterminacy/equivalence patterns emerge as frequency distributions
>> suggesting wave interference, and if you create experiments that emulate
>> this dichotomy-biased process they will show 'waves'.
>>
>> HUP comes about when you zoom in to be so precise that you exclude other
>> elements in the context and at the same time try to exclude their
>> dependencies with what you are 'looking' at. This forms a 1:many
>> dichotomy which, when applied recursively with a continued emphasis on
>> 'the one', forces the absolute negation of 'the many'.
>>
>> In the 'original' HUP this is reduced to a seemingly 1:1 dichotomy of
>> position:momentum. Whichever you make 'the one' forces the other to be
>> negated and so indeterminate. In a simple sense, you cannot look at more
>> than one thing at a time when you demand 'precision': EITHER you look at
>> position OR you look at momentum, and the closer you get to one the
>> further you get from the other. The only way to look at both is to
>> change levels, entangle both into a superposition, and then view the
>> resulting virtual wave as if 'the one'.
>>
>> The whole concept of SpaceTime is a manifestation of this sort of
>> analysis, where the elements of a dichotomy (Space:Time) are seen as
>> inseparable and so combined at a 'new' level of analysis. Recursive
>> application of ANY dichotomy leads to an emerging continuum, and so you
>> get the PositionMomentum continuum etc. (in fact this is more an
>> interdigitation of the original dichotomy, a weaving of the two threads
>> into patterns of 'meaning').
>>
>> This continuum emphasises dependencies that emerge from attempts to
>> stress independence. This comes from the method of analysis, which our
>> instruments also emulate. With this in mind it follows that the repeated
>> application of the 'correct'/'incorrect' dichotomy will lead to this
>> interdigitation and an emerging continuum of possible states -- we enter
>> the world of probabilities, fuzzy logic and wave equations, and this is
>> all sourced in the method of analysis; it is all 'in here'.
>>
>> By understanding this, and by analysis of the dichotomy method, we map
>> out all possible 'states' that could be for ANY discipline based on
>> using dichotomies (and I stress that the fundamental brain-mind
>> dichotomy is 1:many, not 1:1; this can lead to logscale-type maps etc.).
>>
>> In Aaron's comments so far there is an emphasis on 'separation' of 'in
>> here' from 'out there': humans have an agenda and out there does not.
>> This is false in that 'out there' has an agenda, and it is called
>> evolution; it is the structuring of the fundamental particles into an
>> object(fermions)/relationships(bosons) dichotomy, applied recursively,
>> that makes us what we are. We are not independent of 'out there'; we are
>> part of it, and our success in survival comes from emulating it all 'in
>> here' by using dichotomies (although this way of thinking forces the
>> conclusion...!).
>>
>> This internalisation of the characteristics of 'out there' allows us to
>> create our own universes 'in here', and at times these can conflict with
>> 'reality'; adjustments are then required, no matter how painful.
>>
>> By understanding the structuring and resulting patterns of dichotomous
>> analysis we understand that all disciplines are metaphors for describing
>> object/relationships. This does NOT mean that we throw away all Inquiry
>> as Aaron has suggested:
>>
>> "We are left, then, with a choice: We can continue, conjecturally, in the
>> hopes of ever finding an explanation. Or we can follow Chris Lofting's
>> example and just give up, invalidating the
>> entire endeavor of Rational Inquiry and Reality Testing."
>>
>> What it does mean is a paradigm shift in how we view things, and that
>> can serve to enhance them. Furthermore, all of the metaphors we have
>> created do help to particularise the general, and the development path
>> along complexity principles does allow for emergences that are generally
>> predictable but whose particular expression would be 'novel' due to the
>> influence of context -- like phenotypes and genotypes... but this too is
>> a dichotomy...
>>
>>
>> best,
>>
>> Chris.
>> http://www.ozemail.com.au/~ddiamond
>>
>>
>>
===============================================================
This was distributed via the memetics list associated with the
Journal of Memetics - Evolutionary Models of Information Transmission
For information about the journal and the list (e.g. unsubscribing)
see: http://www.cpm.mmu.ac.uk/jom-emit