Complexity Reference: compref120 (BibTeX)

Authors: Fraser, AM

Year: 1989

Title: Measuring Complexity in Terms of Mutual Information

Book Title: Measures of Complexity and Chaos

Editors: Abraham, NB

Publisher: Plenum Press, New York, pages: 117-119

Comments: A Shannon-like entropy, defined in terms of mutual information, is introduced and shown to be useful in noisy situations. Complexity is defined as the rate at which predictability disappears.
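The comment only sketches the idea. As an illustrative sketch (not Fraser's actual estimator), the following assumes a simple histogram-based estimate of mutual information and uses the logistic map as a stand-in chaotic signal: the rate at which I(x_t; x_{t+tau}) falls off with the lag tau plays the role of the "rate at which predictability disappears".

    import numpy as np

    def mutual_information(x, y, bins=16):
        """Histogram-based estimate of the mutual information I(X;Y) in bits."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

    # Illustrative chaotic signal: the fully chaotic logistic map.
    rng = np.random.default_rng(0)
    x = np.empty(20000)
    x[0] = rng.uniform(0.1, 0.9)
    for t in range(1, len(x)):
        x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

    # Mutual information between the signal and its lagged copy decays as
    # the lag grows; the decay rate is the complexity in the sense above.
    for tau in (1, 2, 4, 8):
        print(tau, round(mutual_information(x[:-tau], x[tau:]), 3))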

Keywords: SCIENCE, COMPLEXITY, ENTROPY, PHYSICS, CHAOS, INFORMATION



Bruce Edmonds, Centre for Policy Modelling, Manchester Metropolitan University