8 Some simple consequences of the framework
8.2 Response to Noise
By noise we mean those aspects of the past or observable data which are not currently capturable or describable by any of the agent's internal models. This may arise for many reasons, including (but not only) inherent or effective randomness in the data.
Suppose the distance of the internal model from the past data is large. Given the framework above, what options are conceivably open to such an agent? There are several, some of which I list below.
- Widen its search to a space of models with greater complexity, i.e. choose a greater value for n. Given the typically exponential increase in the size of such spaces with n, this is only plausible if the agent has resources to spare. It is equivalent to the agent attributing the error not to noise but to a bridgeable explanatory gap. Of course, the agent will frequently not know beforehand whether this strategy has any chance of success.
- Increase the volume of its models by making their predictions less precise. This is equivalent to using a coarser graining in its modelling.
- Increase the volume of its models by restricting their conditions of application. This amounts to accepting less generally applicable models and hence attributing the error to special factors (e.g. excluding "outliers").
- Most radically, change the language of internal representation, L. The language could be restricted so as to avoid overfitting, if the agent thought the extra detail superfluous - a trade-off of the expressiveness of L against a smaller volume. Alternatively, the language could be made more expressive, which is roughly equivalent to the first option. This is perhaps the least understood and least studied option.
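The four options above can be caricatured as a decision procedure. The sketch below is purely illustrative: the function name, the thresholds, and the use of a spare-resources flag are my assumptions, not part of the framework, which does not prescribe how an agent orders these responses.

```python
def respond_to_noise(error, tolerance, spare_resources):
    """Caricature of an agent choosing a response when its best model's
    distance (error) from the past data exceeds its tolerance.

    The ordering and the 2x/4x thresholds are arbitrary illustrative
    choices; a real agent's trade-off would depend on its situation.
    """
    if error <= tolerance:
        return "keep_model"           # model is adequate; no response needed
    if spare_resources:
        return "widen_search"         # option 1: greater n, a larger model space
    if error <= 2 * tolerance:
        return "coarsen_predictions"  # option 2: less precise predictions
    if error <= 4 * tolerance:
        return "restrict_domain"      # option 3: narrower conditions of application
    return "change_language"          # option 4: alter the representation language L


# For example, a resource-poor agent facing a moderate error might
# coarsen its graining rather than search a larger space:
print(respond_to_noise(0.9, 0.5, spare_resources=False))
```

Note that the same error can elicit different responses depending on the agent's resources, which is the point of the trade-off discussed next.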
The actual course taken will depend on the trade-off relevant to the agent. It is interesting to note that these different courses of action correspond closely to different conceptions of noise: noise as the unexplained, noise as randomness, noise as excess variation, noise as irrelevance, and noise as the unrepresentable.
Modelling Learning as Modelling - 23 FEB 98