The Noo Noo

This is my fully autonomous robot vacuum cleaner. It is based on The Noo Noo from the BBC's Teletubbies. Although it does actually have the workings of a dust-buster inside, it is more affective than effective. If you didn't see the movie, here it is. Watching the movie, you will know what it is thinking (see below).

This mobile robot also has no sensors. It has a representation of things in its world, but the symbols used in that representation are grounded as described here:

@article{IwR,
  author = "Peter Wallis",
  title = "Intention without Representation",
  journal = "Philosophical Psychology",
  publisher = "Cambridge University Press",
  year = 2004,
  volume = 2,
  month = "June"
}

The input to decision making is energy usage. The Noo Noo's snout swings with a natural rhythm, and if that rhythm is interrupted, the servos need to work harder. The snout then "searches" and, as a reactive layer, changes the direction of travel. The map of our kitchen it contains is effectively a map of places that are bad for its energy usage. One day I hope to show this to Professor Boden and see if she thinks it is an implementation of symbol grounding in terms of metabolism...
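The mechanism above might be sketched roughly as follows. This is a minimal illustration, not the robot's actual code: the effort threshold, the grid map, and all class and variable names are my own assumptions.

```python
import random

# Ratio of servo load to the free-swinging baseline above which the
# snout's natural rhythm counts as "interrupted" (illustrative value).
EFFORT_THRESHOLD = 1.5

class NooNoo:
    """Sketch of a reactive layer driven by energy usage, plus a map
    of places that are bad for energy usage."""

    def __init__(self, width=10, height=10):
        # Each cell counts how often that place cost extra energy.
        self.energy_map = [[0] * width for _ in range(height)]
        self.x, self.y = 0, 0
        self.heading = (1, 0)

    def snout_effort(self, servo_load, baseline):
        """How hard the servos are working relative to the natural rhythm."""
        return servo_load / baseline

    def step(self, servo_load, baseline=1.0):
        if self.snout_effort(servo_load, baseline) > EFFORT_THRESHOLD:
            # The swing was interrupted: remember this place as bad for
            # energy usage, and "search" by picking a new direction.
            self.energy_map[self.y][self.x] += 1
            self.heading = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        # Move one cell in the current heading, staying on the map.
        self.x = min(max(self.x + self.heading[0], 0), len(self.energy_map[0]) - 1)
        self.y = min(max(self.y + self.heading[1], 0), len(self.energy_map) - 1)
```

The point of the sketch is that nothing here senses the world directly: the only input is how hard the servos are working, and the "map" is just an accumulation of that signal.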

Affective Robots

Why The Noo Noo? Well, The Noo Noo is engaging (not just for two year olds) but does very little. It also expresses almost no emotion. I propose that engagement comes from intentional behaviour. We expect people to do what they believe is in their interests. That is, our folk psychological understanding of other people is the basic BDI architecture: other people have goals and do things that they think will achieve them. This is too obvious to state really, but computers don't normally do this. The Noo Noo likes tidying up after the Teletubbies when they make a mess (goal). The Noo Noo looks about (updates beliefs) and, if it sees a mess, forms an intention to go and clean up. The Noo Noo does express one emotion (agitation) by wobbling its eyes, and this "explains why" The Noo Noo runs away when the Teletubbies want to give it a Big Hug. This intentional behaviour is, I believe, a core component of systems that we want humans to perceive as intelligent. Two papers on The Noo Noo appeared at i-CaP'06:
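The goal/belief/intention cycle described above can be written down as a minimal BDI loop. The mess-tidying goal and the run-away-from-hugs rule come from the description; everything else (class names, the string labels, the deliberation order) is an illustrative assumption.

```python
# Minimal sketch of the folk-psychological BDI cycle described above.
# Names and labels are illustrative, not from any real agent framework.

class NooNooAgent:
    def __init__(self):
        self.goals = {"tidy up mess"}   # desires: what it wants
        self.beliefs = set()            # what it currently believes
        self.intention = None

    def perceive(self, observations):
        """Look about: update beliefs from what is seen."""
        self.beliefs.update(observations)

    def deliberate(self):
        """Form an intention that serves a goal, given current beliefs."""
        if "teletubbies want a hug" in self.beliefs:
            self.intention = "run away"      # agitation: wobble eyes, flee
        elif "mess on floor" in self.beliefs and "tidy up mess" in self.goals:
            self.intention = "go and clean up"
        else:
            self.intention = "wander"
        return self.intention
```

The loop is deliberately trivial; the claim is not that this is sophisticated AI, but that behaviour generated this way fits the intentional stance an observer naturally takes.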

@conference{sg2yo,
  author = "Peter Wallis",
  title = "The Symbol Grounding Problem for Two Year Olds",
  booktitle = "Computers \& Philosophy, an International Conference (i-CaP)",
  address = "Laval, France",
  year = 2006,
  note = "Gives an explicit description of grounding symbolic reasoning in
          terms of previously learnt behaviours"
}

@conference{attIntBel,
  author = "Peter Wallis",
  title = "Attention, Intention, and the Nature of Believability",
  booktitle = "Computers \& Philosophy, an International Conference (i-CaP)",
  address = "Laval, France",
  year = 2006,
  note = "Argues that believable action requires an agent fit with
          Dennett's intentional stance. (Dennett was in the audience!)"
}

Back to home page