AI is now remarkably capable across a wide range of verbal and numerical tasks, yet we have not managed to make robots effective in the real, physical world. To close this gap, computational neuroscientist Gaurav Suri argues that AI models need motivation of their own. Against the intuition that conscious experience is necessary for motivation, Suri contends that machines can, and must, become motivated if they are to tackle complex and evolving problems, and that AI researchers must therefore learn how motivation works in living minds.
A mild hunger may leave us unmoved. We notice it, perhaps, but do nothing. A stronger hunger can change everything. We pull on a coat, step out into the cold, and walk several blocks in search of food. Something similar happens with anger: a mild offense may be brushed aside, while a stronger one may push us toward confrontation. Asked to explain these differences, we might say that the stronger state leaves us more motivated. And if pressed to say what we mean by that motivation, we would probably point to a familiar conscious experience: a feeling of inner pressure or readiness, focused and taut, like a coiled spring. It is hard not to conclude that hunger or anger first creates this feeling, and that the feeling, in turn, drives the right kind of action at the right intensity.
This, in essence, is the intuitive theory of motivation: a bodily signal, such as a need, an emotion, or a goal arising from them, gives rise to a conscious feeling, and that feeling then produces actions suited in both type and intensity to the demands of the moment.
___
Today’s systems are not yet much like organisms. They can execute, but they do not truly regulate their own action in a context-sensitive way.
___
But there is good reason to think this intuitive theory is wrong. Motivated feelings may accompany motivated actions without actually causing them. Even very simple creatures exhibit behavior much like motivated action. The tiny worm C. elegans, with only 302 neurons, becomes more willing to cross an aversive barrier to reach food when hungry, much as a hungry person may become more willing to step out into the cold in search of food. It may be that such a creature has some faint form of experience. But it is hard to imagine that it possesses anything like the rich, differentiated conscious feeling humans mean when they say they feel motivated. Yet its behavior still changes systematically with need. That suggests that the machinery organizing action does not depend on an elaborated conscious feeling. In humans, the feeling may accompany that machinery closely enough to seem like the cause, without actually being it.
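The C. elegans example can be put in computational terms. A minimal sketch, purely illustrative and not a model of the worm's actual circuitry: an internal need state scales the value of food, so the same barrier that is worth crossing when hunger is high is not worth crossing when hunger is low. Every name and number below is an assumption chosen for illustration.

```python
# Toy sketch (not real C. elegans circuitry): a need state scales the
# value of food, shifting whether an aversive barrier is worth crossing.
# All values are illustrative assumptions.

def will_cross(hunger: float, food_value: float, barrier_cost: float) -> bool:
    """Cross the barrier only if the need-weighted payoff exceeds its cost."""
    return hunger * food_value > barrier_cost

# Same food, same barrier; only the internal state differs.
sated = will_cross(hunger=0.2, food_value=1.0, barrier_cost=0.5)     # False
starving = will_cross(hunger=0.9, food_value=1.0, barrier_cost=0.5)  # True
```

Note that nothing in this computation represents a feeling: behavior changes systematically with need, yet no conscious state appears anywhere in the loop.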
If that is right, then the most important work of motivation may be happening beneath awareness. Perhaps the system that changes action is also, in creatures like us, the system that gives rise to the conscious feeling of motivation. On this view, the feeling does not cause the behavior. Both feeling and behavior are outputs of a deeper neural process unfolding in the brain. The feeling accompanies motivated action, often closely enough to fool us into treating it as the engine.
Towards Motivated Machines
Once motivated feeling can be separated from motivated action, a new possibility comes into view: machines may be able to become motivated too. That is, they may be able to select the kind of action a situation calls for and to pursue it with the appropriate intensity, without needing anything like the conscious feeling humans report.
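What "selecting the kind of action and pursuing it with the appropriate intensity" might mean for a machine can be sketched in a few lines. This is a hypothetical toy, not an implementation of any existing system; the need names and the action mapping are invented for illustration. The point is that both the choice of action and its vigor fall out of internal state variables, with no feeling variable anywhere.

```python
# Illustrative sketch of motivated action selection without any feeling
# variable: hypothetical internal need levels determine both which
# action is taken and how vigorously it is pursued.

def select_action(needs: dict) -> tuple:
    """Address the most urgent need; intensity scales with its urgency."""
    need, urgency = max(needs.items(), key=lambda kv: kv[1])
    action = {"energy": "seek_food", "safety": "retreat"}[need]
    return action, urgency

action, intensity = select_action({"energy": 0.8, "safety": 0.3})
# → ("seek_food", 0.8)
```

As the needs shift, so do the action and its intensity, which is all this minimal notion of machine motivation requires.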