Did Maslow get it wrong? (and why this matters for games)
You may be familiar with Maslow’s hierarchy of needs (more on this below the cut). Maslow’s theory has heavily influenced the architecture of our AI technology, which is why I’m attuned to discussions of it, or instances that support or undercut it. Recently I ran across a theory in education known as “CBUPO,” an ungainly acronym for “Competence, Belonging, Usefulness, Potency, Optimism” developed by Richard Sagor at Washington State University (an accessible introduction can be found here (pdf)). Sagor’s theory suggests some interesting modifications to Maslow that have consequences for how we understand ourselves — as well as the motivations of gamers and AIs.
(Warning: psychological theory leading to AI and game-relevant thoughts below.)
Maslow’s theory says that we each have motivations that change as we develop: first we’re concerned with physiological needs, then safety and security, then social acceptance, then skill-esteem, and finally our ability to contribute, what Maslow called self-actualization. This has been an important theory in much of psychology, though it’s sort of fallen by the wayside during the cognitivist revolution.
Sagor references Maslow (and others such as Glasser, who has a similar Choice Theory of motivation) in the background to his “CBUPO” idea. In doing so, however, he introduces a fundamental difference: Sagor essentially says that functional belonging requires basic competency — mixing together Maslow’s “belonging” and “skill” layers.
So whether you’re a kid in school or an adult in the office, you don’t just accept someone “for who they are,” you accept them for what they bring to the group. You may accept them provisionally (“hey new kid, come sit with us”) but their full acceptance will depend on whether they show basic social competencies (how they dress, what they eat, their familiarity with cultural references, etc.). People who don’t appear competent to the local standards are rejected from the group.
Which makes me wonder about Maslow’s clean division between the desire for social acceptance and the need for skill-esteem. When I first started thinking about this, these two levels seemed conflated, possibly confused or even inverted. The more I’ve thought about it, though, the more I’ve begun to think this is actually a new illumination of an important part of Maslow’s hierarchy (one that, at least in my reading, he doesn’t address clearly).
(Yes, this gets back to games and AI, hang on.)
The lower needs in Maslow’s hierarchy are what he calls “deficit” needs – the less you have, the more you want to fill it. This is true of hunger, thirst, basic human contact, feelings of physical security, and to some degree, socialization. The “higher” motivations are termed “growth” needs – these don’t have an urgency when not filled, but represent human growth. We’ve modeled both deficit and growth needs in our AI, though we’ve taken a slightly different approach when modeling the latter. We’ve said, in effect, that the higher motivations, skill esteem and contribution, are quiescent until they’re used, at which point they kind of “perk up” and reward the individual (an AI in our case) with feelings of satisfaction.
So as a young kid you may not be particularly motivated by enhancing your skills, but the first time you tie your shoes, say, you want to do it again — you have a skill-based motivation that is suddenly more part of your overall motivational set. If you don’t have many positive skill-related experiences you won’t develop this motivation much; on the other hand if you have many positive experiences of this type it can become the overriding motivation in your life (see: professional work-aholics).
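The deficit/growth distinction described above can be sketched in code. This is a minimal, hypothetical illustration — the class names, decay rates, and satisfaction increments are mine, not taken from any actual implementation:

```python
class DeficitNeed:
    """Urgency grows as the need goes unfilled (hunger, safety, belonging)."""
    def __init__(self, name, decay=0.1):
        self.name = name
        self.level = 1.0    # 1.0 = fully satisfied
        self.decay = decay

    def tick(self):
        # The need empties a little with each passing time step.
        self.level = max(0.0, self.level - self.decay)

    def urgency(self):
        return 1.0 - self.level  # the emptier, the more urgent

class GrowthNeed:
    """Quiescent until exercised; each positive experience strengthens the drive."""
    def __init__(self, name):
        self.name = name
        self.weight = 0.0   # starts dormant

    def exercise(self, satisfaction=0.1):
        # A positive use of the skill "perks up" the motivation.
        self.weight = min(1.0, self.weight + satisfaction)

    def urgency(self):
        return self.weight  # only pulls on behavior once activated

hunger = DeficitNeed("hunger", decay=0.3)
skill_esteem = GrowthNeed("skill-esteem")

hunger.tick()  # time passes; hunger becomes urgent
assert hunger.urgency() > skill_esteem.urgency()  # growth need still dormant

skill_esteem.exercise(0.5)  # e.g. the first time you tie your shoes
```

The key asymmetry is that a `DeficitNeed` generates urgency passively as it empties, while a `GrowthNeed` generates no pull at all until it has been positively exercised — after which it competes with the deficit needs for the agent’s attention.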
One hazy part of Maslow’s theory, in terms of trying to build it in as part of an operational AI, is the transition from deficit needs to growth needs. How and when does this occur? Sagor’s ideas may shed some light on this. If we all have a desire to belong to a group (once our basic needs for today and security for tomorrow are more or less taken care of), then we have to confront what such acceptance requires of us. We all want to believe in “inherent acceptance” — a mother’s love, for example — but even when we have this in our lives (and far too many do not), it does not replace the need for more general acceptance and belonging. As children, as we begin to branch out from our families into society, we want to be accepted there. And belonging with others socially, as Sagor points out, requires various forms of competency: in effect, “okay kid, what can you do for us?” This is as true in church groups and science clubs as in violent gangs.
And not incidentally, this leads to the activation of the growth-based need for skill-esteem. In effect, our lower-level desire for belonging can be satisfied more completely as we show that we have some locally required level of ability. It’s noteworthy that people will do whatever they need to do to find this sense of belonging: they will excel academically if they can; extend themselves socially (wearing new clothes, using new slang) if they must; learn obscure literary, movie, or musical references; or, if nothing else works, show their ability to dominate in terms of more basic needs via violence and aggression. Ultimately it doesn’t matter if your group accepts you for your knowledge of medieval literature, your ability to quote Star Trek, your skill in hunting deer, or your willingness to jump in on a fight — it’s your basic competency, evaluated in socially local terms, by which you gain acceptance and keep satisfied your need to belong.
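This gating of belonging on locally defined competence — and the way demonstrating competence activates skill-esteem — could be sketched like this. Again, this is a hypothetical illustration; the names, thresholds, and numbers are mine:

```python
from dataclasses import dataclass, field

@dataclass
class Group:
    valued_skill: str       # the competency this group locally values
    standard: float = 0.5   # the local bar for "competent enough"

@dataclass
class Agent:
    skills: dict = field(default_factory=dict)
    belonging: float = 0.0            # deficit need, satisfied 0..1
    skill_esteem_weight: float = 0.0  # growth need, dormant until exercised

def seek_belonging(agent, group):
    """Full belonging requires meeting the group's local competence bar;
    otherwise acceptance stays provisional ("hey new kid, come sit with us").
    Demonstrated competence also activates the skill-esteem growth need."""
    competent = agent.skills.get(group.valued_skill, 0.0) >= group.standard
    agent.belonging = 1.0 if competent else 0.4  # provisional acceptance only
    if competent:
        # Showing ability "perks up" the growth-based motivation.
        agent.skill_esteem_weight = min(1.0, agent.skill_esteem_weight + 0.1)
    return agent.belonging

gang = Group(valued_skill="aggression")
club = Group(valued_skill="medieval literature")
kid = Agent(skills={"medieval literature": 0.8})

seek_belonging(kid, gang)  # provisional at best: no locally valued skill
seek_belonging(kid, club)  # full belonging, and skill-esteem activates
```

Note that the model is indifferent to *which* skill the group values — the gang and the literature club run the same gating logic, which is exactly the point of the paragraph above.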
This may all seem pretty theoretical, but it’s interesting to me in terms of designing effective motivational models for AI, and in terms of designing effective social experiences for people playing games. It’s also interesting in terms of what it might mean for using games in learning situations.
In social gaming, we have thus far taken the shallower, weaker track of equating activities like “chatting” with socialization. This is less the case in MMOGs, where someone’s value to a group or guild is often more clearly tied to their in-game abilities (they’re an effective tank or healer — or they can lead the party itself well). But in many “social games” today we have a pretty shallow view of what belonging and acceptance mean (consider how we’ve turned “friend” into a verb, and the concept of a friend into a neutered version of its former meaning).
I suspect that there’s an opportunity here for creating increased engagement and satisfaction in social gaming if we provide mechanics and dynamics that support both provisional acceptance and skill-based belonging: enabling people to accept (“friend”) others for something meaningful both to each other and to the group they evolve into. What counts as ‘meaningful’ depends entirely on the context of the game: as in life, it could be your ability to take down targets with a head-shot from far away, to rally troops, grow crops, teach others new knowledge and skills, or lead the group overall.
In games even more than in life, we have the ability to determine the kinds of skill and expertise that will gain people acceptance and belonging. If we construct social games where knowledge of ancient Egypt or the German imperfect tense is valuable to the group, this will work just as well as a motivation for play (and not incidentally, skill increase — a.k.a. learning) as if the task is destroying buildings or head-shotting enemy soldiers.