This Isn’t the Social AI You’re Looking For

I’m a huge proponent of what I call “social AI”; I’ve written and spoken about this before.  Social AI is in some ways a subset of “Artificial General Intelligence,” in that it implies AI that acts in socially plausible ways (a phrase we use to avoid problematic terms like “realistic”) without having to include the complete range of human knowledge and nuance.

My vision for social AI is that it enables computer-driven agents (aka NPCs) to interact with each other and with human participants in socially plausible and satisfying ways.  This, I believe, is necessary for the “very long form story” and non-static worlds that I wrote about earlier, among other uses.

But there are also disturbing examples of what social AI isn’t, at least to my way of thinking.  I’m going to look at a few of these, and then come back to talk more about what social AI can do for us in more positive ways.

First, consider this story: a guy in Japan known only as SAL9000 has a virtual girlfriend, held lovingly inside his Nintendo DS, and he’s entirely happy with that.  Why wouldn’t he be?  As CNN reports, she “looks perfectly perky in sexy skirts, doesn’t pick fights and is always at [his] beck and call.”  Nene Anegasaki, the virtual girlfriend in question, is contained in a game called “Love Plus.”  Catchy name, but it’s a bit like calling a gut-busting milkshake a “Body Builder Plus.”  Changing the name doesn’t change the ultimately empty and life-damaging experience.  Is Sal more likely to be social as a result of his attachment to this perfect, subservient, unchanging mirage of a girl?  I doubt it… but I suppose it’s possible someone might be made healthier by a steady diet of those 1100-calorie milkshakes too.

Then there’s this recent story, also from Japan, about a new quasi-robotic system, the Geminoid-F.  This is actually tele-operated, so it has no AI in it.  It does have the ability to smile and frown (minimally), but this is still deep in the uncanny valley.  Once again, it is — at least for now — faux-social.  It sort of looks right, but there’s no heart, no life.  What we don’t need is a lot of zombie robots pretending they’re social.

Combining these two we get “Roxxxy” the robotic girlfriend.  Really.  She’s labeled, without apparent irony, as a “true companion” (that’s even the domain name).  The robot is clearly intended primarily as a sexbot (her “personalities” vary from ‘Frigid Farrah’ to ‘S&M Susan’), and beyond that its companionship seems… limited.  Yes, she knows your likes and dislikes, because you can check them off online and they’re downloaded to her.  But her AI does not extend to getting to know you, growing with you, or providing actual companionship outside of the narrow parameters of her pre-programmed sexuality.

So what’s the big deal?  Sure these are limited in their current implementation, but is there a real problem here?

The thing is that these seem social, but it’s the seeming that’s the issue: it’s faux-sociality.  These reward repeated experience — with themselves, not with other humans.  If the game or the ’bots actually helped shy guys learn how to talk to a real girl, or if they acted as social glue between people (bringing people together), that would be a huge step toward something that actually deserved a name like “Love (or just Friendship) Plus.”  Instead what we have is essentially — and in the latter case above very thinly veiled, literally and figuratively — social porn: it feels social in the moment, but doesn’t lead you any closer to actual social fulfillment.  Like any mirage, these look good and seem able to satisfy the deep social need in all of us… but ultimately they don’t deliver, and pretty soon the oasis turns out to provide nothing more than a mouthful of sand.
