The soul of a new machine

Jerome Groopman writes in The New Yorker about Maja Matarić and the quest to create robots that can help people through therapy. Assistive robots are nothing new; Japan’s aging population has been using this technology for years. But Groopman focuses on the social and emotional components of these advancements, and on efforts to enable robots to understand people:

Glancing at the robot, Mary lifted a magazine from the top of the pile and guided it into a rack on top of the shelf. As soon as the magazine was in place, the robot emitted a beep. During the next few minutes, Mary moved each magazine, one by one, to the rack. Gradually, she increased her pace, and the beeps from the robot came faster. Mary began to laugh.

She turned and looked squarely at the robot. With a sly smile, she moved her weak arm toward the remaining magazines on the desk and mimed putting one into the rack. She then stuck her tongue out at the machine.

Matarić said, “She is cheating. She is totally thrilled, because she thinks she cheated the robot.” The robot, though, was on to the game. A reflective white band that Mary wore on her leg allowed the robot to follow her movements. A thin motion sensor attached to her sleeve transmitted Mary’s gestures to the robot, so that it knew almost instantly whether she was raising her arm and in what motion. A sensor in the rack signalled the robot when a magazine was properly placed, and the robot communicated with Mary only when she performed the task correctly.

Although the task lasted about an hour, the novelty of the interaction did not seem to wane. In a debriefing after the study, Mary said, “When I’m at home, my husband is useless. He just says, ‘Do it.’ I much prefer the robot to my husband.”

The article takes an interesting turn when Groopman considers the ethics of this technology:

Thirty years ago, [MIT professor Sherry] Turkle began studying the impact of sophisticated technologies, including virtual-reality computer games and robots, on emotional development in children and social relationships among adults. “I am not a Luddite,” Turkle said. “But there is no upside to being socialized by a robot.” Based on her observation of groups of different ages, Turkle has found that “children and the elderly start to relate to the object as a person. They begin to love it, and nurture it, and feel they have to attend to the robot’s inner state.” With this attachment and projection of their emotions, Turkle says, people begin to seek reciprocity, wanting the robot to care for them. “We were wired through evolution to feel that when something looks us in the eye, then someone is at home in it.”

Robots, Turkle argues, risk distorting the meaning of relationships, the bonds of love, and the types of emotional accommodation required to form authentic human attachments.

In a chat about the article, Groopman ties the matter to the expanding use of remote-controlled drones in warfare.

Apocalyptic visions involving robots tend to focus on what they’ll do to us. It’s interesting that the first real anxieties about this relationship concern what we do with them. Will we become too emotionally invested? Will we become too distant from the ethical reality of taking human lives?

And I wonder, do Sherry Turkle’s concerns extend to things like pets and the Sims?

5 comments

Tim Carmody says…

You know, when someone uses the phrase “wired through evolution,” the chances are pretty good that they’re about to claim something that is at least 150% beyond what you can really justify.

Baby birds are likewise hardwired by evolution to only accept food from their mothers. But these birds can be fooled by puppets, which allows zookeepers to keep them alive. We might have a thing for eye contact from real live human beings, but we can be fooled by movies, photographs, cartoons, pets, and yes, robots.

There’s almost an alternate, weaker version of the Turing Test at work here: if a person interacts with something they KNOW to be a robot, and nevertheless talks about “fooling” it, sticks their tongue out at it, and treats it as an adequate social companion, then it IS an adequate social companion. Grandiloquent dismissals based on a narrow notion of what evolution “ought” to permit aren’t just a waste of time; they’re an intellectual disease.


Heh heh, I agree. Here’s a fun game:

** ** **

“I am not a Luddite,” Turkle said. “But there is no upside to being socialized by a cat.” Based on her observation of groups of different ages, Turkle has found that “children and the elderly start to relate to the animal as a person. They begin to love it, and nurture it, and feel they have to attend to the cat’s inner state.” With this attachment and projection of their emotions, Turkle says, people begin to seek reciprocity, wanting the cat to care for them. “We were wired through evolution to feel that when something looks us in the eye, then someone is at home in it.”

Cats, Turkle argues, risk distorting the meaning of relationships, the bonds of love, and the types of emotional accommodation required to form authentic human attachments.

** ** **

We have been ascribing nonexistent emotional richness to non-human entities for a looong time.

Matt Penniman says…

I agree — and yet…

There is something qualitatively different about a relationship with a robot compared to a relationship with a cat — chiefly, the fact that you can turn a robot off. Robots will never make demands on you when you don’t want them to. They will never require responsibility from you in the way that a cat or (especially) a human would. In that sense, I think Turkle is on to something — a person whose major emotional relationships are with entities that they can utterly control is not a person likely to experience much emotional growth.

Maybe that’s too narrow a reading of the social possibilities of the robot; but it seems like something worth being aware of.

Even while I’m tempted to dismiss Turkle’s concerns, she seems to be troubled by this after actually exploring it; she’s not just a random pundit Groopman nabbed to comment on robots. I give her concerns at least as much deference as I’d give to, say, Nicholas Carr. I’m actually interested in reading her book Simulation and Its Discontents.

I think part of her concern comes from the fact that these rehabilitative robots are engineered to emulate distinctly, recognizably human characteristics, and the article gets into this. (Groopman spends a lot of time on efforts to avoid the uncanny valley.) One of the people interviewed for the article, for example, is David Hanson, whose contributions to the art of robot-making include the invention of a material called Frubber, intended to function like human skin. I suspect that verisimilitude is part of what makes Sherry Turkle uncomfortable, as well as what Matt P mentions above.

Tim Carmody says…

Maybe I misread her! But then again, I have always been the snarkiest snarker on the market.