The delights and dangers of artificial affection

12:05 pm on 20 August 2018

Social robots are charming humans, providing company to elderly people and giving joy to children. But is there a danger in letting data-collecting robots into our lives? Thomas Phillips reports.

Joy For All Companion Pets are social robots intended for eldercare. Photo: Tom Phillips

Helena, an 88-year-old resident of Sevenoaks Retirement Village on the Kāpiti Coast, strokes the robotic cat splayed across her lap. It blinks, twitches its ears, and opens its mouth to let out a protracted meow, its mechanical innards whirring with each movement. “Helena, he’s talking to you,” staffer Dene Hopkins says. “Dobrze” – Polish for ‘good’ – she responds in her native tongue.

Helena was one of the hundreds of Polish children who arrived in New Zealand as World War II refugees. Today, she lives in Sevenoaks’ Kauri House, a unit for people with dementia who can no longer live independently. She sits with her fellow residents in a glass conservatory overlooking a manicured garden and an aviary filled with quails, canaries, and lovebirds. The residents take turns petting three Joy For All Companion Pets – social robots intended for eldercare.

Unlike industrial robots hidden away on factory assembly lines, social robots – which often take the form of small animals or childlike humanoids – are designed to fit into domestic settings and offer comfort by simulating real interactions with pets and people. The best-known model is the Furby, a robotic children’s toy released in 1998, but many of today’s social robots are marketed towards adults for purposes like companionship and healthcare.

Tactile interactions with pets, such as stroking dogs, are proven to ease common symptoms of dementia like anxiety and irritability by releasing oxytocin, a hormone associated with social bonding. Studies using Paro, a Japanese-designed robotic harp seal pup, show that similar interactions with robotic animals can also significantly reduce stress.

Sevenoaks has three Joy For All pets – a golden retriever puppy they call Scamp and two tabbies called Smokey and Ceefah (a phonetic rendering of ‘C for cat’). The pets have sensors throughout their bodies that allow for a range of movements. The cats nuzzle into your hand during a cheek scratch and eventually roll over for a belly rub after enough engagement. Scamp will yap in response to speech, and petting its back triggers a heartbeat sensation.

When still, their vacant eyes and unrealistic proportions give off an air of amateur taxidermy, but there’s something about their responsiveness to touch that persuades you to treat them as if they’re alive.

Edie, a headstrong woman who seems less captivated by the pets than some of the other residents, says that Scamp might get fleas and suggests the puppy should be let outside to urinate. “This one’s asleep!” says Kate, a laid-back 84-year-old who sits with Ceefah on her lap.

***

Dr Craig Sutherland with an iRobi, a Paro and a Nao. Photo: Tom Phillips

Social robots fill the Centre for Automation and Robotic Engineering Science (CARES) at the University of Auckland. Some are switched on and swivel their heads intermittently. Dr Craig Sutherland, a roboticist, has brought along his three young sons, who are taking a sick day from school; they sit in the corner playing with a Paro.

Sutherland presses a button on a Nao – a small, humanoid robot with a barrel-shaped head and perpetually bewildered eyes that flash with purple light. Twice a week, he takes it into Starship Children’s Hospital as part of a study into whether robots can augment the work of overstretched play specialists – healthcare professionals who entertain anxious children as a form of therapy.

“You want me to talk now?” asks the Nao, who Sutherland calls Puma. “Yes Puma,” he responds. “Okay, let me stand up.” Its movements are slow and awkward and Sutherland hovers his hand behind it in case it falls over.

“Shall I dance?” asks Puma. “Yes,” I say, enunciating clearly, a habit from my patchy track record with voice recognition technology. A long silence ensues. “Oops. Sorry, I was daydreaming,” says Puma. “Right, let’s start.” Soothing music plays from Puma’s speakers as it goes through a series of movements resembling tai chi before sitting back down. “Okay Craig, that’s enough from me,” it says.

Afterwards, Sutherland reveals that Puma was reciting a pre-set script, ending the illusion that different responses could have led the conversation down different paths. “Getting people to react positively to robots is about learning how people would expect a robot to act in the situation,” he says. “Even if we get halfway there, people are going to go the other half all by themselves.”

Sutherland is referring to research from Sherry Turkle’s 2011 book Alone Together: Why We Expect More from Technology and Less from Each Other, in which she observes that children aged eight and under will genuinely believe they are teaching their Furbies to speak English, even though the toys are programmed to gradually abandon their native ‘Furbish’ in favour of predetermined English phrases with or without language lessons.

Adults also project human qualities onto robots. A 2015 study found that when people listened to a Nao robot recite a script expressing fear about losing its memories, and then watched as the robot’s memories were erased by an indifferent technician, they felt empathy. Similar research using neuroimaging found that photos of robotic hands being cut with scissors triggered the same upsetting emotions as equivalent images with human hands, albeit to a lesser degree.

The phenomenon predates social robots, Sutherland says. He tells me about a study in which participants were asked to evaluate a computer. When they gave answers using pen and paper they responded honestly, but when answering directly to the computer their responses became sugar-coated, as if tactfully trying to avoid hurting the machine’s feelings. “If it’s something that acts even remotely human, we start to associate human abilities with it,” says Sutherland. “We’ve got no choice … we do it subconsciously.”

***

Many social robots are designed to be perceived not only as sentient but also as cute and vulnerable. “Hello world!” says one of the CARES lab’s robots, with a smiley expression and a touchscreen on its belly. It’s an iRobi – a Korean-designed robot used in health care and early childhood education. “Its design elicits similar emotions that children would [have],” says Sutherland. He picks the iRobi up to demonstrate. “I’m scared, put me down!” it says.

A recent study, led by CARES researcher Dr Elizabeth Broadbent, gave iRobis to a group of New Zealanders living with chronic lung disease to test whether the robots would help with home rehabilitation and limit future hospital visits. The robots measured patients’ breathing patterns and heart rates and reminded them when to take their medication and do exercises. The majority not only found their iRobis useful for maintaining their health, but also enjoyed their companionship.

EveR uses a camera to tilt its gaze towards passersby like a possessed bust sculpture. Photo: Tom Phillips

“I named the robot after my great-grandson because I miss him now that he is overseas,” said one participant during a research interview. “It made it like he is here with me.” Another said their robot “was like one of us. I would pat it on the head and he would respond. I often found myself having conversations with him.”

Broadbent believes that appearance plays a part in iRobi’s popularity. “A lot of people think the robot looks quite cute,” she says. “When robots have got big eyes people think that the robots have got a warmer personality and they’re more friendly, whereas if they look creepy and they look otherworldly then people are not very comfortable.”

One “creepy and otherworldly” robot at the CARES lab is the ‘EveR,’ which takes the form of a disembodied human head with silicone skin. It uses a camera to tilt its gaze towards passersby like a possessed bust sculpture. Removing its feathery brunette wig and detachable scalp reveals 20 motors buzzing behind its face to recreate a range of human expressions. “You do a double take whenever you see it,” says Sutherland, describing the feeling of discomfort triggered by something that looks almost human but behaves unexpectedly – a phenomenon known as the ‘uncanny valley.’

***

Encountering cuteness, whether it be a panda cub or child-like robot, “triggers an instant reward type mechanism,” says Dr Cherie Lacey, a media studies lecturer at Victoria University and expert on the emerging field of ‘cute studies.’ “We get this little hit. Neurological studies have shown that it’s similar to cocaine, to sex and to gambling.”

There are scientific criteria for cuteness; “relatively large head, predominance of the brain capsule, large and low-lying eyes, bulging cheek region, short and thick extremities, and clumsy movements,” says Lacey, who is currently co-authoring a book, My Robot, My Friend: the Promise of Artificial Companionship. “Social robots have taken a number of these formal attributes of cuteness and distilled them … They really push our Darwinian buttons, eliciting care and a nurturing response.”

Lacey is troubled by the new generation of cute domestic robots. Designed to assist with basic tasks like controlling your smart home, these robots collect an unprecedented amount of data about their owners’ day-to-day lives, a concern we’re less attuned to when our risk perception is bypassed by the allure of an adorable robotic companion. “It touches something quite instinctual about us, quite primal,” says Lacey. “And I say that as a mother who knows that the desire to nurture something is often in conflict with some other decision-making processes.”

Dr Cherie Lacey is a media studies lecturer at Victoria University. Photo: Tom Phillips

A recently released robot called Kuri, which moves around on wheels and communicates through a series of cute bleeps, functions primarily as a personal videographer, roaming around your home and recording the day’s most interesting moments. With big eyes, stubby arms, and a sleek, cylindrical body, Kuri looks like a minimalist, much cuter R2-D2. “It’s very much in the Hello Kitty tradition of cute borrowed from Japanese kawaii culture,” says Lacey. “She’s been described as like a roaming Pixar character.” Mayfield Robotics, the company that invented Kuri, actually recruited an animator from Pixar to work on the robot’s design to make it as endearing as possible.

For Lacey, the scariest part of domestic robot ownership is the risk of hackers. “Everything would be up for grabs,” she says. “Most of these social robots have cameras, so they can map space, they are always on, even when they’re sleeping they always are recording and taking images of you and your family … it’s kind of like having a home spy that would know everything about your life.”

Lacey also warns that several domestic robots, Kuri included, have invasive privacy policies, allowing them to sell personal data to third parties to create highly targeted advertising. “They’ll be able to capture really fine-grained, granular bits of information about our everyday lives,” says Lacey. “That kind of information about our dynamic, desiring lives is really, really valuable and potentially highly exploitable.”

Sutherland acknowledges the potential risks that come with the advancement of social artificial intelligence: privacy invasion, job automation, the distortion of human relationships. Regardless, he sees companion robots as essential for the future of eldercare, not only to bolster medical services in the face of an ageing population, but to stave off loneliness for some of the most isolated members of society. He remembers the iRobi study participants who didn’t want to part with their robots after the research concluded: “They were like, ‘don’t take my friend away!’”

***

Back at Sevenoaks Retirement Village, Dene Hopkins tells me about the facility’s history with real pets. One cat named Jontie, who is now buried on the village grounds, would attack Hopkins every time she sat at her desk. “It sort of dominated the hospital and it got very, very fat because it was getting fed by everyone,” she recalls. Another cat had to be rehoused because it “went a bit psycho and it wouldn’t stay in the unit.”

The temperament of Joy For All robots is much more consistent, irrespective of external stressors (Hopkins says one has had an entire cup of juice spilled all over it). While they may not truly reciprocate affection, they offer an alternative for those who can’t take on the responsibilities and hardships of a real pet, a compromise presented as a benefit by the Joy For All slogan: “no vet bills, just love.”