Kids Adore Their New Robotic Relatives
Millions of American families buy voice assistants to turn off the lights without getting up, order pizza and check movie showtimes. Children happily use the gadgets to settle dinner-table disputes, look up answers for their homework and entertain friends at sleepovers.
Many parents are surprised, and intrigued, by how these disembodied, all-knowing voices (Amazon's Alexa, Google Home, Microsoft's Cortana) affect their children's behavior, making them more curious but sometimes less polite.
In just two years, the technology has outgrown its marketing promises. People with disabilities use voice assistants to control their homes, order food and listen to books. Caregivers say the devices help people with dementia remember what day it is and when to take their medication.
For children, the potential to transform everyday interactions, at home and at school, is just as great. Psychologists, technologists and linguists are only beginning to assess the possible harm these AI-powered devices could do to the children growing up around them, especially during the crucial stages of social and linguistic development.
Thirteen-year-old Asher Labowicz and his 10-year-old sister, Emerson, play with the family's assistant, Alexa, while their mother watches.
"For me, the biggest question is how they respond to and deal with this nonhuman entity," says Sandra Calvert, a psychologist at Georgetown University and director of the Children's Digital Media Center. "And how does that affect family dynamics and social interactions with other people?"
An estimated 25 million voice assistants, priced from $40 to $180, will be sold this year, up from 1.7 million in 2015, and even very young children may feel the effects.
Toy giant Mattel recently announced that this summer it plans to release Aristotle, a child-monitoring device (a kind of video babysitter) that "soothes, teaches and entertains" using Microsoft's AI. As children grow, they can use it to ask and answer questions. The company says Aristotle was "specifically designed to grow up with the child."
Advocates of the technology say children tend to learn to find information through whatever technology dominates their era: first the library card catalog, then Google, now short conversations with friendly, omniscient voices. But what if these devices pull children, already glued to their screens, even further from the situations in which they learn valuable interpersonal skills?
It is not clear whether the companies behind these devices are even worried about that.
Amazon did not respond to a request for comment on this article. A representative of the Partnership on AI, a new organization that includes Google, Amazon, Microsoft and other companies working on voice assistants, said no one there could answer the question.
"These devices don't have emotional intelligence," says Allison Druin, a professor at the University of Maryland who studies children's use of technology. "Their intelligence is factual."
The Labowicz family sits at the kitchen table, Alexa in the foreground. "We like to ask her a lot of random questions," Emerson says of the device.
Children clearly enjoy the company of these devices and talk about them as members of the family.
"We like to ask her a lot of random questions," says Emerson Labowicz, a fifth-grader from Bethesda, Md., badgering the device alongside her older brother, Asher.
This winter, Emerson asked the machine almost daily to help her count down the days until her trip to the Wizarding World of Harry Potter theme park in Florida.
"She can also rap and rhyme," Emerson says.
Today's children will grow up with AI much the way their grandparents grew up with a new device called the television. But you couldn't talk back to the TV.
Ken Yarmosh, a 36-year-old app developer and the founder of Savvy Apps in Northern Virginia, has installed several voice assistants in his house, including models from Google and Amazon. (The Washington Post is owned by Amazon founder Jeffrey P. Bezos, whose middle name, according to Alexa, is Preston.)
Yarmosh's 2-year-old son was so taken with Alexa that he tried talking to coffee-cup holders and other cylindrical objects that resemble the Amazon device. His 5-year-old son, meanwhile, compared the two digital assistants and decided that Google understood him better.
"Alexa isn't smart enough for me," the boy says, after peppering the assistants with random questions his parents can't answer, such as how many miles it is to China. (Google's answer: China is 7,248 miles away, as the crow flies.)
By sizing up the devices that way, Yarmosh's son was anthropomorphizing them, that is, as Alexa will happily explain, "attributing human characteristics to something nonhuman." Calvert says people do this all the time. We do it with dogs, dressing them in Halloween costumes. We give boats names. And when we meet robots, we, children especially, treat them almost as equals.
In 2012, researchers at the University of Washington published a study of 90 children interacting with Robovie, a human-sized robot. Most of the children felt that Robovie had "moods" and was a "social being." When Robovie was shut in a closet, more than half the children said it wasn't fair. A similar emotional connection is forming with Alexa and other AI assistants, even among parents.
"It's certainly becoming a part of our lives," says Emerson's mother, Laura Labowicz, before correcting herself: "It's already a part of our lives."
The problem, Druin says, is that this emotional connection can create expectations the devices cannot meet, because they were never built for that purpose. The result is confusion and disappointment, and it can even change how children talk and interact with adults.
Yarmosh's son decided that Alexa didn't understand him, but in fact the device's algorithms could not recognize his voice or the way children phrase questions. Teachers who use these tools in classrooms and libraries have run into the same problem.
"If Alexa doesn't understand a question, is that her fault, or was the question asked the wrong way?" says Gwyneth Jones, a librarian who uses the Amazon device at Murray Hill Middle School in Laurel, Md. "People won't always be able to understand what children are saying, so it's important for kids to learn how to ask questions the right way."
Naomi S. Baron, a linguist at American University who studies digital communication, is among those who believe that the devices, even as they get smarter over time, will push children toward simple language and simple queries rather than nuance and complex questions.
Ask Alexa, "How do I ask a proper question?" and she replies, "I couldn't understand the question I heard." But she can answer a simpler version: "What is a question?"
"A linguistic expression used to request information," she says.
There is also the possibility that the devices will change how adults communicate with children.
Although Mattel's new assistant will include an option that requires children to say "please" when asking for information, the assistants from Google, Amazon and others are designed to let users ask questions quickly and directly. Parents are noticing visible changes in their children's behavior.
In a blog post last year, Hunter Walk, a California venture capitalist, wrote that his 4-year-old daughter considered Alexa the best speller in the house. "But I'm afraid she's also turning my little girl into an asshole," Walk wrote, "because Alexa tolerates her bad manners."
To ask her a question, you need only say her name, then the question. No "please." And no "thank you" before the next question.
"Cognitively, I'm not sure a child understands why Alexa can be treated without respect but a person can't," Walk wrote. "At the very least, it builds the expectation that as long as your diction is good, you can get whatever you want without being polite."
Jones, the librarian, has watched several kids fire questions at the device at once. "You're pushing her too hard," she tells them when Alexa keeps repeating that she doesn't understand. "You're overwhelming her. You have to talk to her one at a time, like with a person."
Like with a person, but a businesslike person, for children and adults alike. Parents, including the author of this article, have noticed that questions once directed at them now go to the assistants, especially questions about homework: how to spell a word, math problems, historical facts.
Or take the weather, especially in winter. Instead of asking their parents about the temperature outside, children go to the device and take its answer as perfectly correct.
The upside: no arguments about what the temperature will be and what to wear.
The downside: children talk to their parents less and lose some of that communication with them.
"Interacting with these devices that simulate conversation can have many unintended consequences," says Kate Darling, a researcher at MIT who studies human-robot interaction. "We don't yet know what those consequences will be."
But most developers, teachers and parents, and even some children, already agree that these devices need to be put in their place, just like a know-it-all relative.
Jones, the librarian, sometimes removes Alexa from the library for a couple of weeks so that her students don't come to depend on her. Yarmosh, who recently launched a project for monitoring children's online video viewing, does not put assistants in his children's rooms. Emerson and her brother take the schoolyard approach. "Alexa," they say, "you're an asshole."