INTERNATIONAL - In late May, the advocacy group Common Sense Media held a summit on “digital well-being.” Attendees gathered inside the Computer History Museum in Mountain View, California, to debate the long-term effects of apps, services and electronic devices once hailed as revolutionary. The final panel was devoted to content deemed “NSFK,” or “not safe for kids.”
What was supposed to be a roundtable discussion functioned more like a public drubbing of YouTube. The video site, owned by Alphabet Inc.’s Google, is in the news seemingly every week for inane, upsetting or harmful videos involving children.
A decade ago, fretful parents worried about video games and slasher films—but today, YouTube incites greater fear. “Now parents say, ‘Bring me the violent movie,’” Jill Murphy, editor-in-chief of Common Sense, said on stage. “It’s better than a Google search box.”
Alicia Blum-Ross, YouTube’s policy chief for kids and family, tried to convince the room that her company was getting quality content to kids. YouTube has spent the past year throwing resources at child safety, recruiting staff and setting up an outside advisory council. Last fall, the company hired Blum-Ross, an anthropologist, and on the panel she ticked off recent product upgrades – an option to show only screened kids’ videos, more parental controls, smarter software. “We’ve actually made a lot of strides,” she said. In the first quarter of 2019, the company removed more than 800,000 videos for violations of its child safety policy.
Blum-Ross then touted YouTube’s supposed panacea: YouTube Kids. The app, created four years ago, filters videos from the main site specifically for children under thirteen, who are protected by federal law from certain forms of digital data collection. The app has faced criticism – that it’s too addictive, lowbrow and unedited – but YouTube Kids is, relatively speaking, a haven from the dangers of the open web and YouTube.com. “We strongly encourage parents that the general site is not made for kids,” Blum-Ross said.
What Blum-Ross didn’t mention, however, is that not many kids use YouTube Kids, and those who do don’t stick around. Several of the most popular channels on the main site, which has more than 2 billion monthly users, specialize in programming designed for young kids, but that doesn’t mean their videos are free of advertising or screened for safety. One, Cocomelon, a channel of nursery rhymes, has more than 50 million subscribers. That’s more than double the weekly audience for all of YouTube Kids, which is used by more than 20 million people a week, according to a company spokesperson. (Much of the audience for a channel like Cocomelon could be parents trying to keep up with popular rhymes, the spokesperson said.)
Children who do watch YouTube Kids tend to shift over to YouTube’s main site before they hit thirteen, according to multiple people at YouTube familiar with the internal data. One person who works on the app said the departures typically happen around age seven. In India, YouTube’s biggest market by volume, usage of the Kids app is negligible, according to this employee. These people asked not to be identified discussing private information.
Once kids leave, they don’t come back. “Many parents have expressed that their child refuses to go back to YouTube Kids,” said Jenny Radesky, an assistant professor of pediatrics at the University of Michigan, an expert on childhood development and a fellow panelist at the Common Sense Media summit. “It’s too baby-ish, too restrictive. Now that they’ve let the genie out of the bottle with YouTube main, it’s hard to reverse course,” she said in an interview.
YouTube said it is working to bring its Kids app to as many families as possible, and has created restricted versions of the full video service so parents can screen clips for their kids when they watch together. At the Code Conference last week, Chief Executive Officer Susan Wojcicki addressed safety. “I’ve been really clear that that responsibility is my No. 1 priority,” she said. “There is a lot of work for us to do. I acknowledge that, but I also know that we have tremendous tools at our fingertips that we can continue to invest in to do a better job.”
In an interview with CNN, Google’s CEO Sundar Pichai acknowledged the tension between free speech and hateful content on YouTube. “It is definitely one of the hardest things. In some ways, companies alone aren’t fully equipped to handle problems like that, so I think there is a lot of work ahead,” he said.
Solving the kids problem is at the top of a growing list of headaches for the world’s largest video site. In just the past few weeks, the company has been accused of radicalizing young voters and ignoring harassment of gay people. YouTube has spent years chasing engagement on its service and ignored internal calls to address toxic videos, as Bloomberg previously reported, and it’s a habit that continues to irritate rank-and-file staff inside the tech giant. Four people at Google privately admitted that they don’t let their kids watch YouTube unsupervised and said the sentiment was widespread at the company. One of these people said frustration with YouTube has grown so much that some have suggested the division spin off altogether to preserve Google’s brand.
Yet YouTube is under limited pressure to change its ways. While YouTube is facing competition for younger viewers from Walt Disney Co. and Netflix Inc., it isn’t at risk of losing the audience. Some 97 percent of children have used YouTube, either the main site or the kids app, according to Insight Strategy Group, a market research firm that polled 1,200 American families about online behavior this year. (The poll did not distinguish between the kids app and the main site; Sarah Chumsky, a vice president at the firm, said researchers stopped asking parents about YouTube Kids because many didn’t understand the difference.) Children from five to twelve reported spending more time on YouTube than anywhere else, including Fortnite and Instagram. “Basically, every kid who doesn’t live in Amish country,” Chumsky said.
YouTube said it’s working on digital well-being: curbing kids’ screen time with features like a “take a break” icon that reminds viewers to stop watching. “We’re changing our metrics,” Blum-Ross told the crowd in Mountain View. But she admitted the company hasn’t figured out how to make such changes without sacrificing too much of its business.
“How do we measure success when success is actually using our products less?” she asked.
YouTube Kids went live in a moment of optimism for tech, and for Google. The search giant introduced the app in February 2015 as the first product of its “tech for tykes” initiative. At first, the plan was to hand-pick videos and charge a fee. Google, at the time, was plotting more subscription businesses, and it planned to bundle the Kids app with other services like music and gaming, according to former staffers involved.
But the deals needed for the bundles never materialized. And the plan to limit which videos appeared ran counter to YouTube’s ethos: users dictate all. A former executive, who asked not to be identified discussing private matters, recalled surveying a father whose son loved watching airplanes take off on YouTube. The clips were unusual for kids’ programming but didn’t seem harmful. If younger viewers found thousands of these niches, YouTube’s staff couldn’t keep up with manual curation. Software could. So the app launched with algorithmic sorting, like YouTube’s main site.
YouTube designed the app with a focus on preschoolers. Viewers who open the app are met with dancing cartoons, and most of the library is filled with millions of hours of nursery rhymes and popular toy “unboxing” clips. YouTube staff assumed older kids would keep using the main site, and that there was no reason to target them beyond managing legal liabilities, according to a media executive who consulted with YouTube on the formation of the Kids app.
Less than three months after launch, though, child and consumer advocacy groups found inappropriate content on the app, including explicit sexual language and jokes about pedophilia.
Over the years, YouTube has made several attempts to better handle its ocean of kids’ content. One was to turn to humans – not as curators or screeners, but as “genre-taggers.” Starting in 2017, staff were assigned to categorize clips into twelve fields, such as “music,” “play” and “toy unboxing,” according to someone who worked on the project. YouTube has squads of people who sift out videos that violate its policies, but this team focused on grouping videos by subject. Employees usually tagged a video within ninety seconds, although many took far longer; the former staffer recalled seeing a 10-hour upload of the viral Baby Shark video on loop. In late 2017, those jobs were moved to Hyderabad, India.
Despite the changes, YouTube has avoided mirroring Disney or Netflix, which rely on established production companies and review videos before making them available. YouTube recently ran a trial to see the impact of hand-picking every video that appears in the Kids app. It’s something critics have proposed; many of the app’s scandals involve kids stumbling onto user-generated videos that would not have slipped past careful human screeners. In the internal trials, however, kids between seven and twelve grew bored with the limited library and left to surf regular YouTube, according to people familiar with the test.
YouTube would like to put the onus on parents to manage their kids’ viewing, much as it expects copyright holders to flag pirated material and users to flag inappropriate content. Parents can choose to let their kids watch programming only from certain channels, like Sesame Street and PBS Kids, according to a spokesperson. But most parents feel powerless to monitor their kids’ use of YouTube, according to Radesky, the Michigan researcher, who studies how young kids use technology in search of ways to prevent mental illness and chronic pain and to improve child-parent relations.
The company also likes to point out how much educational content is consumed. But external research shows that kids, particularly tweens, gravitate to very different kinds of videos. Older kids watch music videos, stunts, reaction clips (“Funny Baby Try Lemon First Time”) and movie trailers, Insight Strategy Group found. They like “prank and humor” videos most of all.
Radesky has managed to limit her two kids’ use of YouTube thus far, directing them instead to PBS Kids and Netflix. “The algorithm is promoting whatever gets clicked on the most and that isn’t going to promote content that is best for kids at that stage of development,” she said. “It will promote the junk food kids love.”
Karen Green, a writer who lives southwest of Toronto, kept her two daughters away from YouTube until they turned 10. At that point, they got iPads and were allowed to use regular YouTube. Green didn’t see the point of YouTube Kids; she thought parents shouldn’t limit their kids’ interests to things made only for children. “YouTube Kids is an unfair thing to do unless kids are super little,” she said.
But it didn’t take long for Green to regret her decision. One day her daughter came to her horrified because following different videos on YouTube had led her to a website for Furries, a subculture of people interested in dressing up as animals (and having sex with people dressed up as animals). “We were angry it was so easy to get to a place where she was so uncomfortable,” Green said.
That experience was the impetus for Green to buy a device called Circle, a rectangular box that parents can attach to their home network to limit the amount of time their kids spend online. Green hooked Circle up to her modem and placed a 30-minute limit on her kids’ screen time. As soon as her kids have used YouTube for 30 minutes, their access is cut off. The device also has a content filter that blocks certain types of videos, though it is, in Green’s estimation, a loose one. Green receives updates on her kids’ activity and can control the device using an app on her phone.
Green would like to see YouTube do more to filter out inappropriate content, or deliver warnings when users are going beyond filters. “YouTube is the app parents hate the most and kids love the most,” she said.
Critics and business partners have also called for implementing ratings and review processes. But YouTube doesn’t want to change the way it’s wired, as an open platform largely edited by software. “That’s their operating system,” said Mat Baxter, global chief executive officer for Initiative, an advertising agency. “But when it comes to children, there is no margin of error.”