Christiane C. didn’t think anything of it when her 10-year-old daughter and a friend uploaded a video of themselves playing in a backyard pool.
“The video is innocent, it’s not a big deal,” said Christiane, who lives in a Rio de Janeiro suburb.
A few days later, her daughter shared exciting news: The video had thousands of views. Before long, it had ticked up to 400,000 — a staggering number for a video of a child in a two-piece bathing suit with her friend.
“I saw the video again and I got scared by the number of views,” Christiane said.
She had reason to be.
YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.
YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.
The result was a catalog of videos that experts say sexualizes children.
“It’s YouTube’s algorithm that connects these channels,” said Jonas Kaiser, one of three researchers at Harvard’s Berkman Klein Center for Internet and Society who stumbled onto the videos while looking into YouTube’s impact in Brazil. “That’s the scary thing.”
The video of Christiane’s daughter was promoted by YouTube’s systems months after the company was alerted that it had a pedophile problem. In February, Wired and other news outlets reported that predators were using the comment section of YouTube videos with children to guide other pedophiles.
That month, calling the problem “deeply concerning,” YouTube disabled comments on many videos with children in them.
But the recommendation system, which remains in place, has gathered dozens of such videos into a new and easily viewable repository, and pushed them out to a vast audience.
YouTube never set out to serve users with sexual interests in children — but in the end, Mr. Kaiser said, its automated system managed to keep them watching with recommendations that he called “disturbingly on point.”
Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations.
So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.
“I’m really scared of it,” said Christiane. “Scared of the fact that a video like this fell into such a category.” The New York Times is withholding the family’s surname to protect its privacy.
When The Times alerted YouTube that its system was circulating family videos to people seemingly motivated by sexual interest in children, the company removed several but left up many others, including some apparently uploaded by fake accounts.
The recommendation system itself also immediately changed, no longer linking some of the revealing videos together. YouTube said this was probably a result of routine tweaks to its algorithms, rather than a deliberate policy change.
Jennifer O’Connor, YouTube’s product director for trust and safety, said the company was committed to eradicating the exploitation of children on its platform and had worked nonstop since February on improving enforcement. “Protecting kids is at the top of our list,” she said.
But YouTube has not put in place the one change that researchers say would prevent this from happening again: turning off its recommendation system on videos of children, though the platform can identify such videos automatically. The company said that because recommendations are the biggest traffic driver, removing them would hurt "creators" who rely on those clicks. It did say it would limit recommendations on videos that it deems to put children at risk.
YouTube has described its recommendation system as artificial intelligence that is constantly learning which suggestions will keep users watching. These recommendations, it says, drive 70 percent of views, but the company does not reveal details of how the system makes its choices.
Some studies have found what researchers call a “rabbit hole effect”: The platform, they say, leads viewers to incrementally more extreme videos or topics, which are thought to hook them in.
Watch a few videos about makeup, for example, and you might be recommended a viral makeover video. Watch clips about bicycling and YouTube might suggest shocking bike race crashes.
Mr. Kaiser and his fellow researchers, Yasodara Córdova and Adrian Rauchfleisch, set out to test for the effect in Brazil. A server opened videos, then followed YouTube’s top recommendations for what to watch next. Running this experiment thousands of times allowed them to trace something like a subway map for how the platform directs its users.
They also followed YouTube’s recommendations on channels, the pages that host videomakers’ work. Though YouTube says these are rarely clicked, they offered a way to control for any statistical noise generated by how the platform suggests videos.
When they followed recommendations on sexually themed videos, they noticed something they say disturbed them: In many cases, the videos became more bizarre or extreme, and placed greater emphasis on youth. Videos of women discussing sex, for example, sometimes led to videos of women in underwear or breast-feeding, sometimes mentioning their age: 19, 18, even 16.
Some women solicited donations from “sugar daddies” or hinted at private videos where they posed nude. After a few clicks, some played more overtly at prepubescence, posing in children’s clothing.
From there, YouTube would suddenly begin recommending videos of young and partially clothed children, then a near-endless stream of them drawn primarily from Latin America and Eastern Europe.
Ms. Córdova, who has also studied the distribution of online pornography, says she recognized what was happening.
Any individual video might be intended as nonsexual, perhaps uploaded by parents who wanted to share home movies among family. But YouTube’s algorithm, in part by learning from users who sought out revealing or suggestive images of children, was treating the videos as a destination for people on a different sort of journey.
And the extraordinary view counts — sometimes in the millions — indicated that the system had found an audience for the videos and was keeping that audience engaged.
Some researchers believe that with material like this, engaging certain interests risks encouraging them as well.
“It’s incredibly powerful, and people get drawn into that,” said Stephen Blumenthal, a London-based psychologist who treats people for deviant sexual interests and behaviors.
And YouTube, by showing videos of children alongside more mainstream sexual content, as well as displaying the videos’ high view counts, risked eroding the taboo against pedophilia, psychologists said.
“You normalize it,” said Marcus Rogers, a psychologist at Purdue who has done research on child pornography.
YouTube says there is no rabbit hole effect.
“It’s not clear to us that necessarily our recommendation engine takes you in one direction or another,” said Ms. O’Connor, the product director. Still, she said, “when it comes to kids, we just want to take a much more conservative stance for what we recommend.”
Most people who view sexualized imagery leave it at that, researchers say. But some of the videos on YouTube include links to the youngsters’ social media accounts.
“A lot of people that are actively involved in chatting with kids are very, very adept at grooming these kids into posting more sexualized pictures or engaging in sexual activity and having it videotaped,” said Dr. Rogers.
YouTube does not allow children under 13 to have channels. The company says it enforces the policy aggressively.
For parents, there are no easy solutions, said Jenny Coleman, the director of Stop It Now, an organization that combats sexual exploitation of children.
“Even the most careful of families can get swept into something that is harmful or criminal,” she said.
In reporting this article, when The Times could find contact information for parents of children in the videos, it contacted local organizations that could help them.
After one such organization contacted Christiane, the mother from Brazil, she offered to discuss her experience.
Furious, she struggled to absorb what had happened. She fretted over what to tell her husband. She expressed confusion at YouTube's practices. And she worried over how to keep her daughter, now on display to a city-size audience, safe.
“The only thing I can do,” she said, “is forbid her to publish anything on YouTube.”