
When The Times alerted YouTube that its system was circulating family videos to people seemingly motivated by sexual interest in children, the company removed several but left up many others, including some apparently uploaded by fake accounts.

The recommendation system itself also immediately changed, no longer linking some of the revealing videos together. YouTube said this was probably a result of routine tweaks to its algorithms, rather than a deliberate policy change.

Jennifer O’Connor, YouTube’s product director for trust and safety, said the company was committed to eradicating the exploitation of children on its platform and had worked nonstop since February on improving enforcement. “Protecting kids is at the top of our list,” she said.

But YouTube has not put in place the one change that researchers say would prevent this from happening again: turning off its recommendation system on videos of children, even though the platform can identify such videos automatically. The company said that because recommendations are the biggest driver of traffic, removing them would hurt "creators" who rely on those clicks. It did say it would limit recommendations on videos that it deems to put children at risk.

Down the Rabbit Hole

YouTube has described its recommendation system as artificial intelligence that is constantly learning which suggestions will keep users watching. These recommendations, it says, drive 70 percent of views, but the company does not reveal details of how the system makes its choices.

Some studies have found what researchers call a “rabbit hole effect”: The platform, they say, leads viewers to incrementally more extreme videos or topics, which are thought to hook them in.

Watch a few videos about makeup, for example, and you might get a recommendation for a viral makeover video. Watch clips about bicycling and YouTube might suggest shocking bike race crashes.


Mr. Kaiser and his fellow researchers, Yasodara Córdova and Adrian Rauchfleisch, set out to test for the effect in Brazil. A server opened videos, then followed YouTube’s top recommendations for what to watch next. Running this experiment thousands of times allowed them to trace something like a subway map for how the platform directs its users.
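The crawl the researchers describe can be pictured as a repeated random walk over the recommendation graph: start from a seed video, follow a top suggestion a few hops, log each step, and aggregate thousands of such walks into edge counts. A minimal sketch of that idea, using a toy recommendation table (`FAKE_RECS` and `top_recommendation` are illustrative stand-ins, not YouTube's actual system or API):

```python
import random
from collections import Counter
from typing import Optional

# Toy recommendation table standing in for YouTube's opaque system;
# in the real experiment a server fetched live "Up next" suggestions.
FAKE_RECS = {
    "seed_a": ["mid_1", "mid_2"],
    "seed_b": ["mid_2"],
    "mid_1": ["extreme_1"],
    "mid_2": ["extreme_1", "extreme_2"],
    "extreme_1": ["extreme_2"],
    "extreme_2": ["extreme_1"],
}

def top_recommendation(video: str) -> Optional[str]:
    """Return one of the top recommendations for a video, if any."""
    recs = FAKE_RECS.get(video)
    return random.choice(recs) if recs else None

def trace_walk(seed: str, hops: int) -> list:
    """Follow top recommendations for up to `hops` steps, recording each edge."""
    edges, current = [], seed
    for _ in range(hops):
        nxt = top_recommendation(current)
        if nxt is None:
            break
        edges.append((current, nxt))
        current = nxt
    return edges

def build_map(seeds, hops=3, runs=1000) -> Counter:
    """Aggregate many walks into edge counts -- the 'subway map'."""
    counts = Counter()
    for _ in range(runs):
        for seed in seeds:
            counts.update(trace_walk(seed, hops))
    return counts

edge_counts = build_map(["seed_a", "seed_b"])
# Heavily traveled edges show where the platform funnels viewers.
busiest = edge_counts.most_common(3)
```

The aggregation step is what turns individual sessions into the map-like picture the article describes: an edge traversed in thousands of independent walks marks a route the system consistently steers viewers down.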

They also followed YouTube’s recommendations on channels, the pages that host videomakers’ work. Though YouTube says these are rarely clicked, they offered a way to control for any statistical noise generated by how the platform suggests videos.

When they followed recommendations on sexually themed videos, they noticed something they say disturbed them: In many cases, the videos became more bizarre or extreme, and placed greater emphasis on youth. Videos of women discussing sex, for example, sometimes led to videos of women in underwear or breast-feeding, sometimes mentioning their age: 19, 18, even 16.

Some women solicited donations from “sugar daddies” or hinted at private videos where they posed nude. After a few clicks, some played more overtly at prepubescence, posing in children’s clothing.

From there, YouTube would suddenly begin recommending videos of young and partially clothed children, then a near-endless stream of them drawn primarily from Latin America and Eastern Europe.

Ms. Córdova, who has also studied the distribution of online pornography, says she recognized what was happening.


Any individual video might be intended as nonsexual, perhaps uploaded by parents who wanted to share home movies among family. But YouTube’s algorithm, in part by learning from users who sought out revealing or suggestive images of children, was treating the videos as a destination for people on a different sort of journey.

And the extraordinary view counts — sometimes in the millions — indicated that the system had found an audience for the videos and was keeping that audience engaged.

Some researchers believe that, with material like this, engaging certain interests risks encouraging them as well.

“It’s incredibly powerful, and people get drawn into that,” said Stephen Blumenthal, a London-based psychologist who treats people for deviant sexual interests and behaviors.

And YouTube, by showing videos of children alongside more mainstream sexual content, as well as displaying the videos’ high view counts, risked eroding the taboo against pedophilia, psychologists said.

“You normalize it,” said Marcus Rogers, a psychologist at Purdue who has done research on child pornography.

YouTube says there is no rabbit hole effect.

“It’s not clear to us that necessarily our recommendation engine takes you in one direction or another,” said Ms. O’Connor, the product director. Still, she said, “when it comes to kids, we just want to take a much more conservative stance for what we recommend.”

Children at Risk

Most people who view sexualized imagery leave it at that, researchers say. But some of the videos on YouTube include links to the youngsters’ social media accounts.

“A lot of people that are actively involved in chatting with kids are very, very adept at grooming these kids into posting more sexualized pictures or engaging in sexual activity and having it videotaped,” said Dr. Rogers.

YouTube does not allow children under 13 to have channels. The company says it enforces the policy aggressively.

For parents, there are no easy solutions, said Jenny Coleman, the director of Stop It Now, an organization that combats sexual exploitation of children.

“Even the most careful of families can get swept into something that is harmful or criminal,” she said.

In reporting this article, when The Times could find contact information for parents of children in the videos, it contacted local organizations that could help them.

After one such organization contacted Christiane, the mother from Brazil, she offered to discuss her experience.


Furious, she struggled to absorb what had happened. She fretted over what to tell her husband. She expressed confusion at YouTube's practices. And she worried over how to keep her daughter, now on display to a city-size audience, safe.

“The only thing I can do,” she said, “is forbid her to publish anything on YouTube.”

The Interpreter is a column by Max Fisher and Amanda Taub exploring the ideas and context behind major world events. Follow them on Twitter @Max_Fisher and @amandataub.