10-year-old Chrissy and her BFF Ella (not their real names) loved splashing around in the backyard pool. They also loved having a YouTube channel. And they really loved it when they saw their latest video of themselves had racked up 400,000 views.
Chrissy’s mum Christiane was confused. How on earth ...? she wondered. And then she watched the video again - and the horrifying truth started to dawn.
As would be later verified by experts, YouTube’s automated recommendation system - the one that suggests what users should view next - was showing Chrissy’s backyard video to users who had a pattern of watching videos of partially clothed children.
According to a New York Times investigation, “YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families…. In many cases, its algorithm referred users to videos after they watched sexually themed content. The result was a catalog of videos that experts say sexualises children.”
In February, YouTube disabled comments on many videos featuring children - in response to reporting that showed the comments section was a magnet for predators.
But the recommendation system - the main engine driving the platform’s billions of views - has remained in place.
Times investigators alerted YouTube that its algorithm was circulating home movies to viewers with a track record of sexual interest in children - and in response the company removed several videos “but left up many others, including some apparently uploaded by fake accounts.”
YouTube’s product director for trust and safety told the Times, “Protecting kids is at the top of our list.”
But the company has refused to turn off the recommendation system on videos of kids - even though it can identify this content automatically. It maintained that changing the system would hurt “creators” who rely on clicks.