Social media companies are evading responsibility for kids' safety by pointing to official age restrictions - allowing abuse of children to go unaddressed.
True or false? No child under the age of 13 is on social media.
Officially, true - and just as obviously, completely false.
Yet the ways in which this paradox endangers underage children are a lot more complicated than they may first appear.
Sure, every platform under the sun - from Facebook to TikTok - has a minimum age requirement of 13. But verification procedures are universally either flimsy or downright non-existent.
Result? Underage users comprise a healthy proportion of the most popular social apps.
For obvious reasons, tracking their precise numbers is nearly impossible. Yet the most recent research, based on self-report data from 1,000 children, confirms what you no doubt already suspected: kids - maybe even your kids - are using adult social media in droves.
Kids' top social media platforms
The study, conducted by the US-based nonprofit Thorn, found that among kids aged 9-12:
Now, none of that may seem particularly surprising or even alarming. But wait, there’s more.
Platforms where the most harm occurs
Because the same study also found disturbing numbers of children reporting potential harm on these same platforms - while their attempts to deal with bullying, grooming and unwanted contact from strangers often ended in frustration.
The platforms with the highest number of minors reporting potential harm were Snapchat (26 percent), Instagram (26 percent), YouTube (19 percent), TikTok (18 percent), and Messenger (18 percent).
Those where the most minors said they had an online sexual interaction were Snapchat (16 percent), Instagram (16 percent), Messenger (11 percent), and Facebook (10 percent).
Reporting and blocking of abuse
The good news was that kids who experienced abuse were keen to take advantage of platform-based blocking and reporting tools. The bad news was how poorly these in-app tools addressed their needs. As one observer noted, such tools “can feel like fire alarms that have had their wires cut.”
One in three children who reported an issue said the platform took more than a week to respond. And nearly a quarter said the concern they raised was never resolved. Typically, kids were frustrated that none of the choices “fit the situation” - especially that of being pestered for nudes, whether by an adult or another child.
So kids are experiencing real distress and abuse on social platforms. Yet tech companies have managed to evade responsibility by pointing to their official age restrictions - which, as we have seen, bear little resemblance to real-world usage by kids.
These companies say, in effect: we don’t allow children to be users, so we’re under no obligation to protect them from harm, or to devise reporting tools that are appropriate and kid-friendly.
The time has come, argue the authors of the Thorn report, to expose that self-serving myth, and hold tech companies accountable.
How? For starters, by making age-verification procedures much more robust. (Among the report’s other findings: 27 percent of 9- to 12-year-old boys had used a dating app.)
By integrating crisis support numbers into messaging platforms.
By getting much more serious about ban evasion, to prevent blocked users from simply creating alternate accounts and carrying on the abuse.
“Let’s deal with the reality that kids are in these spaces, and re-create it as a safe space,” says Julie Cordua, Thorn’s CEO. “When you build for the weakest link, or you build for the most vulnerable, you improve what you’re building for every single person.”