We don't allow under-age users ... and other myths tech companies hide behind

Social media companies are evading responsibility for kids' safety by pointing to official age restrictions - allowing abuse of children to go unaddressed.

True or false? No child under the age of 13 is on social media.

Obviously true - and just as obviously completely false.

Yet the ways in which this paradox endangers underage children are a lot more complicated than they may first appear.

Sure, every platform under the sun - from Facebook to TikTok - has a minimum age requirement of 13. But verification procedures are universally either flimsy or downright non-existent.

Result? Underage users comprise a healthy proportion of the most popular social apps.

For obvious reasons, tracking their precise numbers is nearly impossible. Yet the most recent research, based on self-report data from 1,000 children, confirms what you no doubt already knew: kids - maybe even your kids - are using adult social media in droves.

Kids' top social media platforms

The study, conducted by the US-based nonprofit Thorn, found that among kids aged 9-12:

  • 45 percent say they use Facebook daily
  • 40 percent use Instagram 
  • 40 percent use Snapchat 
  • 41 percent use TikTok, and 
  • 78 percent use YouTube.

Now, none of that may seem particularly surprising or even alarming. But wait, there’s more.

Platforms where the most harm occurs

Because the same study also found disturbing numbers of children reporting potential harm on these same platforms - while their attempts to deal with bullying, grooming and unwanted contact from strangers often ended in frustration.

The platforms with the highest number of minors reporting potential harm were Snapchat (26 percent), Instagram (26 percent), YouTube (19 percent), TikTok (18 percent), and Messenger (18 percent).

Those where the most minors said they had an online sexual interaction were Snapchat (16 percent), Instagram (16 percent), Messenger (11 percent), and Facebook (10 percent).

Reporting and blocking of abuse

The good news was that kids who experienced abuse were keen to take advantage of platform-based blocking and reporting tools. The bad news was how inadequate those in-app tools proved to be. As one observer noted, such tools “can feel like fire alarms that have had their wires cut.”

One in three children who reported an issue said the platform took more than a week to respond. And nearly a quarter said the concern they raised was never resolved. Typically, kids were frustrated that none of the choices “fit the situation” - especially that of being pestered for nudes, whether by an adult or another child.

So kids are experiencing real distress and abuse on social platforms. Yet tech companies have managed to evade responsibility by pointing to their official age restrictions - which, as we have seen, bear little resemblance to real-world usage by kids.

These companies say, in effect, we don’t allow children to be users. Therefore, we’re under no obligation to protect them from harm, or devise reporting tools that are appropriate and kid-friendly.

The time has come, argue the authors of the Thorn report, to expose that self-serving myth, and hold tech companies accountable. 

How? For starters, by making age-verification procedures much more robust. (Among the report’s other findings: 27 percent of 9- to 12-year-old boys had used a dating app.)

By integrating crisis support numbers into messaging platforms.

By getting much more serious about ban evasion, to prevent blocked users from simply creating alternate accounts and carrying on the abuse. 

“Let’s deal with the reality that kids are in these spaces, and re-create it as a safe space,” says Julie Cordua, Thorn’s CEO. “When you build for the weakest link, or you build for the most vulnerable, you improve what you’re building for every single person.”



Stay on top of the latest trends in digital parenting, with Family Zone, Australia's leading parental control solution.

Why not start your free trial today?