Dark patterns are the techniques companies use online to hook customers - whether to get them to sign up, stop them cancelling subscriptions, or persuade them to fork over additional cash or personal info. And kids are being targeted directly.
Flashing sign-up buttons. Pre-checked boxes. Impossible-to-locate unsubscribe links. Dramatic countdown timers. And let’s not forget the “X” that never closes the page down but only opens up a new one.
Or “confirmshaming” - which tries to make you feel bad about declining an offer. (“No thanks. I’d rather continue to be miserable than sign up for your miraculous and life-changing offer.”)
Not to mention “grinding” - when the user experience on a free app is made so tedious it’s not enjoyable until new features are paid for and unlocked.
You’ve seen most of these gimmicks already - and have probably fallen for them.
Experts call them “dark patterns” - and they are a form of guerrilla marketing unique to the online world - and uniquely effective in ensnaring the unwary.
Part of their power lies in the fact that normal consumer protections don’t necessarily apply in the digital marketplace.
Dark patterns aimed at our kids
Dark patterns are hard enough for grown-ups to navigate. But more worrying by far are the mind games being played on, or through, our kids.
Example? The kids’ “educational program” ABCmouse - which was recently fined US$10 million for essentially tricking customers into auto-renewing, and making cancellation a tortuous journey through multiple screens.
Even more shocking, because it directly targeted children themselves, is a video game that blackmails kids into paying $10 - or be accused of “animal neglect” and lose their virtual pet to the SPCA.
Then there's Harry Potter: Hogwarts Mystery - a seemingly benign game with scenarios like potion-making and broom lessons. It takes the mind games one step further, forcing players to pay to save a child character from being strangled.
What's being done about it
The US Federal Trade Commission has recently held a workshop to investigate the impact of dark patterns on consumers, especially children, with a view to formulating policy.
Data protection authorities in Europe are also examining the impact of dark patterns on the validity of consent.
Here in Australia, where existing privacy and consumer protection laws are relatively strong, regulators have not yet focused on dark patterns specifically.