Things come and go so quickly in the online world that you may have missed the recent news about a controversial app called DeepNude - the latest in a new generation of tech trickery that experts are calling deepfakes.
What are they, and why should you care?
DeepNude is - or more accurately was - an app that used AI (artificial intelligence) to create fake nude images of real women by appearing to strip the clothing from their photos.
Almost as soon as DeepNude came to public attention via the media, the app’s creator pulled it, conceding amid a global outcry that the likelihood of it being used to harass women was “too high” - not to mention the legal implications of the app being used to create revenge-porn images.
But because the internet is forever - and nothing ever truly goes away - DeepNude is still available in a variety of dodgy online spaces.
Targeting girls and women
The images created by DeepNude and other apps like it are easy to identify as fake, for now. But the damage they can inflict is very real. The technology is advancing all the time - and it’s using girls and women as target practice.
“Since the term ‘deepfake’ was coined, the technology has consistently been used to target women,” noted a recent report in The Verge. “People can use deepfakes to create pornographic and nude images of co-workers, friends, classmates, even family members, and the realism of this content has only increased over time. The best images created by DeepNude look real at a glance, and that’s all that might be needed to cause terrible damage to someone’s life.”
It concluded, “DeepNude is just the tip of the iceberg.”
As for the rest of the iceberg, think Titanic, warns technology correspondent John Davidson of the Australian Financial Review.
Why? Because deepfake technology has the capacity to create video and audio content that is “genuinely indiscernible from videos of events that actually took place, of words that actually were spoken.”
The term deepfake - coined in 2017 as a mash-up of “deep learning” and “fake” - has been defined as a technique for “human image synthesis” using artificial intelligence.
To date, deepfake technology has been used extensively to create phony celebrity pornographic videos or revenge porn, à la DeepNude. But deepfakes are also being used to create fake news and malicious hoaxes for dissemination on social media.
Davidson cites the recent example of a fake video of Facebook founder Mark Zuckerberg bragging that “Whoever controls the data, controls the future.”
Interestingly, Facebook refused to remove the deepfake video of its chief, which appeared on Instagram and swiftly went viral.
Faking the future of democracy?
“Artificial intelligence algorithms that can make any outlandish claim look and feel like the truth do not bode well for the future of … well, for the future of just about anything you may happen to care about. Democracy. Commerce. Investment. Sport. Race relations. Gender relations. The planet,” Davidson maintains.
The only conceivable weapon against the AI that creates deepfakes, he believes, is yet more AI - in the form of screening algorithms that detect the fakes.
“And so begins the AI arms race,” he warns, “in which Big Tech has no option but to fund research into creating an artificial intelligence capable of policing the entire population of the planet in real time. It’s difficult to see that ending well.”
"Before it was a case for youngsters of whether they should really risk it, going on to the street to buy drugs. But now because it's on ...
Technology is opening up the real world for our kids - but are they ready to handle it? And what can parents do to protect them?
Should teaching our kids the ABCs of human emotion join maths, English, science and history as core subjects?