Real Talk on Fake People
Check Twitter these days and you’ll see, in the form of real-seeming faces or funny texts of fake speeches, the potency of powerful computing tools and the ease with which they allow the generation of complete fabrications. Though the use of these media manipulation tools – whose quirky outputs have ascended (descended?) to meme-hood – has so far been largely gimmicky, that novelty has papered over a dark reality, of which the 2018 epidemic of deepfakes was a stark reminder. The complicated future of media in the age of neural networks and machine learning has arrived, bringing with it manipulation tools of ever-growing sophistication and ever-wider availability. All this is happening at a time when the public’s faith in gatekeepers is low and keeps falling lower – so what could possibly go wrong?
“A lie can travel halfway around the world before the truth can get its boots on” is a fairly common phrase that, if you’re like me, you’ve spent most of your life believing originated with Mark Twain. Fittingly, even that origin story isn’t accurate. Benign inaccuracies like this, and more sinister mistruths besides, have always had an easy time getting around, while the truth and those after it must labor for hours, days, and weeks getting their proverbial boots on. Thanks to the aforementioned advances in technology and its ease of use, however, the job of a lie is only getting easier. For this, too, we can thank technology companies.
Take NVIDIA, for example. The giant known largely for the graphics cards it puts in your laptops has displayed astonishing capabilities with its image processing tools. Inside papers that look unremarkably academic, it has demonstrated the ability to simulate different weather conditions in the same picture, and to automatically repair damaged digital images. Audio hasn’t been spared either, thanks to tools such as Adobe’s Voco, which elicited concern from experts well before its commercial release, prompting Adobe to consider a form of “watermarking detection” in order to prevent its abuse. Though neither Voco nor its competitor WaveNet (a Google product) is yet commercially available, Canadian company Lyrebird has already made its product accessible to users in an online demo, paying little heed to how many bad actors it may be handing its weapons-grade technology to.
There are also academics pushing the bounds of what can be accomplished, like the researchers at the University of Washington who outlined in a paper how they were able to synthesize realistic video of former US President Barack Obama lip-syncing to audio of his older speeches and public appearances. The researchers made it a point to only synthesize video with words that the former President had already said, with researcher Steven Seitz stating, “We very consciously decided against going down the path of putting other people’s words into someone’s mouth.” If these assurances come off as comforting, prepare to be disappointed. Seitz had said in July of 2017, “You can’t just take anyone’s voice and turn it into an Obama video.” Less than a year later, BuzzFeed did just that.
In one of the most chilling cases of the last year, there was the deepfake problem. Out of nowhere, it seemed, hordes of online users, through apps such as FakeApp, were creating disturbingly realistic, sexually explicit videos of everyone from popular celebrities to unsuspecting former romantic partners. Recognizing the massive violations of privacy involved, and startled at the pace at which deepfakes were being generated, prominent websites rushed to ban such content. Over the ensuing days, as the issue spread among the broader public, the practice was met with widespread rebuke and revulsion. Unfortunately, the Internet is forever, and so both the technology and its products remain far too accessible.
What does this mean for the media we consume? When anything can be warped and reshaped, is there any truth that can be trusted? Were even one of the main contributing factors – sophistication, availability, public trust – to be meaningfully dealt with, the situation would not seem so dire. Unfortunately, it does not seem like we can close Pandora’s Box on the technology, and our failing trust in each other is a problem no one has yet been able to solve. When these issues bear down on us all at once, we must ask as Aviv Ovadya did: “What happens when anyone can make it appear as if anything has happened, regardless of whether it did or not?” If that sounds a little too Bradbury for you, he’s not the only one concerned.
Alex Champandard, AI expert and co-founder of CreativeAI, says there is an urgent need for “identifying and building trust in relationships — including with journalists and press outlets, publishers, or any content creator for that matter.” Among experts, the consensus is quickly becoming clear: there is a significant threat posed to the public’s ability to trust its eyes and ears, and the society-wide danger of a growing inability to know what’s real only deepens as these media manipulation tools become more sophisticated and widely available. As the integrity of information – its inherent value in describing the world as it is – fades, the breakdown of this integral piece of the social fabric will have wide-ranging repercussions.
The rule of law could stall as it becomes increasingly difficult for juries and judges to discern the real from the manufactured. Law enforcement and national defense may spend countless resources chasing falsified but convincing information. Societal rituals, from the most granular level of individual interaction to the macro level of international diplomacy, will be compromised by the threat of malicious actors. The public at large will be unable to decide what has meaning and what doesn’t – conditions that could prove at best detrimental, and at worst fatal, to the democratic system. The likeliest outcome in the years ahead is not one where fact becomes indistinguishable from fiction and the ordinary person cannot tell real from not, but one where the idea of what is true and what isn’t is made irrelevant, and the ordinary person simply ceases to care.
Feature Image by Zebestrian