Editor's Note: Fakery
Humans are predisposed to believe fake news. In August 2017, researchers at Yale University released a study finding that even one exposure to a false news story, such as on a Facebook feed, predisposed the reader to believe that the story was true. The more times the reader was exposed, the more certain he or she became that the information was accurate. This phenomenon held even if the story was clearly labeled as questionable. The study illustrates the concept of illusory truth. Well known to researchers since the 1970s, this tendency to believe clearly untrue information is at play even if the reader previously knew that the information was false.
Advancements in computer software might soon erase the line between truth and lies. Current controversy swirls around written material online, but audio and video editing technology could make it impossible to believe anything you read, hear, or see.
On a recent episode of the Radiolab podcast, producer Simon Adler discussed VoCo, a new audio manipulation software developed by Adobe. Dubbed "Photoshop for audio," the new technology needs only 20 minutes of a person's speech to recreate the person's voice. The software generates a transcript of the recorded audio; by editing that text, the user changes the speaker's words. The words can be rearranged or replaced entirely. These new words, which were never actually uttered by the speaker, are rendered in the speaker's voice.
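The core idea, edit the transcript and the audio follows, can be sketched as a toy in a few lines. This is not Adobe's method; real systems synthesize new speech from a voice model. In this hypothetical illustration, each transcript word simply maps to a stand-in audio clip, so rearranging the text rearranges the "audio."

```python
# Toy sketch of transcript-driven audio editing (the concept behind
# tools like VoCo). The clip names are hypothetical stand-ins; a real
# system would synthesize waveforms, not look up prerecorded segments.

def splice(transcript, clips):
    """Assemble an audio sequence by looking up each transcript word."""
    return [clips[word] for word in transcript.split()]

# Hypothetical recording: each word paired with its audio segment.
clips = {
    "I": "clip_I.wav",
    "never": "clip_never.wav",
    "said": "clip_said.wav",
    "that": "clip_that.wav",
}

original = splice("I never said that", clips)
# Deleting one word from the text deletes it from the "audio":
edited = splice("I said that", clips)
```

Editing the string is all it takes; the speaker's denial becomes an admission, which is precisely what makes the real technology unsettling.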
Another new software product, Face2Face, does similar things to video. The software tracks facial expressions on an existing video and can then transpose the expressions of the software operator onto the person in the video. With this software, anyone can alter the video of a real event. For example, a user can take a video of Barack Obama giving a speech to Congress and alter it to contain nothing but golf references. The viewer sees Obama speaking and hears Obama's voice, but it's all fake.
"If you join the video manipulation with VoCo voice manipulation," Adler emphasizes, "you are the ultimate puppeteer. You can create anyone talking about anything you want in their own voice."
This technology is especially dangerous in a time when the news is already manipulated by state actors. In his article in Vanity Fair, "Fake News Is About to Get Even Scarier than You Ever Dreamed," Nick Bilton writes that this new technology means "governments can weaponize fake news as an act of digital terror."
An optimist might argue that watermarking and other detection techniques will be employed to tag altered audio and video. The public, meanwhile, will be armed with the knowledge that manipulation exists, making everyone more skeptical than before.
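To make the optimist's watermarking idea concrete, here is a minimal, fragile sketch, assuming audio is a list of 16-bit integer samples. Production watermarking schemes are far more robust; this toy simply hides bits in the lowest bit of each sample, enough to show how a tag could be embedded and later detected.

```python
# Minimal least-significant-bit watermark sketch (an assumption for
# illustration, not a real detection scheme). Samples are plain ints.

def embed(samples, bits):
    """Overwrite the low bit of the first len(bits) samples."""
    tagged = [(s & ~1) | b for s, b in zip(samples, bits)]
    return tagged + samples[len(bits):]

def extract(samples, n):
    """Read back the low bit of the first n samples."""
    return [s & 1 for s in samples[:n]]

mark = [1, 0, 1, 1]                      # the hidden tag
audio = [1000, 1001, 1002, 1003, 1004]   # toy "recording"
tagged = embed(audio, mark)
recovered = extract(tagged, len(mark))
```

The fragility is the point of the hedge: any re-encoding of the audio can scrub such a tag, which is why the realist's skepticism in the next paragraph still has bite.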
A realist might argue that the future of news is its past. Soon, the only reliable information will be written and produced by trusted sources and delivered to your doorstep. In print.