DeepFakes and the Future of Verified Content

Source: Axios

More people own mobile phones than toothbrushes, which exemplifies how technology has become an indisputable necessity in people’s daily lives. With mobile technology comes normalized access to computing power that was once inconceivable to the average person. In just a few swipes, we can manage our finances, manipulate photos and merge phone calls as if digital convergence were simply part of human nature.

This digital convergence has led not only to a chaotic news cycle, but a degeneration of news literacy that has been exacerbated by unverifiable, user-generated content that gets published more quickly than it can be consumed.

One outcome, though perhaps a foreseeable one, has been the rise of deepfakes: video footage that has been altered to portray an event differently from how it happened in real life.

As Nina Brown explained at the Communications@Syracuse immersion in March 2019, Hollywood producers have been creating fictional videos for decades. We watch films based on true stories all the time. We see fake news every day. So what’s the big deal?

Brown emphasized the difference between deepfakes and ordinary fake videos: deepfakes use artificial intelligence to replace the visual data in an unaltered video with synthesized imagery that can completely distort a sequence of events.

For example, a computer can collect thousands of photos and videos of a highly photographed person like, say, Steve Buscemi, and superimpose his facial features onto videos of other people, producing the uncanny face-swap clips that have circulated widely online.
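To make the mechanics concrete, here is a deliberately crude sketch of the “superimpose a face” idea using OpenCV’s bundled face detector. The file names are placeholders, and a simple pixel paste like this is only a cartoon of the process; real deepfakes train neural networks on thousands of images to synthesize the replacement face frame by frame.

```python
# Toy illustration (not a real deepfake pipeline): detect a face in a
# target frame and paste a resized source face over it with OpenCV.
# File names are assumptions for the sake of the example.
import cv2

# Haar cascade face detector that ships with OpenCV
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

source_face = cv2.imread("source_face.jpg")    # cropped photo of person A
target_frame = cv2.imread("target_frame.jpg")  # frame from a video of person B

gray = cv2.cvtColor(target_frame, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Resize the source face to the detected region and overwrite the pixels.
    resized = cv2.resize(source_face, (w, h))
    target_frame[y:y + h, x:x + w] = resized

cv2.imwrite("swapped_frame.jpg", target_frame)
```

The crude copy-paste above produces an obvious fake; the point of deepfake models is to learn a mapping between faces so the blend of lighting, expression and motion is seamless across every frame.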

So, if you’re a regular person without thousands of photos and hours of video coverage on the internet, what’s the worst that could happen?

Video typically serves as validating footage for a news story, because seeing is believing. When we watch a story unfold, we’re more likely to internalize the narrative and take it seriously. So it’s no wonder that deepfakes are a threat to news literacy. They threaten our very ability to believe what happens before our eyes.

The latest example might also serve as the best example, if by “best” we mean “disaster.”

Alt-right conspiracy theorist Alex Jones is currently facing multiple lawsuits from survivors and families of the tragic Sandy Hook shooting, because he spread falsified videos suggesting the shooting was a “giant hoax.”

The emotional and mental damage to the families is likely irreparable, as is the impact of the conspiracy in communities looking for reasons to villainize families who have already been traumatized by mass shootings.

Such deepfake videos then serve as a blueprint for would-be criminals who may want to create this content to frame others for crimes they didn’t commit, implicate innocent people or manufacture chaos around events that never happened.

Of course, creators of deepfakes will always be limited by the size of their datasets. But as private citizens increasingly overshare user-generated content, they feed ever more data about themselves onto the internet for companies and bad actors to abuse.

The technology does have a few positive outcomes: it expands opportunities for satirical and critical content as the journalism field continues to uphold freedom of speech and its accountability role as a government watchdog.

For journalism, however, deepfakes have dangerous implications for copyright law, defamation suits and the verification of footage. Courts currently require that any photos or videos submitted as evidence be authenticated, but no federal law mandates the means through which a video must be verified.

For reporters, this complicates news-gathering: they must verify not only the source of the footage, but also its authenticity. Hopefully, news publications will continue to explore ways of leveraging digital technology to make reporting more accurate.
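No single verification method is mandated, but one common building block, offered here as a minimal sketch rather than a prescribed standard, is a cryptographic hash: if a newsroom records the SHA-256 digest of footage when it is received, any later edit to the file changes the digest. The file names below are illustrative assumptions.

```python
# Minimal sketch: fingerprint a video file with SHA-256 so that any
# subsequent modification to the file can be detected by re-hashing it.
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Illustrative file names (assumptions):
original = sha256_of_file("footage_as_received.mp4")
later = sha256_of_file("footage_to_publish.mp4")
print("unchanged" if original == later else "file has been modified")
```

A matching digest only proves the file has not changed since it was first hashed; it says nothing about whether the original capture was itself genuine, which is exactly the harder problem deepfakes create.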

In the meantime, communicators should focus on leveraging opportunities to engage with communities about media literacy. People with low media literacy have trouble discerning real information from falsified information, and often share news stories on social media without taking the time to verify them first.

Citizens, of course, will always bear the duty to search for the truth. It is a person’s responsibility to be an informed consumer of the news. But without the help of a community focused on literacy, and a profession of journalists dedicated to delivering only truthful information, readers would be left to their own devices.
