Deepfakes Threaten the Veracity of Video

It’s getting harder and harder to tell what’s real. Filters are distorting our self-perception, and fake news is a legitimate concern. What about video? Is what you’re seeing factual? Is that post from an unknown source, or even a repost from a trusted friend, the real deal? Deepfakes are making the veracity of video nearly impossible to determine, and we’re all growing increasingly suspicious of whether seeing is believing.

What is a Deepfake and How Does it Work?

A deepfake is named for deep learning, the AI technique used to create false video and audio. This technique makes it possible to literally put words in someone’s mouth. Using video and stills of an individual, deepfake software generates forgeries good enough to fool a machine learning model. A subject’s face and voice are analyzed as they speak, then recreated to say whatever the user wants them to say. Although the technology is still in its infancy, some very convincing videos have already been produced and advances are being made all the time.
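For the technically curious, here is a heavily simplified sketch of the shared-encoder, two-decoder idea behind many face-swap deepfakes: one network learns to compress any face into a compact code, a separate decoder per person learns to rebuild that person’s face from the code, and decoding person A’s expression with person B’s decoder produces the swap. It assumes PyTorch, stands in random tensors for real face crops, and every layer size and name is illustrative rather than taken from any particular deepfake tool.

```python
# Minimal, illustrative sketch of a shared-encoder / two-decoder face swap.
# Not production code: random tensors stand in for aligned face crops, and the
# sizes below are arbitrary. Assumes PyTorch is installed.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB face crop (assumed input size)

def make_decoder():
    return nn.Sequential(nn.Linear(256, 1024), nn.ReLU(),
                         nn.Linear(1024, IMG), nn.Sigmoid())

encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, 256))
decoder_a = make_decoder()  # learns to reconstruct person A's face
decoder_b = make_decoder()  # learns to reconstruct person B's face

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

faces_a = torch.rand(32, IMG)  # placeholder batch of person A's faces
faces_b = torch.rand(32, IMG)  # placeholder batch of person B's faces

for step in range(100):  # real training runs for many thousands of steps
    optimizer.zero_grad()
    # Each decoder learns to rebuild its own person from the shared encoding.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a) +
            loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    optimizer.step()

# The "swap": encode person A's expression, decode with person B's decoder,
# yielding person B's face performing person A's expression.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Scaled up with convolutional networks, careful face alignment, and blending back into the original frame, this is roughly how popular face-swap tools work; voice cloning relies on analogous models trained on audio.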

Implications & Dangers

The possible uses for deepfake video run the gamut. At a video production house like Key West Video, the technology could let us fix bloopers in an address to camera or seamlessly translate an English video into French. On the flip side, a deepfake can do real damage, with far-reaching negative effects that could even spark violent social unrest.

The possible implications for the 2020 US presidential race and election have caused concern and compelled groups to work on software to combat fakes. The ability to make a convincing fake video of a political candidate saying or doing something damaging is real, and propaganda spread through a deepfake could lead to information warfare. Consider what happened in 1938, when a radio broadcast of The War of the Worlds reportedly caused panic among listeners who took the fictional invasion for news.

There is concern deepfakes could be used for political manipulation

Some consider deepfakes a matter of national security. The Defense Advanced Research Projects Agency (DARPA) is a branch of the US government that develops military technologies. Experts at DARPA call synthetic-media detection a “defensive technology” designed to counter foreign adversaries and domestic political antagonists.

Some actors have been victims of deepfakes when their faces were used in adult films.

Legal Ramifications?

On June 13, the US House Intelligence Committee held a hearing on deepfake technology to assess its power and potential. Witnesses spoke about the political risks, but also about threats to business: a false video of a CEO speaking about their company could move stock prices, and the financial damage would be done long before the fake was exposed.

Should deepfakes be identified as “fake”?

Rules and regulations are tricky when it comes to the internet, and there are currently no laws regulating the use of deepfakes. Some believe these false representations should fall under libel, defamation, identity fraud, and impersonation laws. Others worry about over-regulation, with parody and satire potentially banned under such rules.

Deepfake Examples

Fake videos aren’t a new phenomenon, but the fakes are more believable than ever, and living in a digital world means they spread in real time. Consider this: would the Arab Spring have unfolded the same way if deepfakes had been mixed in with the videos that helped propel the movement? The veracity of shared video shapes how information spreads and how viewers react.

Here are a few recent examples of altered video that led to the formation of opinion, judgment, and action:

  • A video of US House Speaker Nancy Pelosi (D-Calif.) was slowed down to make her appear drunk; it was viewed more than three million times.
  • A deepfake of Mark Zuckerberg, posted to Instagram, appeared to show him boasting about his control over users’ data.
  • Creative editing of press conference footage made a CNN reporter look as if he had assaulted a White House intern.

Ask Yourself…

Efforts are being made to identify and expose deepfakes so we can all be made aware when the video we’re watching has been altered. Researchers have designed systems that analyze light, shadows, and blinking patterns, which can act as telltale signs of faked video. Digital forensics, also known as computer forensics, applies scientific methods to the investigation of digital crimes and attacks. It’s also our responsibility as viewers to question material that doesn’t seem authentic and to seek out sources and verification.
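To make the detection idea a little more concrete, here is a toy Python sketch of one telltale sign researchers have looked at, the blink rate: early deepfakes often blinked far less than real people do. The eye-aspect-ratio readings, thresholds, and frame rate below are made-up values for illustration; an actual system would pull them from a facial landmark tracker and combine many such cues.

```python
# Toy blink-rate check, for illustration only. A real detector would compute
# the eye-aspect ratio (EAR) per frame with a facial landmark tracker; here the
# readings, threshold, and frame rate are hypothetical.

def count_blinks(ear_values, threshold=0.2, min_closed_frames=2):
    """Count blinks as runs of frames where the eye-aspect ratio dips below threshold."""
    blinks, closed_run = 0, 0
    for ear in ear_values:
        if ear < threshold:
            closed_run += 1
        else:
            if closed_run >= min_closed_frames:
                blinks += 1
            closed_run = 0
    if closed_run >= min_closed_frames:
        blinks += 1
    return blinks

def looks_suspicious(ear_values, fps=30, min_blinks_per_minute=5):
    """Flag a clip whose blink rate is implausibly low for a real person."""
    minutes = len(ear_values) / fps / 60
    if minutes == 0:
        return False
    return count_blinks(ear_values) / minutes < min_blinks_per_minute

# Example: ten seconds of fake per-frame EAR readings containing a single blink.
sample = [0.3] * 150 + [0.1] * 3 + [0.3] * 147
print(count_blinks(sample), looks_suspicious(sample))  # -> 1 False
```

We hope to see the day when deepfakes can be used to enhance video production without endangering democracy.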
