‘Deepfakes’: A New Challenge for Trial Courts


The following was excerpted from an article that will appear in a future issue of NWLawyer. The author was also recently interviewed for the “What’s Next” newsletter on LAW.COM.

The client shows his lawyer a video he says he took on his cell phone. It shows the defendant saying things that, if seen by the jury, will be a slam-dunk for the client’s case. The attorney includes the video in her list of evidence for trial, but the defendant’s lawyers move to strike. They claim it’s a fake. What’s the plaintiff’s lawyer—and the judge—to do?

Welcome to trial practice in the new world of “deepfake” videos. A portmanteau of “deep learning” and “fake,” deepfake programs use artificial intelligence (AI) to produce forged videos of people that appear genuine. The technology lets anyone “map” their own movements and words onto someone else’s face and voice, making the target appear to say things he or she never said. The more video and audio of the person that can be fed into the computer’s deep-learning algorithms, the more convincing the result. For example, last year University of Washington researchers used algorithms they’d created to make a realistic, but phony, video of former President Obama out of actual audio and video clips. Jennifer Langston, “Lip-Syncing Obama: New Tools Turn Audio Clips into Realistic Video,” UW News (July 11, 2017). But it doesn’t take a UW computer science degree to make a deepfake: the technology is freely available and fairly easy for anyone to use. Its usability, and the verisimilitude of its output, will keep improving over time.

The advent of deepfakes will affect the nation’s lawyers and courts in multiple ways. For one, victims of deepfakes will likely generate ample litigation under various tort and fraud theories. The point of this article, though, is to explore what the courts will do with deepfakes in the evidentiary context. The points where deepfakes could infect a court case run the gamut from clients who fabricate evidence in order to win, to fake videos ending up in archives that have historically been considered trustworthy. In the not-too-distant future, litigators will have to get creative in addressing these challenges, navigate ethical pitfalls, and manage the doubts jurors will have about trusting what’s real.