As we have written before, new techniques for creating false but remarkably realistic videos, known as “deepfakes,” pose a risk that video evidence will be altered or fabricated. Lawyers who work extensively with video evidence therefore need to develop the ability to detect video that has been compromised by deepfake technology.
Norton, a leading company in the field of digital security, has published a useful guide to spotting deepfakes. The article identifies fifteen signs that a video may have been altered:
1. Unnatural eye movement. Do the subject’s eyes move in ways that don’t seem natural? Or is there no movement or blinking?
2. Unnatural facial expressions. Stitching one image over another can result in a mismatch of expressions.
3. Awkward facial-feature positioning. Sometimes grafting an image will result in features being misaligned.
4. A lack of emotion. The visual image may not match the emotion apparent in the voice.
5. Awkward-looking body or posture. Deepfake artists generally focus on facial features, and may overlook discrepancies in body position or posture.
6. Unnatural body movement. Jerky or disjointed movement, especially when turning or moving the head, can signal a deepfake.
7. Unnatural coloring. Abnormal skin tone, discoloration, strange lighting, or misplaced shadows may reveal a fake.
8. Hair that doesn’t look real. Frizzy or flyaway hair is hard to generate, so too-perfect hair may be a giveaway.
9. Teeth that don’t look real. Individual teeth are difficult to reproduce, so an absence of outlines of individual teeth could be a clue.
10. Blurring or misalignment. If the edges of images are blurry or visuals are misaligned, such as where the neck meets the body, the image may be manipulated.
11. Inconsistent noise or audio. Deepfake creators usually spend more time on the video images than on the audio. Poor lip-syncing, robotic-sounding voices, strange word pronunciation, or digital background noise may be giveaways.
12. Images that look unnatural when slowed down. Watching on a large screen, slowing down the video, and zooming images may reveal fakes.
13. Hashtag discrepancies. Video creators insert hashtags at certain places throughout a video to show that their videos are authentic. If the hashtags change, the video may have been altered.
14. Digital fingerprints. Blockchain technology is also used to create a digital fingerprint for videos. When a video is created, the content is registered to a ledger that can’t be changed. This technology can help prove the authenticity of a video.
15. Reverse image searches. Searching for the original image, or running a computer-assisted reverse image search, can unearth similar videos online.
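The digital-fingerprint idea in item 14 rests on cryptographic hashing: changing even a single byte of a file produces a completely different hash, so a fingerprint recorded at creation time can later expose any alteration. The following is a minimal sketch in Python using the standard hashlib library; the file names are hypothetical, and real evidence workflows would layer chain-of-custody procedures on top of this.

```python
import hashlib


def video_fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 fingerprint of a video file, reading in chunks
    so that large files never have to fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical usage: compare the fingerprint registered when the video
# was created against the fingerprint of the copy received in discovery.
# original = video_fingerprint("deposition_original.mp4")
# received = video_fingerprint("deposition_received.mp4")
# if original != received:
#     print("File differs from the registered original.")
```

A match shows the file is bit-for-bit identical to what was registered; a mismatch shows only that something changed, not what or why, so the technique proves integrity rather than authenticity of the original recording.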
Deepfake creators are aware of these telltale signs and continually adapt their techniques to make detection harder. As in many areas of technology, the skills needed for competent representation are evolving, and the lawyer’s duty to keep those skills current evolves with them. Comment 8 to Rule 1.1 of the Pennsylvania Rules of Professional Conduct, which addresses competence, states, “To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology.” Lawyers who work regularly with video evidence may find it advisable to develop these skills so that they are competent to protect their clients from fraudulent evidence.