Video plays a major role in fake news. Online footage can be repurposed from earlier tragedies, or even from video games, to foment outrage. Fake videos can employ actors. In the US, tragedies are often followed by a proliferation of conspiracy videos.
Unfortunately, the fake video problem is about to get a lot worse.
New tools allow individuals to quickly, cheaply, and effectively substitute faces in videos in ways that are extremely believable. These videos, termed “deepfakes”, are popular in specialty communities on Reddit and elsewhere, but are poised to become far more mainstream.
Those wishing to sow dissent or chaos now have an incredibly powerful new tool at their disposal. What happens when a seemingly authentic video surfaces of Donald Trump screaming for a nuclear attack?
Unfortunately, the tools for identifying fake videos aren’t keeping pace with the technology for producing them. Experts in spotting fakes still need to magnify footage frame by frame to analyze, for example, shadow patterns.
Educating people about the new technology will be necessary, though it will carry an unfortunate side effect: casting doubt on the veracity of all video.