March 02, 2020
Near the end of 2017, mainstream media began covering a new technology described as a weapon for propaganda. While the idea wasn’t exactly new, the ease and availability of the technology frightened a lot of people. That technology became known as video synthesis, or “deepfakes.”
Deepfakes – for those unaware – are videos produced by algorithms that can convincingly edit footage of people to quite literally put words in their mouths. Using a combination of facial mapping, artificial intelligence, and other technologies, creators can take existing video and manipulate the message into anything that strikes their fancy. Deepfakes have been called one of the most serious threats democracy faces; they have been widely condemned by politicians and approached with a mix of fear and excitement.
While the specific technologies being used for deepfakes may be new, we’ve seen similar alterations and manipulations of video for decades. Whether going back to the DirecTV ad campaign that recreated classic film moments to sell satellite television – accomplished by combining recreated scenes with de-aging programs – or looking even further back to the insertion of Forrest Gump into historical footage, creating something new from existing footage is hardly a novel concept.
So why all the fervor and fear over the technology? Well, in the wrong hands (just like any other technology), deepfakes could be used to do significant damage. We’ve already seen deepfakes used to put words in the mouths of politicians like President Barack Obama, Nancy Pelosi, and Vladimir Putin. While these have largely served as comedy on YouTube, deeper concerns around political sabotage and propaganda have repeatedly been raised.
Is it feasible that a political attack ad (sanctioned by the opposition or otherwise) could be created to give the impression that a candidate said something they really didn’t? In the ever-sprinting news cycle, would it matter if the deepfake was found and removed? Would the damage already be done? All of these questions are being raised by politicians and their campaign teams alike.
While these are serious questions that do need to be addressed, is the fear over a few bad actors causing many to toss out the baby with the bathwater? Perhaps the benefits and possibilities these new technologies offer could do more good than harm.
Recently, the Malaria No More campaign partnered with an early pioneer in the video synthesis space to create a video with David Beckham. Using a single baseline video, they were able to make it convincingly appear that Beckham speaks nine different languages. This allowed the campaign to use the same footage of Beckham in multiple countries and markets without needing to shoot separate videos or teach Beckham multiple languages. By simply manipulating the original video, the campaign dramatically increased the audience its message could reach.
Beyond the philanthropic angle, the potential for localized targeting and individualized marketing messages grows significantly when a message can be tailored based on audience metrics. Instead of keeping talent in a studio all day delivering a separate message for every audience, a single recording can be tweaked to make it more personal.
This could mean the ability to utilize the same video assets for placements in the U.S., Mexico, and Canada without the need for talent that speaks English, Spanish, and French – saving money on talent fees and time to market. Being able to show the desired talent across an entire campaign also keeps the message consistent for all audiences regardless of language barriers.
For good or bad, the need to check the veracity of information rather than take it at face value cannot be ignored. The more technology advances what we’re able to present to a potential audience, the more incumbent it is on us to stay on top of those advances and make them work for us. In a world of disinformation, context – honest context – is not only extremely powerful but a necessity.