By Steve Hawley, Piracy Monitor
It caught my eye that a movie promotion was inviting consumers to upload their social media images and insert deepfaked versions of themselves into a trailer for the 2021 Warner Bros. movie Reminiscence, creating a first-person experience. While it was jarring to see, studio interest in deepfakes is not new: Disney Research published a paper on the subject in 2020, noting that smartphone technology puts the technique within reach.
Piracy Monitor also took note of a 2019 collaboration between Samsung and Russian researchers that demonstrated passable deepfakes generated from a single smartphone image. The Reminiscence promo puts the concept into commercial use, although likely not by the same developers.
For reference
- Read more about Warner Bros. and Reminiscence in Protocol
- Visit the Warner Bros. promo Web site, which generates the deepfake
- Read about the 2020 Disney study, and read the study itself.
- Check out DeepFaceLab, an open source deepfake creator
- Read about the face-swapping app Zao, which was available in 2019.
Why it matters
Piracy Monitor segments piracy into several categories, including the theft of content, of services, of service delivery infrastructure, of advertising, of devices and software, and the ‘Theft of You.’ By ‘Theft of You,’ though, I was referring to the theft of your personal information, a personal account, or resources that you use, which is something the media and entertainment industry increasingly warns consumers about.
This is something different: a media company using you. I imagine it is legally defensible in this case because the consumer opts in to placing their image on the site.

In the bigger picture, while deepfakes can be loads of fun and very attractive to advertisers, there is also a dark side. Social media accounts can be scraped for images, and yours probably holds thousands of pictures of you in every imaginable pose, so there is no reason an unscrupulous individual (or a piracy operation) couldn't use your likeness to lure you into downloading a malicious app update or subscribing to an illegal service.
I think this situation is a call for media companies to take greater responsibility for putting ideas out into the world that can be turned against their audiences, and against the media companies themselves.