The UK’s Online Safety Act Is Not Enough To Address Non-Consensual Deepfake Pornography
The article was originally published in Tech Policy Press.
“What if I told you it takes only as many clicks to reach a site offering tools and tutorials on how to make non-consensual deepfake pornography as it does to reach a page on how to make an omelette? Surely that can’t be true? Unfortunately, it is,” writes our journalist, Manasa Narayanan.
“Despite all the recent furor over pornographic deepfakes after Taylor Swift was targeted, search engines continue to actively serve up sites and forums used to create and circulate deepfake pornography. In fact, if you simply search ‘deepfake pornography’ on Google (without even adding ‘watch’ or ‘create’), the top three or even top five results will likely be non-consensual deepfake pornography sites; most of the time, even the Wikipedia entry on the issue does not appear in this list, much less news articles or resources that help victims.
There is a ‘cottage industry based on violating consent,’ said Sophie Compton, the British co-director of Another Body, a documentary film about non-consensual deepfakes, speaking at a recent event on intimate image abuse. Only, I am not sure it is cottage-sized anymore, but something much larger.”