
Deepfakes, Pornography and Consent

Author
  • Claire Benn (Australian National University)

Abstract

Political deepfakes have prompted an outcry about the diminishing trustworthiness of visual depictions, and about the epistemic and political threat this poses. Yet, in reality, this new technique is being used overwhelmingly to create pornography, raising the question of what, if anything, is wrong with the creation of deepfake pornography. Traditional objections focusing on the sexual abuse of those depicted fail to apply to deepfakes. Other objections, that the use and consumption of pornography can harm the viewer or other (non-depicted) individuals, fail to explain the objection that a depicted person might have to the creation of deepfake pornography that utilises images of them. My argument offers just such an explanation. I demonstrate that there are two ways in which an image can be ‘of us’, both of which can exist in the case of deepfakes and can ground a requirement for consent. Thus, I argue: if a person, their likeness, or their photograph is used to create pornography, their consent is required. Whenever the person depicted does not consent (or, in the case of a child, cannot consent), that person is wronged by the creation of deepfake pornography and has a claim against its production.

Keywords: consent, deepfakes, doctored, pornography, privacy, photography