
Imogen Sadler finds that the current Government’s unfounded ECHR concerns are undermining efforts to criminalise the creation of non-consensual deepfake pornography
What is Deepfake Pornography?
In a recent announcement, the Ministry of Justice confirmed that the current Government, after much pressure from campaigners and activists, would finally make creating sexually explicit ‘deepfake’ images a criminal offence.
For those lucky enough not to be familiar with this concept, deepfake images are created when a person’s likeness is superimposed onto sexually explicit images using artificial intelligence, without their consent. The vast majority of the targets of this behaviour are women, with some estimates putting the figure as high as 99%, and its use is growing exponentially, with the most popular deepfake website receiving up to 17 million hits a month.
One can only imagine the damage caused to women who suddenly find out that without their knowledge and their consent, realistic images of them in sexually compromising positions have been seen by thousands of strangers. The effect is no doubt destructive. The impact has been widely reported including by the BBC and Channel 4 News.
Current legislation
Sharing such images is already an offence: the previous Conservative government introduced an offence of sharing intimate images (including deepfakes), which came into force on 31 January 2024 through section 188 of the Online Safety Act 2023, inserting section 66B into the Sexual Offences Act 2003.
However, this did not cover the creation of deepfake images, leaving a lacuna in the law. It seems fairly self-explanatory that the very act of creating a deepfake of another individual without their consent is itself hugely violating. Although the previous Conservative government had introduced legislation to cover this by way of an amendment to the Criminal Justice Bill, that Bill never progressed through Parliament due to the 2024 General Election.
Proposed legislation
Surprisingly, despite a manifesto commitment to ban the creation of deepfakes, a high-profile Private Members’ Bill in the House of Lords and the voices of many passionate activists and campaigners, it took until 6 January 2025 for an official announcement to come. This promised the introduction of “a new offence meaning perpetrators could be charged for both creating and sharing these images” as well as “creating new offences for the taking of intimate images without consent and the installation of equipment with intent to commit these offences”. These offences, the announcement stated, would be included in the new Crime and Policing Bill, which will be introduced “when parliamentary time allows”.
Whilst the Crime and Policing Bill has not yet been published and will no doubt be examined with interest, the content of the proposed legislation is not looking promising. In the meantime, the Government introduced a proposed amendment to the Data (Use and Access) Bill [HL]. This amendment, a copy and paste of the Conservative amendment mentioned above, would have created two offences where the following conditions are met:
(1) A person (A) commits an offence if—
(a) A intentionally creates a purported sexual image of another person (B),
(b) A does so with the intention of causing B alarm, distress or humiliation, and
(c) B does not consent to the creation of the purported sexual image.
(2) A person (A) commits an offence if—
(a) A intentionally creates a purported sexual image of another person (B),
(b) A does so for the purpose of A or another person obtaining sexual gratification,
(c) B does not consent to the creation of the purported sexual image, and
(d) A does not reasonably believe that B consents.
This amendment was widely, and understandably, criticised by campaigners due to what they describe as its intent-based rather than consent-based nature: namely that, in order for an offence under sub-clause (1) to be proved, the prosecution must be able to establish that the creation of the image was done with the intention of causing alarm, distress or humiliation, or, under sub-clause (2), that a purported sexual image was created in order to obtain sexual gratification. One can only imagine the expressions of horror in court as creators of these images were acquitted by juries, directed by judges that, so long as the images were not created with the intention of distressing the women depicted, nor for the purpose of sexual gratification, the defendants must be found ‘not guilty’.
Lawyers will recognise the difference here as that between specific and basic intent. Specific intent requires an intention to achieve something beyond the act itself (in this case ‘causing B alarm, distress or humiliation’ or ‘obtaining sexual gratification’). Offences of basic intent require only an intention to commit the act itself. To take a commonly cited example, under the Criminal Damage Act 1971 (CDA 1971) an individual who had set fire to someone else’s house could be charged with basic arson under CDA 1971, section 1(1), or aggravated arson under CDA 1971, section 1(2). In court, to prove basic arson, only the test set out in section 1(1) would need to be satisfied, whereas to prove aggravated arson, the court would require proof of the ulterior element of endangering life, whether intentionally or recklessly.
The Government’s decision to go down the specific intent route in this case is a strange one; given that it is the very act of creation which causes the harm to the victim, the intention with which it is done seems irrelevant. To take an unlikely hypothetical, say an individual creates a deepfake pornographic image of a friend purely out of curiosity – this does not lessen the damage done or the effect on the victim. The offence of rape does not require specific intent: it is recognised that the act by its nature causes harm enough. Why, therefore, should the creation of these images be any different?
The Human Rights Argument
On Sunday 26 January 2025, a Sunday Times article reported that this amendment had (very sensibly) been dropped by the Government, who now state they are looking into drafting an offence which does not require this specific intent.
Somewhat alarmingly, the Sunday Times reported that the Government felt the specific intent requirement was necessary because, without it, the proposals would not be compatible with Article 10 of the European Convention on Human Rights (ECHR), which protects freedom of expression. This argument is astonishing, and a brief overview of ECHR case law makes it clear that any such objection would have very little grounding.
Article 10, which is a qualified right, states the following:
Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.
The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.
Given Article 10(2), there is a clear argument that by curtailing the production of deepfake pornography through legislation, a State would be acting in compliance with the qualification in Article 10(2). As to its first requirement, it would be acting in a way prescribed by law so long as the law was drafted “with sufficient precision to enable the citizen to regulate his conduct; he must be able – if need be with appropriate advice – to foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail” (Delfi AS v. Estonia, application no. 64569/09, paragraph 71). And as to its second requirement, Article 10(2) makes it clear that the right to freedom of expression can be restricted “for the protection of the reputation or rights of others”.
Article 8 ECHR incorporates a right to protection of reputation as part of the right to respect for private life (see Chauvy and Others v. France, application no. 64915/01, paragraphs 43–45) and one can easily see how a deepfake pornographic image or video could damage its victim’s reputation.
The Sunday Times article also references Article 3 ECHR, which prohibits “inhuman or degrading treatment”, citing an argument that this right would “outweigh” freedom of expression under Article 10 ECHR – presumably because having a deepfake image made of oneself could be classed as “inhuman or degrading treatment”, and because such a law would constitute the protection of the rights of others under Article 10(2). At first blush, this argument seems sympathetic, and Article 3 does carry with it a positive obligation to protect individuals from “inhuman or degrading treatment”. Case law such as MC v. Bulgaria, application no. 39272/98, makes it clear that this can apply to rape and sexual assault (paragraph 149) and to the effective investigation of such crimes (paragraph 151). However, the argument should be treated with caution, as the bar for an infringement of Article 3 is very high and the case law has not yet grappled with this issue.
But to return to Article 10: a further examination of ECHR case law twists the knife in even further. ECHR case law makes it clear that Contracting States have a wide margin of appreciation when it comes to issues of morality (see Mouvement raëlien suisse v. Switzerland, application no. 16354/06, paragraph 76), including sexual morality. In Müller and Others v. Switzerland, application no. 10737/84, paragraphs 36 to 43, the Strasbourg Court upheld the conviction of an artist who had exhibited paintings depicting sexual relations between humans and animals, rejecting the artist’s reliance on Article 10. Whilst no one is seeking to compare non-consensual deepfake pornography to the “vulgar” artwork of the Müller case (which of course raised no issue of the consent of its subjects), the question remains: if the Court found that it was within Switzerland’s margin of appreciation to subject to criminal penalties artists whose art offended public morals, how can one argue that it would not be within the UK’s margin of appreciation to subject to criminal penalties individuals whose creations (even further from art than the original trial judge in the Müller case found the Swiss paintings to be (see paragraph 14)) also violate the consent of the individuals they purport to portray?
To conclude, and to be somewhat blunt about it: Article 10 of the ECHR was created to protect freedom of speech and expression. It was not created to protect creeps from creating non-consensual deepfake porn.
When the Government comes to the further drafting of its proposed legislation, it can be in no doubt that the only way forward is an offence of basic intent, which will ensure that victims do not have to go through the traumatising experience at trial of demonstrating the self-evident “alarm, distress or humiliation” caused by this behaviour. It must do better.

Imogen Sadler is a barrister at 4–5 Gray’s Inn Square, specialising in public law. She was awarded the Lord Denning Scholarship, Hardwicke Award and European Scholarship by the Honourable Society of Lincoln’s Inn, and has also recently been appointed to the Equality and Human Rights Commission panel of counsel.