Deepfake Pornography and the Law

In the last year, there has been widespread concern internationally about the proliferation of AI-generated ‘deepfake’ pornography. In Australia, the use of deepfake pornography has in some instances resulted in litigation under the Online Safety Act 2021. However, existing laws are generally considered inadequate for dealing with deepfake pornography. This page outlines how existing laws can be used to take action over the creation and distribution of deepfakes, as well as proposals for new deepfake-specific laws.

What is deepfake porn?

Deepfake images are realistic images of real persons that have been digitally altered to show the person doing or saying something that they never actually did or said. They can easily be mistaken for real photos or videos.

Deepfake images can be used for many purposes. For example, they can be used to create a video of a politician saying something that they did not actually say. However, the vast majority of deepfake images created and shared are pornographic.

Deepfakes can be created within a matter of minutes using technology that is publicly available.

Online Safety Act 2021

In Australia, section 75 of the Online Safety Act 2021 prohibits posting an intimate image of a person online without their consent. This is a civil penalty provision, attracting a maximum penalty of 500 penalty units. However, the Act does not penalise the mere creation of intimate images of a person without their consent.

Section 75 can be relied on by a victim or by the eSafety Commissioner to take legal action against a person who distributes deepfake pornography. The Commissioner may issue the person with a formal warning as well as give a removal notice to the social media service or internet service involved. The person can be fined if they fail to comply with the removal notice.

However, this avenue has limitations. Firstly, litigation is costly. Secondly, the perpetrator may well be a person with limited income and assets and, as such, effectively judgment-proof. It is also often impossible to identify the person responsible for distributing the images.

Criminal Code Act 1995

Under section 474.17(1) of the Criminal Code Act 1995, it is an offence to use a carriage service in a way that a reasonable person would regard as menacing, harassing or offensive. This offence carries a maximum penalty of three years’ imprisonment, or five years for an aggravated offence.

State legislation

In some Australian states, specific offences involving image-based abuse have been legislated. For example, Victoria and South Australia have introduced summary offences involving the non-consensual sharing of intimate images in response to the issue of deepfake porn.

eSafety Commissioner v Rodondo

In 2023, the eSafety Commissioner took action against Anthony Rodondo after he published deepfake pornographic images of several women on his website.

Upon investigating the matter, the eSafety Commissioner issued Rodondo with a removal notice, requiring him to:

  • remove the material from the site; and
  • delete all intimate images of the individuals in question.

Rodondo did not remove the images and subsequently travelled to Queensland from the Philippines, where he resides. The eSafety Commissioner commenced proceedings against him in the Federal Court of Australia for his contraventions of the Online Safety Act. The court ordered Rodondo to remove the images from his site and to refrain from posting or sending any more intimate images.

Rodondo did not comply with the orders and was subsequently charged with three counts of contempt of court, for which he received fines totalling $15,000.

Rodondo was also charged with six criminal offences including obscene publications and exhibitions.

Challenges of prosecuting deepfake porn offences

The sharing of deepfake pornography can be very difficult to prosecute. While the posting of a deepfake may be traceable to an IP address, the person responsible can easily conceal their IP address by using a VPN, making them very difficult, if not impossible, to trace.

Many victims of deepfake pornography do not know the perpetrator and cannot produce evidence of where the images came from. Furthermore, if the perpetrator is outside of Australia, it may be difficult or impossible to enforce Australian laws against them.

Victims of this type of crime may also be reluctant to report the abuse due to fears of being blamed or a sense that it is not serious enough to warrant investigation as no physical harm has occurred.

If you require legal advice or representation in any legal matter, please contact Go To Court Lawyers.


Fernanda Dahlstrom

Fernanda Dahlstrom has a Bachelor of Laws from Latrobe University, a Graduate Diploma in Legal Practice from the College of Law, a Bachelor of Arts from the University of Melbourne and a Master of Arts (Writing and Literature) from Deakin University. Fernanda practised law for eight years, working in criminal defence, child protection and domestic violence law in the Northern Territory. She also practised in family law after moving to Brisbane in 2016.