Online Sexual Violence Regulation Australia: What Needs to Change to Keep People Safe Online
Online sexual violence regulation in Australia is failing to keep pace with the scale, speed and seriousness of harm occurring in digital spaces. Sexual violence no longer occurs only behind closed doors. It is increasingly facilitated, normalised and amplified online through platform design, algorithms and profit-driven business models.
A recent issues paper by Sexual Assault Services Victoria, Turning Back the Tide: Exploring regulatory approaches to addressing sexual violence and harm online, makes clear that current regulatory frameworks are not sufficient to address the realities of technology-facilitated sexual harm. Rather than focusing on individual responsibility or reactive takedowns, the paper calls for systemic reform that places accountability squarely on technology companies.
In Australia, the eSafety Commissioner is the independent government regulator responsible for online safety, including tackling serious online abuse, illegal and restricted content, and promoting safer digital spaces for all Australians — adults and children alike. The Commissioner’s powers and role are set out under the Online Safety Act and include investigating complaints about harmful online behaviour and holding platforms accountable.
What follows are the key recommendations from the report, and why they matter.
A clear legal duty of care for technology platforms
A central recommendation is the introduction of a mandatory duty of care on online platforms.
For years, technology companies have framed sexual violence, exploitation and abuse as unavoidable consequences of user behaviour. The paper rejects this position. Platforms actively shape what users see, promote and consume through algorithms and recommender systems that they design and control.
A duty of care would require platforms to anticipate foreseeable harm, design services with safety built in by default, and be held legally accountable when their systems facilitate or amplify sexual violence, misogyny or child exploitation. This shifts responsibility away from victims, parents and children, and places it where it belongs — on the companies that profit from unsafe digital environments.
Regulating pornography content, not just access
Age verification alone is not enough to address the harms associated with online pornography.
The paper recommends mandatory national standards that prohibit pornography depicting sexual violence, strangulation, rape, coercion, incest themes or content that sexualises children. This material should be treated as refused or illegal content, rather than merely restricted.
These recommendations recognise pornography as a public health and gender-based violence issue. Research consistently links exposure to violent and misogynistic pornography with the normalisation of sexual violence, particularly among young men and boys. Regulating access without regulating content fails to address that reality.
Stronger penalties and proactive detection of child sexual abuse material
Despite being illegal, child sexual abuse material continues to circulate widely online.
The paper calls for significantly stronger penalties for platforms that fail to prevent or remove CSAM, including penalties calculated as a percentage of global revenue. It also recommends mandatory proactive detection systems, bans on AI “nudify” tools, and improved coordination between regulators, law enforcement and the financial sector.
Current enforcement mechanisms are not strong enough to compel compliance from global technology companies. Without meaningful consequences, children remain exposed to ongoing and escalating harm.
Holding platforms accountable for harmful algorithms
Algorithms are not neutral. They are designed to maximise engagement and profit, often by promoting increasingly extreme content.
The paper highlights growing evidence that recommender systems actively push users towards misogynistic, violent and exploitative material, including content that excuses or promotes sexual violence. Young men and boys are particularly vulnerable to these pathways.
Key recommendations include greater regulatory oversight of algorithms, transparency requirements, user control over algorithm-driven content, and limits on the use of children’s data. Without regulating algorithms, content moderation will always occur after harm has already been done.
Moving beyond piecemeal regulation
The paper makes clear that fragmented, reactive regulation is ineffective. Sexual violence online is global, fast-moving and embedded in platform business models.
Effective reform requires coordinated national and international regulation that targets systems rather than individuals. It must prioritise safety, dignity and human rights while ensuring survivors, educators and marginalised communities are not silenced in the process.
Online sexual violence is not inevitable. It is the result of regulatory choices and commercial incentives. The recommendations outlined in Turning Back the Tide provide a clear roadmap for reform that centres victim-survivors and places responsibility where it belongs.
Support is available through 1800RESPECT on 1800 737 732 or at www.1800respect.org.au