Sharing of Abhorrent Violent Material Act Fact Sheet
Information on the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019.
The Christchurch terrorist attack on 15 March 2019 demonstrated the potential for live streaming and video sharing platforms to be abused by perpetrators to amplify their extremist messages and actions in the immediate aftermath of such incidents.
The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (the Act) came into effect on 6 April 2019. The Act created new offences in the Criminal Code Act 1995 (Cth), aimed at reducing the incidence of online platforms being misused by perpetrators of violence. This fact sheet is not a substitute for legal advice; anyone potentially affected by the Act should seek independent advice.
Overview - key points
- The new offences created by the Act are:
- Failure to report—an offence for internet service providers, content service providers and hosting service providers that fail to notify the Australian Federal Police (AFP) within a reasonable time about abhorrent violent material recording or streaming abhorrent violent conduct that has occurred, or is occurring, in Australia. A defence is available if the service provider can show that they had reasonable grounds to believe the AFP was already aware of the details of the material.
- Failure to remove—offences for content service providers and hosting service providers that fail to expeditiously remove, or cease hosting, abhorrent violent material that is reasonably capable of being accessed within Australia. The Act provides a range of defences to this offence, for example where the material relates to law enforcement purposes, news and current affairs reporting, public policy advocacy or research.
- The offences target providers who are aware of abhorrent violent material that can be accessed through their service and fail to act—they do not criminalise ignorance.
- Failure to notify attracts a fine of up to 800 penalty units. Failure to remove attracts:
- for an individual, a fine of up to 10,000 penalty units and up to 3 years’ imprisonment, or
- for a corporation, a fine of up to 50,000 penalty units or 10% of the annual turnover of the company, whichever is greater.
- The Act also gives the eSafety Commissioner a new power to issue a notice stating that, at the time the notice is issued, a content or hosting service is providing access to or hosting abhorrent violent material.
- Where abhorrent violent conduct occurs in Australia, providers may have to both notify the AFP and remove the material from their services.
- Abhorrent violent material can be in the form of video, still images or audio.
- The Act does not affect any existing reporting or content removal requirements.
What is abhorrent violent material?
Abhorrent violent material is limited to very specific categories of the most egregious, violent audio, visual or audio-visual material produced by a perpetrator or their accomplice. The definition includes video, still images (including a series of still images) and audio recordings.
It must stream or record conduct where a person engages in a terrorist act (involving serious physical harm or death of another person), murders or attempts to murder another person, tortures another person, rapes another person or kidnaps another person (where the kidnapping involves violence or the threat of violence). This conduct is referred to as ‘abhorrent violent conduct’. The definition does not include material recording animated, re-enacted or fictionalised conduct.
What are providers’ notification requirements under the Act?
Internet service providers, hosting service providers and content service providers must notify the AFP if they are aware that their service can be used to access particular abhorrent violent material that depicts abhorrent violent conduct that has occurred, or is occurring in Australia. This applies to all providers, whether they are based in Australia or overseas.
To be liable for the offence of failing to notify the AFP of abhorrent violent material on their service, a provider must have:
- been aware that their service can be used to access particular material
- had reasonable grounds to believe that the material was abhorrent violent material, and
- had reasonable grounds to believe that the relevant conduct was occurring (or had occurred) in Australia.
The offence does not capture ignorance or negligence, and will not apply where a provider is genuinely unaware of particular material being accessible on their platform. However, it does require action when there is a level of awareness of abhorrent violent material being available (for example, if providers receive user complaints about the online material).
This offence does not apply if the provider has no reason to believe that the abhorrent violent conduct took place in Australia.
What is “a reasonable time” to notify the AFP?
The provider must notify the AFP within a reasonable time of becoming aware that their service can be used to access a particular piece of abhorrent violent material depicting abhorrent violent conduct that has occurred or is occurring in Australia. The term ‘reasonable time’ is not defined in the Act.
Whether a person has notified the AFP within a reasonable time will depend on the unique circumstances of each case. A range of factors will contribute to an overall determination of "reasonableness", for example the type and volume of the material, any complaints received about the material, and the capabilities and resources available to the provider.
What if the AFP is already aware of the material?
A provider does not need to notify the AFP if they reasonably believe that the details of the material are already known to the AFP (for example, if there has already been widespread media reporting about particular material, or if the provider has already referred the material to the National Center for Missing and Exploited Children or to Interpol).
How do I notify the AFP of online abhorrent violent material?
If a provider becomes aware of abhorrent violent material accessible through their service, they are able to notify the AFP in the following ways:
For providers:
- Email details to NOSSC@afp.gov.au, or
- Call the National Operations State Service Centre (NOSSC) on +61 2 5127 0001.
For the public:
- If you are concerned about the immediate safety of yourself or another person, or a crime is in progress, call 000.
- If the platform provides mechanisms for raising concerns about content, members of the public should use those mechanisms in the first instance.
- You can report the content to the eSafety Commissioner.
- You can report Commonwealth offences to the AFP via their online form.
When do providers need to remove material from their platforms?
Hosting service providers and content service providers must expeditiously cease hosting or remove abhorrent violent material that is reasonably capable of being accessed within Australia. This applies to all providers, whether they are based in Australia or overseas, and applies regardless of where the depicted abhorrent violent conduct occurred.
The requirement to remove or cease hosting material requires the provider to ensure that the material is no longer accessible to end users in Australia.
This obligation does not apply to internet service providers.
A provider will only be liable for failing to expeditiously remove material where the provider:
- was aware of a substantial risk that:
  - their platform was hosting, or could be used to access, specific material, and
  - the specific material was abhorrent violent material
- having regard to the circumstances known to the provider, it was unjustifiable for the provider to take both of those risks, and
- intentionally did not remove the abhorrent violent material expeditiously.
Recklessness, as defined in section 5.4 of the Criminal Code, is the fault element for two elements of the offences. These elements are:
- whether the material is hosted on, or accessible through (as relevant), the service, and
- whether the material is abhorrent violent material.
There may be multiple companies that are hosting and/or content service providers in relation to a given piece of content. All relevant providers may be subject to the offences if they do not expeditiously remove abhorrent violent material from their services. However, where the eSafety Commissioner has not issued a notice, recklessness will be easier to establish for providers with a closer link to the content (for example, a provider who administers a website) than for providers whose services are more remotely related to the content (for example, providers of business-to-business infrastructure and cloud services).
Are providers required to monitor all content on their platform to make themselves “aware” of abhorrent violent material?
The Act does not require providers to take steps to make themselves aware of abhorrent violent material accessible on their platforms and does not require that providers monitor all content on their platforms.
There are a number of ways a provider could become “aware” of a risk that their platform can be used to access particular abhorrent violent material. For example, platforms may become aware of material through a notice from the eSafety Commissioner (further details about this process are below), media reporting on particular content or through user complaints.
The offence does not apply where a platform is merely ignorant of, or negligently unaware of, particular material accessible through its service.
What is considered “expeditious removal” of material?
There is no specific timeframe that dictates what constitutes "expeditious removal" of abhorrent violent material. As with the requirement to notify the AFP within a "reasonable time", a range of factors will contribute to an overall determination of whether the removal was expeditious, for example the type and volume of the material, any complaints received about the material, and the capabilities and resources available to the provider.
If material is removed, do platforms still have to notify the AFP?
Yes, if the material depicted abhorrent violent conduct that occurred or is occurring in Australia.
A platform must notify the AFP where they are aware that their service can be used to access particular abhorrent violent material that depicts abhorrent violent conduct occurring in Australia (see above). This means that where the abhorrent violent material depicts conduct that is occurring or has occurred in Australia, a platform may need to both notify the AFP about the material and remove it from their platform.
Platforms do not need to notify the AFP in circumstances where the provider reasonably believes the AFP is already aware of the material.
The Australian eSafety Commissioner
The Australian eSafety Commissioner’s role is to promote online safety for all Australians. The eSafety Commissioner coordinates and leads the online safety efforts of the government, industry and the not-for-profit community. The Commissioner has a broad remit which includes receiving and investigating reports about, and taking action to address, serious cyberbullying targeting children; illegal and harmful online content; image-based abuse; and, from January 2022, serious cyber abuse targeting adults.
How does the eSafety Commissioner’s notice work?
Under the Act, the eSafety Commissioner may issue a notice formally advising a content service or hosting service provider that their service is providing access to or hosting specified abhorrent violent material.
A notice issued by the eSafety Commissioner does not mean a failure to remove offence has been committed. Rather, it creates a rebuttable presumption in relation to a future prosecution that the service was reckless as to whether the material specified in the notice:
- was abhorrent violent material, and
- could be accessed using their service.
If a prosecution occurs, the service can rebut this presumption by pointing to evidence that suggests a reasonable possibility that it was not reckless as to whether the material specified in the notice:
- was abhorrent violent material, or
- could be accessed using the service.
Alternatively, removing the specified material expeditiously after the service becomes aware of it will also ensure no offence is committed.
It is ultimately a matter for the eSafety Commissioner to determine how, when and to whom notices are issued in accordance with the Act.
The Government expects all relevant services to cooperate with the eSafety Commissioner, and to advise the eSafety Commissioner early where they anticipate difficulties removing the specified material.
Will the removal offence prevent journalists and news sites showing footage as part of reporting?
No. A range of defences are available in relation to the removal offence, including where the abhorrent violent material relates to a news or current affairs report that is in the public interest and is made by a person working in a professional capacity as a journalist, where the accessibility of the material is for lawful advocacy purposes, or where the accessibility of the material relates to research or artistic works created in good faith.
In all circumstances, the Australian Attorney-General’s written consent will be required before a prosecution for the removal offence can commence. This requirement is a further guard against any inappropriate prosecutions.
Are there other legitimate reasons not to remove this material?
Defences are available to ensure that providers are not liable for failing to remove abhorrent violent material where maintaining access to that material is necessary for law enforcement purposes, court processes or for scientific, medical, academic or historical research (where the accessibility of the materials is reasonable for the purposes of conducting that research).
Will bystanders who capture footage of events (e.g. terrorist attacks) be penalised by the offences?
No. The offences only apply to internet service providers, content service providers and hosting service providers, not to individuals who film events. Further, the offences only apply to footage of abhorrent violent conduct filmed by the perpetrators or their accomplices; footage captured by innocent bystanders is not captured by the offences.
Does the Act require companies to breach US or other foreign laws?
While there is no explicit exemption from the offences due to conflicting requirements under foreign laws, the Act does not require foreign companies to act in contravention of foreign laws.
As an additional safeguard against inappropriate prosecutions, the Australian Attorney‑General’s written consent would be required before commencing any prosecutions for the failure to remove offence.
For the notification offences, the Attorney‑General’s consent is required if the failure to notify occurred wholly outside Australia and the offender is neither an Australian citizen nor a body corporate incorporated under Australian law.
Does the Act impact on other reporting or removal requirements?
No. The Act does not displace or modify any existing requirements that affect service providers. However, in some circumstances material may be subject to multiple laws. For example, some abhorrent violent material may also be potential prohibited content under Schedule 7 to the Broadcasting Services Act 1992, and subject to the removal scheme in that Schedule. From 23 January 2022, the Schedule 7 powers will be replaced by the new Online Content Scheme under the Online Safety Act 2021. Where the eSafety Commissioner has used one notification mechanism, it should not be assumed that the material falls outside another legislative scheme simply because the eSafety Commissioner has not also issued a notice under that scheme.
What are the penalties?
The penalty for committing these offences will be decided by the courts, in line with the general sentencing principles outlined in Part IB of the Crimes Act 1914 (Cth). The penalties in the Act are the maximum possible penalties for the offences, and will only be appropriate in the most severe circumstances, for example repeat contraventions.
The failure to notify offence attracts a fine of up to 800 penalty units (currently up to $177,600 for individuals or up to $888,000 for corporations).
The failure to remove offence attracts:
- for an individual, a fine of up to 10,000 penalty units (currently $2.22 million) and up to 3 years’ imprisonment, or
- for a corporation, a fine of up to 50,000 penalty units (currently $11.1 million) or 10% of the annual turnover of the company, whichever is greater.
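The dollar amounts above follow from the value of a Commonwealth penalty unit, which is indexed and changes over time; the figures quoted in this fact sheet appear to assume a value of $222 per penalty unit, with the standard corporate multiplier of five applied to the notification offence:
- failure to notify: 800 penalty units × $222 = $177,600 for an individual, multiplied by five to give $888,000 for a corporation
- failure to remove (individual): 10,000 penalty units × $222 = $2.22 million
- failure to remove (corporation): 50,000 penalty units × $222 = $11.1 million, or 10% of annual turnover, whichever is greater.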
More questions?
Contact the Attorney-General’s Department on:
- Telephone: 02 6141 6666
- Email: CriminalLaw@ag.gov.au
- Complete our online form
Disclaimer
Please note that this fact sheet is limited to the operation of the Act; obligations under other schemes are not considered. However, the operation of the Act does not limit the continued operation of other schemes, including the Privacy Act 1988 and, from January 2022, the Online Safety Act 2021.
This fact sheet is guidance only and should not be construed as legal advice.