
Australia's New Online Safety Industry Codes: What Companies Need to Know
What Are the Phase 2 Industry Codes?
The Phase 2 Industry Codes are legally enforceable standards developed by industry associations under the Online Safety Act 2021. These codes specifically target "Class 1C" and "Class 2" material, which includes online pornography, high-impact violence, and content promoting self-harm or eating disorders.
Unlike the Phase 1 codes, which addressed "the worst of the worst" content such as child sexual abuse and pro-terror material and have been in force since 2023-2024, the Phase 2 codes focus on age-restricted content that is legal for adults but inappropriate for children.
The nine codes cover virtually every aspect of the online ecosystem:
- Internet carriage services (ICS)
- Hosting services (HOS)
- Search engine services (SES)
- Social media platforms (both core features and messaging)
- Adult websites and designated internet services
- App distribution platforms
- AI chatbots and companion services
- Equipment manufacturers and operating system providers
Implementation Timeline
The codes will be implemented in two tranches within Phase 2:
First Tranche (December 27, 2025):
- Internet carriage services
- Hosting services
- Search engine services
Second Tranche (March 9, 2026):
- Social media platforms
- Adult websites and designated internet services
- App stores and distribution services
- AI chatbots and companion services
- Equipment manufacturers
Extended Deadlines:
- Search engine age assurance: June 27, 2026
- App distribution age assurance: September 9, 2026
Which Companies Are Affected?
Adult Content Providers
Any website or service whose "sole or primary purpose is to enable end-users to access high impact materials" must implement age assurance measures. This includes pornography websites, adult entertainment platforms, and services hosting sexually explicit content.
Social Media Platforms
All social media services must prevent children from accessing adult content on their platforms. This runs parallel to Australia's separate under-16 “social media delay”, which is expected to take effect on December 10, 2025. Platforms that already prohibit such content must strengthen detection and removal systems, while those that allow it must implement age verification.
App Stores and Distribution Services
App stores must ensure 18+ rated applications are appropriately restricted and implement age assurance before allowing downloads of adult-rated content.
AI Services and Chatbots
Generative AI services and companion chatbots must prevent sexually explicit conversations with minors and implement appropriate safeguards.
Gaming and Entertainment Platforms
Online games rated 18+ and messaging services used for adult content distribution must implement age verification measures.
Required Age Assurance Methods
The codes are technology-neutral: companies may choose from multiple verification methods, provided the measures are "appropriate" to their risk profile. Australia's government-commissioned Age Assurance Technology Trial tested a range of methods and concluded that age assurance can be done privately, efficiently, and effectively when properly deployed. k-ID provides the following methods:
- Photo ID verification using government-issued identification
- Facial age estimation
- Digital banking credentials
- Credit card checks
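To make the technology-neutral requirement concrete, here is a minimal Python sketch of dispatching between methods, in which every checker returns only an over-18 boolean and nothing else. The function names, the 21-year buffer for facial estimation, and the dispatch shape are illustrative assumptions, not anything the codes or any vendor prescribe; a real deployment would call a third-party age assurance provider's API.

```python
from typing import Callable

# Hypothetical checkers: each returns only an over-18 boolean, nothing else.
def facial_estimation(estimated_age: float) -> bool:
    # Apply a buffer above 18 to absorb estimation error near the boundary.
    return estimated_age >= 21.0

def credit_card_check(card_authorised: bool) -> bool:
    # A successful authorisation of an adult-only card implies age >= 18.
    return card_authorised

METHODS: dict[str, Callable[..., bool]] = {
    "facial_estimation": facial_estimation,
    "credit_card": credit_card_check,
}

def assure_age(method: str, *args) -> bool:
    """Dispatch to the user's chosen method.

    Offering several entries in METHODS is one way to satisfy the
    requirement that users are not forced into a single method such
    as government ID upload.
    """
    if method not in METHODS:
        raise ValueError(f"unsupported method: {method}")
    return METHODS[method](*args)
```

Because only the boolean outcome leaves the checker, this shape also lines up with the data minimization expectations discussed in the next section.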
Privacy and Data Protection Requirements
Companies must balance age verification with strict privacy protections. The codes require:
- Compliance with Australian privacy laws
- Minimization of personal information collection
- Proportionate measures relative to safety objectives
- Options for users to minimize data handling
- Secure data destruction following verification
Notably, no Australian will be compelled to use government-issued identification, and platforms must offer reasonable alternatives to users.
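The minimization and destruction requirements can be illustrated with a hypothetical sketch: retain only a pseudonymised reference plus the outcome, and wipe the raw evidence once the check completes. The field names, the SHA-256 pseudonymisation, and the in-place wipe are assumptions made for illustration; they are not prescribed by the codes.

```python
import hashlib
import time

def record_verification(user_id: str, raw_evidence: bytearray,
                        over_18: bool) -> dict:
    """Persist only the outcome of an age check, never the evidence.

    raw_evidence (e.g. an ID scan) is zeroed in place before returning,
    a stand-in for the codes' secure-destruction requirement.
    """
    record = {
        # Pseudonymised reference, so the stored record alone does not
        # identify the user.
        "user_ref": hashlib.sha256(user_id.encode()).hexdigest(),
        "over_18": over_18,
        "verified_at": int(time.time()),
    }
    # Wipe the evidence buffer; real systems would also delete any
    # copies held by the verification provider.
    for i in range(len(raw_evidence)):
        raw_evidence[i] = 0
    return record
```

The design point is that the stored record answers only one question, "is this account holder over 18?", which is the proportionality the codes ask for.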
Enforcement and Penalties
The codes are legally enforceable with severe financial consequences for non-compliance:
- Civil penalties up to A$49.5 million per breach for corporate entities
- Civil penalties up to A$9.9 million for individuals
- Formal warnings as the first enforcement step
- Compliance directions from the eSafety Commissioner
- Public statements regarding non-compliance published on government websites
The A$49.5 million maximum is the A$9.9 million individual cap multiplied by five, the corporate multiplier under the Regulatory Powers (Standard Provisions) Act 2014. The eSafety Commissioner has already demonstrated willingness to use these powers, recently issuing formal warnings to AI services that create explicit content of children.
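The arithmetic behind those maximums can be checked directly, assuming the current A$330 Commonwealth penalty unit and a 30,000-unit maximum for this class of contravention (both figures are assumptions consistent with the amounts quoted above):

```python
PENALTY_UNIT_AUD = 330     # assumed Commonwealth penalty unit value
MAX_UNITS = 30_000         # assumed maximum units per contravention

individual_max = MAX_UNITS * PENALTY_UNIT_AUD   # A$9,900,000
corporate_max = 5 * individual_max              # A$49,500,000 (5x corporate multiplier)
```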
International Context and Market Impact
Australia is following similar regulatory trends globally, with the UK having implemented comparable age verification requirements in July 2025 under the UK Online Safety Act.
The codes also complement Australia's separate under-16 “social media delay”, creating a comprehensive regulatory framework for youth protection online.
Next Steps for Companies
Companies should immediately assess their compliance obligations by:
- Determining scope: Review whether your service falls under any of the nine code categories
- Risk assessment: Evaluate your content and user base to determine appropriate age assurance measures
- Technology selection: Research and test suitable age verification solutions that balance effectiveness with privacy
- Implementation planning: Develop deployment timelines to meet the December 2025/March 2026 deadlines
- Privacy compliance: Ensure age assurance methods align with Australian privacy law requirements
The eSafety Commissioner is developing detailed regulatory guidance to help companies understand their specific obligations under each code. Companies should monitor these developments closely and begin preparation immediately, as the significant penalties make compliance essential for continued operations in the Australian market.

