2024 in Review: Regulatory Landscape around Age Assurance
A look back at 2024 and the major changes and emerging trends in age assurance laws and regulations around the world, particularly concerning children in online environments.
2024 witnessed significant advances in the legal and regulatory frameworks surrounding age assurance, particularly where children in online environments are concerned. This article reviews the year's major developments and the trends emerging from them.
Self-declaration and Single Sign-On (SSO) under scrutiny as regulators push for stronger age assurance
While simple self-declaration (i.e., asking users to tick a checkbox stating "I am over 13") has long been the de facto market standard for age assurance, 2024 marked a major rethinking of the viability of this approach. The self-declaration “age gate” is increasingly being scrutinised — and in some cases, rejected — by regulators, particularly for high-risk services. Instead, regulators are pushing for more reliable forms of age assurance.
- In January, the UK’s Information Commissioner’s Office (ICO) updated its Opinion on age assurance for the Children’s Code and expressly stated that self-declaration on its own is not an appropriate age assurance method for high-risk services. The ICO introduced the term 'waterfall technique', which refers to combining self-declaration with other, more demonstrably effective age assurance methods.
- Also in January, the South Korean Personal Information Protection Commission (PIPC) audited 20 apps (focusing on games, videos and SNS apps) frequently used by children and adolescents, and found that while most apps had age verification procedures for minors under 14, the measures used to prevent children from falsely entering their age were "insufficient". In April, the PIPC then released its Guide to Application of Personal Information Protection Act for Overseas Businesses, which requires overseas businesses to confirm whether a user is a minor under 14 by: (1) asking users to enter their full legal date of birth; or (2) using one of the identity verification providers listed at identity.kisa.or.kr. These changes suggest that letting users self-check a box stating "14 years of age or older", a method enumerated in the Guidelines for Protection of Personal Information of Children and the Youth dated July 2022, is no longer recommended.
- In July, California's attorney general issued an enforcement action against mobile gaming company Tilting Point Media, inter alia, for its failure to provide a neutral and effective age screen in its game "SpongeBob: Krusty Cook-Off." The age gate, which asked users to select their birthday, defaulted to "1953," requiring users under 13 to scroll through more than 50 years to select an accurate birth year. The settlement also noted that players who identified as children were nonetheless prompted to share their personal information with third-party advertisers without parental consent.
- In August, the US Justice Department and FTC sued TikTok for alleged violations of COPPA and of a prior 2019 court order. The lawsuit claimed, inter alia, that TikTok improperly relied on third-party sign-in options like Instagram instead of implementing an independent age gate. According to the DOJ filing, TikTok internally labeled users who used third-party sign-in options as "age unknown" accounts, but did not obtain parental consent for them.
- In October, the Spanish Data Protection Authority (AEPD) published the guidance ‘Safe internet by default for children and the role of age verification’, in which it emphasised that simply asking users to state their age is not sufficient. The greater the risk posed by a service, the more companies are expected to take further steps to verify the accuracy of this information.
Facial age estimation gaining traction amidst privacy and accuracy concerns
As organisations seek more reliable age assurance methods, facial age estimation technology has emerged as a frequently-discussed solution in certain jurisdictions, though not without concerns about its accuracy and privacy implications.
- In September, the Spanish Data Protection Authority (AEPD) published a blog post examining "probabilistic methods" of age assurance that involve processing personal data (particularly facial age estimation). The AEPD opined that while exclusive reliance on facial age estimation is unlikely to be sufficient for high-risk processing, partial use of facial age estimation in conjunction with other methods could be helpful. The AEPD highlighted the significance of error thresholds in probabilistic methods: even a 0.01% error rate across a population of 450 million would affect 45,000 individuals, which is substantial.
- In line with the ‘waterfall technique’, facial age estimation can be combined with other verification methods to enhance accuracy. The UK ICO put forth the following use-case scenario: Facial age estimation is deployed with a 7-year age buffer. Users reported as over 25 pass without further checks. Those reported as under 25 must go through the secondary age verification step (e.g., credit card verification or ID check).
- In November, the Australian government appointed the Age Check Certification Scheme (ACCS) to conduct the Age Assurance Technology Trial. The trial aims to evaluate the effectiveness and privacy-protectiveness of available age assurance technologies to age-limit access to social media platforms for under 16s and prevent access to online pornography for under 18s. It is expected to conclude with a final report mid-next year, before the new social media ban goes into effect in November 2025.
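The ICO's buffer-based waterfall scenario above is, at its core, a simple decision flow. The following is a minimal sketch of that logic; the function name, the 18+ threshold, and the return values are illustrative assumptions, not part of any regulator's text:

```python
# Hypothetical sketch of the ICO 'waterfall' scenario: facial age
# estimation with a 7-year buffer. Names and thresholds are illustrative.
AGE_THRESHOLD = 18   # assumed minimum age for the service
BUFFER_YEARS = 7     # safety margin to absorb estimation error

def waterfall_check(estimated_age: float) -> str:
    """Return the next step for a user given their estimated age."""
    if estimated_age >= AGE_THRESHOLD + BUFFER_YEARS:
        # Estimate is comfortably above the threshold: no further checks.
        return "pass"
    # Estimate falls inside the buffer zone: escalate to a stronger
    # method such as credit card verification or an ID check.
    return "secondary_verification"

print(waterfall_check(31))  # pass
print(waterfall_check(22))  # secondary_verification
```

The buffer reflects the probabilistic nature of the estimate: only users whose estimated age clears the threshold by a wide margin skip the more intrusive second step.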
The potential processing of sensitive biometric data in facial age estimation has triggered specific concerns, which are amplified in certain jurisdictions.
- In March, the US FTC declined to approve a facial age estimation technology as a pre-approved solution for obtaining parental consent under the Children’s Online Privacy Protection Rule (COPPA). The proposed mechanism would have analysed the geometry of a user’s face to confirm that they are an adult. The FTC’s rejection was without prejudice: the Commission stressed that it was not taking a position on the merits of the application, but rather waiting for better data on the accuracy of the technology, such as the results of a review by the National Institute of Standards and Technology (NIST). Still, the rejection was likely motivated in part by the number of comments the FTC received from consumers concerned about the privacy and security implications of processing face data.
- Likewise, Utah's attempt to impose age verification requirements was halted by a federal judge this September, on the basis that the law violated the First Amendment's protections for free speech. The US Supreme Court is set to weigh in on the constitutionality of age verification laws in Free Speech Coalition v. Paxton this January.
- The UK ICO has emphasised that age assurance technologies are still required to comply with general data processing principles, and expressed specific concern about age assurance methods that collect unnecessary or intrusive amounts of data (in particular biometric and other forms of sensitive data).
Additionally, legal developments suggest a more nuanced understanding of age assurance may be emerging.
- In March, Singapore’s Personal Data Protection Commission (PDPC) published the Advisory Guidelines on the PDPA for Children's Personal Data in the Digital Environment, in which it acknowledged that age assurance does not necessarily need to occur at the account registration stage and may be carried out at appropriate juncture(s). For example, if there is an online game that children can play without signing up for an account, and the organisation wants to encourage users to take breaks from extended gameplay, it can use age estimation methods to estimate the user's age or age range; if the user is likely to be a child, the game can then remind them to take a break.
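The PDPC's example amounts to running an age check mid-session rather than at sign-up. A minimal sketch of that flow follows; the function name, the age cutoff, and the 60-minute trigger are all assumptions for illustration, not figures from the Guidelines:

```python
# Illustrative sketch of the PDPC example: age estimation applied
# mid-session to trigger a break reminder. All thresholds are assumed.
from typing import Optional

CHILD_AGE_CUTOFF = 18      # assumed age below which a user is treated as a child
BREAK_AFTER_MINUTES = 60   # assumed gameplay duration that triggers a reminder

def maybe_remind_break(estimated_age: float, minutes_played: int) -> Optional[str]:
    """Return a break reminder for likely-child users after extended play."""
    if estimated_age < CHILD_AGE_CUTOFF and minutes_played >= BREAK_AFTER_MINUTES:
        return "Time for a break!"
    # Adults, and children within the time limit, see no reminder.
    return None

print(maybe_remind_break(12, 90))  # Time for a break!
print(maybe_remind_break(35, 90))  # None
```

Because the check runs only when the reminder is actually needed, no age data has to be collected at registration, which is precisely the flexibility the PDPC's guidance allows.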
Increased cooperation across borders and between regulators
There has been increased coordination among regulatory bodies, both across borders and domestically.
- At the beginning of the year, the European Board for Digital Services began convening. Among its working groups, the DSA Task Force on Age Verification was set up with the objective of fostering cooperation with national authorities of EU member states on the implementation of best practices and standards in age verification. Key initiatives of the Task Force include (i) the development of Guidelines under Article 28 DSA, expected in Q2 2025; (ii) the public consultation on the proposed scope and approach of the Guidelines, which ran this summer; (iii) a call for tender on an age verification solution to establish a common approach at EU level; (iv) a multi-stakeholder workshop, which focused in particular on the DSA notion of “accessible to minors”; and (v) coordination with other regulatory groups, such as ERGA and the EDPB, to ensure consistency on the topic.
- In September, data protection regulators from the UK, Gibraltar, the Philippines, Canada, Argentina and Mexico issued a Joint Statement on a Common International Approach to Age Assurance, outlining their shared expectations. The Joint Statement posits that if there is a high data protection risk to users, an age gate using self-declaration is too easily circumvented. It stresses that age assurance is an important part of a broader toolkit available for protecting children online.
- In October, the European Data Protection Board (EDPB) published its Work Programme 2024-2025. The programme confirms that joint efforts by European data protection authorities are underway to deliver Guidelines on children’s data, on age verification criteria, and on the interplay between EU data protection law, the DSA, and the DMA.
- Between the end of October and the beginning of November, the Global Privacy Assembly (GPA) hosted its annual conference, coordinating the work of data protection and privacy commissioners around the globe. One of the key sessions featured teens sharing their perspectives on privacy, data protection, and the impact of mobile apps on their lives and mental health. The panel underscored ongoing global efforts to raise awareness about the unique vulnerabilities of children in the digital space and the need for stronger protections.
Concluding thoughts
As we look back on 2024, the landscape of age assurance has evolved significantly. Regulatory bodies worldwide have shown increased collaboration and sophistication in their approach, while technology providers have responded with innovative solutions. Although concerns have been raised over the privacy and security risks associated with age assurance, these risks can be adequately mitigated, such as through privacy-preserving and zero-data solutions that process all biometric information solely on the user’s device.
Looking ahead to 2025, we can expect more highly targeted regulations that require the implementation of effective age assurance methods (making self-declaration less and less common), as well as further refinements in both regulatory frameworks and technological solutions as the industry continues to mature and adapt to these important challenges.