
We’ve Identified the Kids… Now What?
In the world of protecting kids and teens online, so much discussion today is focused on how to best identify underage users. But even assuming you’ve identified them properly, there remains a critical and often overlooked question: what do you do next?
A frequent refrain in the world of kids’ online privacy and safety is that “self-declaration” – i.e., asking a user directly to volunteer their age information – is generally ineffective, due to the ease with which a child or teen can lie about their age to access content or features that are not age-appropriate.1 A whole host of technological “age assurance” solutions have emerged in recent years to combat the problem of kids lying about their age.
But let’s assume for the sake of this post that you’ve solved the age assurance problem. You now know with a reasonable degree of confidence who your adults are, who your teens are, and who your kids are. OK… now what?
Broadly speaking, we’re seeing companies respond to the existence of kids on their platforms in a few ways, some better than others. We’ll go through each option in turn.
Option 1: Do Nothing
This is obviously the riskiest approach. To use the US as one example: even if a service is NOT considered “child-directed” under COPPA, the law still applies if an operator learns (through self-declaration, age assurance, or some other method) that a specific user on their otherwise adult-oriented service is a child.2 The FTC enforced against Yelp in 2014 on this basis: even though the FTC acknowledged that Yelp’s app was not intended for, or even especially appealing to, children, it alleged that Yelp had actual knowledge that some users were underage and failed to act on it: “People who registered on the app were asked for a date of birth, but regardless of what they entered, the Yelp app allowed them to sign up and gave them full access to all features.” The Yelp settlement is almost a decade old now, but its lesson is still crystal clear: failing to act on actual knowledge of children on your platform is a legal violation.
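To make the Yelp finding concrete, here is a minimal, hypothetical TypeScript sketch. The types and function names are our own illustration – not Yelp’s code and not a complete COPPA program. The point is that the problem was never asking for a birthdate; it was ignoring the answer. Once the declared age indicates a child, the signup path has to branch instead of granting full access.

```typescript
// Hypothetical sketch (our illustration, not Yelp's code or a full COPPA
// program): once a declared birthdate indicates a user is under 13, the
// operator has actual knowledge and the flow must branch accordingly.

type SignupResult =
  | { kind: "full_access" }
  | { kind: "parental_consent_required" }; // route into a COPPA notice-and-consent flow

const COPPA_AGE_THRESHOLD = 13; // assumption: US-only example

function ageFromBirthdate(birthdate: Date, now: Date = new Date()): number {
  const yearDiff = now.getFullYear() - birthdate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthdate.getMonth() ||
    (now.getMonth() === birthdate.getMonth() && now.getDate() >= birthdate.getDate());
  return hadBirthdayThisYear ? yearDiff : yearDiff - 1;
}

// The FTC's complaint described a flow that collected a birthdate and then
// ignored it. A defensible flow acts on what it learned:
function handleSignup(declaredBirthdate: Date): SignupResult {
  if (ageFromBirthdate(declaredBirthdate) < COPPA_AGE_THRESHOLD) {
    return { kind: "parental_consent_required" };
  }
  return { kind: "full_access" };
}
```

This only illustrates acting on actual knowledge gained from self-declaration; what the under-13 branch should actually do is the subject of the remaining options below.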
Option 2: Kick Out Anyone Underage
Companies that take this approach often argue that providing an age-appropriate experience would be too expensive or cumbersome, given that underage users are not their “intended audience.” Putting aside that in many industries (e.g., video games and social media) kicking out underage users is likely to be bad business – it alienates what might be a service’s most engaged and vocal demographic, and it violates a number of platform policies3 – this strategy is also legally risky in many cases. For example:
- In the US, the FTC has made clear that “If your [site or service] is “mixed audience” … you may not block children from participating altogether.” (emphasis added).4 A “mixed audience” game under COPPA is one where the game’s actual audience consists of a substantial number of children, even if the intended audience is only teens or adults.5
- Meanwhile, under Article 31 of the United Nations Convention on the Rights of the Child (UNCRC), which has been ratified by 195 countries (including every EU member state), children have a right to “play and engage in recreational activities appropriate to their age.” The UK Information Commissioner’s Office has stressed that the Age-Appropriate Design Code reflects this right: the Code is intended “to protect children within the digital world, not protect them from it.” In a similar vein to COPPA, Ofcom has added that services “likely to be accessed by children” are covered by the Online Safety Act even if children are not their intended audience, and “[s]ervices cannot decline to take steps to protect children merely because it is too expensive or inconvenient.”
Of course, there is nuance here: there are obviously some experiences that children can and should be blocked from participating in. (No one is saying that children need to be able to access things like dating services, for example!) That said, the bar for “likely to appeal to children” may be lower than many companies expect. Looking at some of the FTC’s past enforcement actions is instructive:
- TikTok (then called Musical.ly) is a social network that many adults use today, including to share content of an adult nature. However, the FTC alleged the app to be “child-directed” under COPPA because it included the ability to lip sync to Disney songs, and because allegedly kid-friendly celebrities like “Katy Perry, Selena Gomez, Ariana Grande, [and] Meghan Trainor” had active Musical.ly accounts.
- Epic Games’ Fortnite was explicitly rated “T for Teen” by the ESRB (PEGI 12 in the UK) and is a game where players (among other things) use weapons like rocket launchers and sniper rifles to shoot each other in a “battle royale” format. The FTC nonetheless treated it as child-directed under COPPA, citing (among other things) the bloodless nature of the combat and kid-friendly promotional tie-ins like Nerf guns and children’s Halloween costumes.
As part of each of these settlements, the companies involved had to comply with COPPA’s parental notice and consent requirements – they couldn’t simply kick out the kids and call it a day.
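For a sense of what that notice-and-consent obligation looks like in practice, here is a hedged, hypothetical TypeScript sketch of a verifiable parental consent (VPC) flow. The state and function names are our own; the consent methods listed are examples of ones the COPPA Rule recognizes (16 CFR 312.5(b)), but any real implementation should be checked against current FTC guidance.

```typescript
// Hypothetical sketch of a COPPA-style verifiable parental consent flow.
// State names are illustrative; the consent methods are examples of ones
// the COPPA Rule recognizes.

type ConsentMethod =
  | "signed_consent_form"        // mailed, faxed, or scanned signed form
  | "payment_card_transaction"   // monetary transaction through a payment system
  | "knowledge_based_questions"  // knowledge-based authentication of the parent
  | "government_id_check";       // ID checked against a verified database

type ConsentState =
  | { stage: "awaiting_parent_contact" }                // child detected; parent not yet reached
  | { stage: "notice_sent"; parentEmail: string }       // direct notice of data practices sent to the parent
  | { stage: "consent_granted"; method: ConsentMethod } // verifiable consent obtained; enable gated features
  | { stage: "consent_declined" };                      // parent declined; delete the child's personal data

function onParentDecision(
  state: ConsentState,
  granted: boolean,
  method: ConsentMethod
): ConsentState {
  if (state.stage !== "notice_sent") return state; // only a pending notice can be resolved
  return granted
    ? { stage: "consent_granted", method }
    : { stage: "consent_declined" };
}
```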

Option 3: Provide an Age-Appropriate Experience
This is what we believe is the best approach: dynamically tailoring the online experience based on each child’s age and jurisdiction. A number of video game companies already do this to some degree.
Even for companies taking this approach, though, there are nuances and questions: for example, how many jurisdictions and local laws does your internal compliance logic support? How do you adapt when laws change? Do you simply provide a “child-lite” experience where anything risky for children is turned off, or do you actually loop in the parent and get their consent to individual game features? What happens when a user has a birthday and is no longer considered a child in their jurisdiction?
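As a concrete illustration of what such compliance logic might look like, here is a minimal TypeScript sketch. The country codes, age thresholds, and feature names are invented for illustration and are not legal guidance; a production system would source its rules from a maintained policy engine rather than a hard-coded table.

```typescript
// Hypothetical sketch of jurisdiction-aware, age-based feature gating.
// Rules, thresholds, and feature names are illustrative assumptions only.

type Feature = "text_chat" | "voice_chat" | "targeted_ads";

interface JurisdictionRules {
  ageOfDigitalConsent: number;                 // e.g., 13 under COPPA; 13-16 under GDPR Art. 8
  featuresDisabledForChildren: Feature[];      // simply switched off below the threshold
  featuresRequiringParentalConsent: Feature[]; // available to children only with verified parental consent
}

// Illustrative table only; a real system tracks many more jurisdictions
// and needs a process for updating entries as laws change.
const RULES: Record<string, JurisdictionRules> = {
  US: {
    ageOfDigitalConsent: 13,
    featuresDisabledForChildren: ["targeted_ads"],
    featuresRequiringParentalConsent: ["text_chat", "voice_chat"],
  },
  DE: {
    ageOfDigitalConsent: 16,
    featuresDisabledForChildren: ["targeted_ads", "voice_chat"],
    featuresRequiringParentalConsent: ["text_chat"],
  },
};

interface User {
  birthdate: Date;
  countryCode: string;            // ISO 3166-1 alpha-2
  parentalConsents: Set<Feature>; // features a verified parent has approved
}

function currentAge(birthdate: Date, now: Date): number {
  const age = now.getFullYear() - birthdate.getFullYear();
  const beforeBirthday =
    now.getMonth() < birthdate.getMonth() ||
    (now.getMonth() === birthdate.getMonth() && now.getDate() < birthdate.getDate());
  return beforeBirthday ? age - 1 : age;
}

function isFeatureEnabled(user: User, feature: Feature, now: Date = new Date()): boolean {
  const rules = RULES[user.countryCode];
  if (!rules) return false; // assumption: fail closed for unknown jurisdictions
  // Age is evaluated at request time, so a birthday that crosses the
  // threshold changes the outcome without manual intervention.
  if (currentAge(user.birthdate, now) >= rules.ageOfDigitalConsent) return true;
  if (rules.featuresDisabledForChildren.includes(feature)) return false;
  if (rules.featuresRequiringParentalConsent.includes(feature)) {
    return user.parentalConsents.has(feature);
  }
  return true;
}
```

Two of the questions above show up directly in the shape of this sketch: the split between featuresDisabledForChildren and featuresRequiringParentalConsent is the “child-lite versus parental consent” decision, and because age is computed at request time, a birthday that crosses the jurisdiction’s threshold unlocks the adult experience without any manual migration.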
Answering these questions has historically required bespoke, expensive technological solutions, not to mention hundreds of thousands of dollars in legal fees to stay current on a constantly expanding patchwork of local and international rules around kids’ data. However, third-party solutions are starting to emerge that distill all this complexity into a simple format, making compliance accessible to as many companies and consumers as possible.
Conclusion
There’s no one-size-fits-all approach here, and every company’s situation and risk tolerance is different. Still, we at k-ID believe that the next generation of online services can do better than ignoring kids or treating them as undesirables to be removed. Instead, we think the best approach is to empower underage users by giving them a way to participate meaningfully in online worlds while still shielding them from riskier features as appropriate.
References:
1. See ICO Guidance on Age Assurance. Self-declaration is “based entirely on trust and can be easily circumvented and therefore doesn’t significantly mitigate risk. You should avoid using a self-declaration age assurance method as it is unlikely to be accurate and effective, if: there are significant risks to children from the data processing on your site” – i.e., if children are likely to access your service.
2. See FTC’s COPPA FAQ (even if an adult-focused service does not solicit users’ ages, if an “operator later determines that a particular user is a child under age 13, COPPA’s notice and parental consent requirements will be triggered.”).
3. Microsoft’s XR-013 policy for Xbox is that “sign in and sign up must support all user types, ages, and regions where the game title is offered and where those features are allowed by local/regional laws irrespective of age rating.” Meanwhile, the Apple Developer Guidelines say that “Apps may ask for birthdate and parental contact information only for the purpose of complying with these statutes, but must include some useful functionality or entertainment value regardless of a person’s age.”
4. COPPA FAQ D.6.
5. COPPA FAQ H.2. (“although you may intend to operate a ‘teen service,’ in reality, your site may attract a substantial number of children under 13, and thus may be considered a ‘Web site or online service directed to children’ under the Rule.”).