Rollercoaster
On rollercoasters, early online gaming and regulations
1959
When Matterhorn Bobsleds first opened at Disneyland Park in California in 1959, it was the first tubular steel roller coaster in the world. It was faster and more agile than anything before it.
It was also, reportedly, the first in 150 years of roller coaster history to apply a minimum height requirement.
The concern was that the new speeds and sudden turns could put certain smaller riders – generally kids and teens – at risk of being injured or ejected from the car.
The idea that those kids and teens would be blocked from Disneyland entirely because of the roller coaster was, of course, absurd. The pragmatic option was simply to apply a height restriction and check it before riders boarded.
For everything else in the park, the usual rules applied: kids and teens could ride and play everything to their heart’s content (in the case of younger kids, with parental supervision).
EverQuest and Early Online Gaming
When EverQuest, one of the first commercially successful massively multiplayer online games, launched in 1999, it didn’t distinguish between players of different ages.
Every player had access to everything: the ability to create custom avatars and names, PvP, guilds, chat, player support (volunteers who helped out new players, chatting them through stages of the game) as well as customer support.
Keep this in mind as we’ll come back to this in a minute.
Early “Kids” Regulation
Early regulation designed to protect kids online was less focused on these specific features and the risks of using them. Instead, it focused on what was then considered the most frustrating element of the internet: pop-up and banner advertising.
It was everywhere, and as a result, the idea of kids being tracked across the internet and advertising proliferating throughout their browsing activity was of significant concern.
Of course, there were emerging businesses and entirely new commercial models arising from the dotcom boom, so lawmakers didn’t want to put a stop to that entirely. The result was the creation of new regulation that applied restrictions if a website was directed to or targeted at kids (“kids” meaning anyone under the age of 13, at least in the most impactful regulation of the time).
Regulation popped up in markets all throughout the world to restrict websites that were directed to kids: don’t profile, don’t advertise, tell parents what you’re doing, and in some cases set up a “hotline” to check that a parent is a real adult.
Complying with the regulation was challenging, expensive, time-consuming, and in many cases considered unnecessarily burdensome for websites that “weren’t really directed at under 13s”.
13+
Some popular online games start as indie projects, or with a small dedicated fanbase, and grow over the years into huge communities. In those early days, there is no budget for the compliance effort needed to manage regulations like the above.
Even as games grow – and even with games from larger studios – the cost of compliance relating to a game ‘directed at under 13s’ runs into the millions.
This expensive, time-consuming and confounding challenge is why the majority of online games today – in fact, why the majority of online services whether they be social, video, e-commerce, or otherwise – say that they are “only for 13 and over”.
The fundamental obligation that an online platform has, if it takes this approach, is to delete an account once it becomes known that the user is under the age of 13.
Did we teach kids to lie?
If everything on the internet requires someone to be at least 13, then no one is ever under 13.
Over the course of the last two decades, the online environment has conditioned everyone to know that they must be at least 13 if they want to sign up for anything.
Every time someone is asked for their age, or their date of birth, they know that the question is really asking whether they are at least 13 (yes, that has become more nuanced in recent years to account for regulation that uses different threshold ages).
The current generation of kids is the most digitally native yet: they are savvy online consumers who know exactly what this type of pop-up is for, and what the consequences of answering it honestly are.
Our greatest failure in building regulation for the internet is that we taught every kid they need to be invisible just to use it.
So kids don’t game online?
Back to EverQuest.
EverQuest was one of a number of hugely successful online games in the early days of higher bandwidth internet. It brought players together and gave them a sense of community. Communities are, by their very nature, social.
Against the backdrop of regulation that forced every website to declare whether or not it was directed at kids, it made little sense for any online game to declare that it was (and as history has shown, very few mass-market online platforms determined that under-13s were a significant audience). After all, the majority of these communities skewed a little older, or were at least more digitally mature – 11-year-olds who could grind quests far better than any 21-year-old.
For much of the Internet’s history, this is exactly how things have played out (a rough code sketch of this flow follows the list):
- most platforms say that they are for people 13 and older
- anyone under 13 who wants to use it often says that they're at least 13
- from this point, the platform assumes the user is at least 13
- this means that the child is "invisible" to the platform
- this means that the child is invisible to the parent (because the platform can't very well build a solution to connect a parent to a child if its position is that there are no kids)
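To make that flow concrete, here is a minimal sketch of the kind of sign-up age gate most “13+” platforms run, assuming a self-declared date of birth and a hard 13 cut-off. The names here (SignupRequest, handleSignup, handleKnownUnderage) are purely illustrative, not any real platform’s API.

```typescript
// Minimal sketch of a typical "13+" sign-up gate (illustrative names only).

interface SignupRequest {
  username: string;
  dateOfBirth: string; // self-declared, e.g. "2014-06-01"; never verified
}

const MINIMUM_AGE = 13;

// In-memory stand-in for a user database.
const accounts = new Map<string, SignupRequest>();

function ageInYears(dateOfBirth: string, now: Date = new Date()): number {
  const dob = new Date(dateOfBirth);
  const years = now.getFullYear() - dob.getFullYear();
  const birthdayPassed =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return birthdayPassed ? years : years - 1;
}

function handleSignup(request: SignupRequest): "created" | "rejected" {
  if (ageInYears(request.dateOfBirth) < MINIMUM_AGE) {
    // The honest child is simply turned away.
    return "rejected";
  }
  // From here on, the platform assumes the user is at least 13: every stored
  // account claims to be 13+, so the child who lied is invisible.
  accounts.set(request.username, request);
  return "created";
}

function handleKnownUnderage(username: string): void {
  // The core obligation under this model: delete the account once it becomes
  // known that the user is under 13.
  accounts.delete(username);
}
```

The only outcomes this gate can produce are “rejected” or “assumed 13+”, which is exactly why there is no child-shaped account for a parent to connect to.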
Next-gen regulation
With the heightened awareness of the dangers posed to kids and teens online today, we have seen an abundance of regulation introduced around the world to try to protect Gen Z, Gen Alpha and beyond.
Before we get into this new regulation, it is incredibly important that we concede that the laws designed two decades ago, as admirable as they seemed at the time, have in some ways endangered the very kids they sought to protect. They have done so by encouraging the development of an online world that disincentivises integrity. This also disadvantages platforms, apps and games: they cannot build effective systems, processes and tools to protect young users if all young users say they are not kids.
With the huge variety of regulations being announced, passed, and enforced in this space today, it is so important that we do not make the same mistake.
Much of the discourse today is consumed by trying to address the “gate” through which audiences enter an online world. You hear copious commentary on more robust forms of “age verification” and “age assurance”.
The Park Gate Problem
Everything we hear about the issue of kids online seems to be hit with the same hammer: more robust verification, stricter age assurance, making that gate impenetrable.
If we are not careful, we will not only fail to learn from the unintended side effects of the regulation introduced two decades ago, but perhaps even double down on them.
If the regulatory environment we create for today’s online world is one where it is simply too hard, too risky, and too expensive to accommodate kids, then every online platform will continue to do what it has always done: not allow kids. And every kid will continue to do what they have always done: find a way in. And today, a “kid” can be anyone from 7 to 21, depending on where you are in the world.
Let’s not kid ourselves that we can outsmart two generations who are more digitally savvy than we ever were at that age.
Matterhorn Bobsleds
Here’s the thing: kids love rollercoasters and all of the other rides in a park.
By their very nature, online games are designed to be fun. They’re designed to be an outlet for players to immerse themselves in another world: for a minute, an hour, or more.
Kids, and teens, want to play them.
In 1959 Matterhorn Bobsleds adopted a system that we would do well to learn from. It didn’t block kids at the gate. It adopted a height restriction at the ride.
The restriction didn’t focus on age, it focused on the potential danger.
The rules didn’t stop kids from having fun elsewhere in the park, they focused on the potential danger.
And when that kid who was a little too short grew and could finally ride, nothing in the park needed to change: the restriction had solely focused on the potential danger.
This is the approach we’re championing at k-ID: limiting restrictions to specific potential dangers so younger audiences can still enjoy the rest of the park, whether that’s a game, a social platform or an entertainment app. Let’s stop keeping kids out. Let’s let them into age-appropriate experiences so they can play, learn, and grow online.
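To make the “restriction at the ride, not at the gate” idea concrete, here is one way feature-level gating could look in code. This is a hypothetical sketch, not k-ID’s actual product or API; the feature names, age thresholds and consent rules are invented for illustration.

```typescript
// Hypothetical feature-level gating: the player keeps their account, and only
// individual features are restricted based on the specific risk they carry.

type Feature = "basicGameplay" | "userGeneratedContent" | "openTextChat" | "pvp";

interface FeaturePolicy {
  minimumAge: number;            // youngest age allowed to use the feature at all
  parentalConsentBelow?: number; // below this age, a parent must opt the child in
}

// Illustrative thresholds only; a real policy would come from regulation and
// the platform's own risk assessment.
const policies: Record<Feature, FeaturePolicy> = {
  basicGameplay:        { minimumAge: 0 },                           // everyone rides
  userGeneratedContent: { minimumAge: 9,  parentalConsentBelow: 13 },
  openTextChat:         { minimumAge: 13, parentalConsentBelow: 16 },
  pvp:                  { minimumAge: 13 },
};

interface Player {
  age: number;                // known to the platform, so protections can apply
  hasParentalConsent: boolean;
}

function canUse(player: Player, feature: Feature): boolean {
  const policy = policies[feature];
  if (player.age < policy.minimumAge) return false;
  if (policy.parentalConsentBelow !== undefined &&
      player.age < policy.parentalConsentBelow) {
    return player.hasParentalConsent;
  }
  return true;
}

// A ten-year-old still gets to play; they just don't get open text chat yet.
const child: Player = { age: 10, hasParentalConsent: true };
console.log(canUse(child, "basicGameplay"));        // true
console.log(canUse(child, "openTextChat"));         // false
console.log(canUse(child, "userGeneratedContent")); // true, because a parent opted in
```

The point of the sketch is the shape of the check: the age question only matters at the individual “ride”, and when the child grows into a feature, nothing else about their account needs to change.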