Endgame for negligence: what game companies need to know about fast-approaching trust and safety regulations

At GamesBeat Summit 2023, trust and safety issues, especially for diverse player populations, were top of mind, and nailing it was the focus of the panel, “How to be confident and safe right before you’re forced to.”

“The games industry has come of age,” said moderator Hank Howie, games industry evangelist at Modulate. “We are no longer an ancillary form of entertainment – we are the 800-pound gorilla of entertainment. It’s time to fully assume the mantle of leadership in trust and safety, at the CEO level of every company. Doing anything less risks putting your company in financial jeopardy, as well as in a position of moral bankruptcy.”

He was joined by leaders from Take This, a nonprofit mental health advocacy organization; Windwalk, which focuses on building online communities; and web3 law firm Gamma Law, to discuss the state of trust and safety, regulatory changes affecting game companies, and what developers can do now to put guardrails in their communities.

Here’s a look at the highlights of the discussion, and don’t miss the full panel, available free on demand here.

A small but vocal faction

“Frankly, it’s really, really hard to moderate a third-party platform, especially a pseudo-anonymous one,” said Richard Warren, a partner at Windwalk. “What’s working really well is self-moderation, but also establishing a culture.”

Being intentional with your moderation programs and setting a standard of behavior, especially among die-hard fans, is what sets the tone for any close-knit community.

But the challenge, said Eve Crevoshay, executive director of Take This, is that while we know how to create good spaces, some unsavory norms, behaviors and ideologies have become incredibly common in these spaces. It’s a small but very loud problem, and that volume means the behavior has become normalized.

“When I say toxic, I am specifically referring to misogynistic, white supremacist, neo-Nazi and other xenophobic language, along with bullying and mean behavior,” she said. “We haven’t seen a space yet where those things are really actively banned or actively kicked out of a community. We’re identifying solutions for how to address that, but right now, we’re seeing really high incidence.”

It’s alienating not only players who feel uncomfortable in those spaces, but also industry professionals who don’t feel safe in their own game community. And there is evidence that children in these spaces are learning toxic behaviors, because the environment is so full of it, she added.

“Every white youth in the U.S. is on an explicit path to radicalization unless they’re steered away from it,” she said. “And so I want to be very clear: it’s not just games. We have solutions, but we have to use them. We have to implement them. We have to think about this. And that’s why we do the work that we do, and that’s why we’re getting regulatory attention.”

What you need to know about upcoming legislation

In April, the EU’s Digital Services Act went into effect, and California’s Age-Appropriate Design Code Act, passed in September, takes effect on July 1, 2023. It’s crucial that developers take note, because other states won’t be far behind.

“I think the regulatory landscape, not just in California but at the federal level in the U.S., is heating up substantially,” Crevoshay said. “We have been speaking with the Senate Judiciary Committee and with Representative Lori Trahan of Massachusetts. Everyone is barking up this tree, not just for the protection of children, but for the broader issue of extremist behavior in online spaces.”

Both the EU and California laws introduce new restrictions and privacy rules around data collection, targeted advertising and dark patterns, meaning that a company cannot take any action it knows, or has reason to know, is “materially detrimental” to the physical health, mental health or well-being of a child. They also regulate the type of content that appears on a platform.

“As gaming platforms, we must not only follow these procedures regarding information collection and so on, but we must also take steps to protect children from harmful content and contacts,” said David Hoppe, managing partner at Gamma Law.

But it’s unclear exactly how that will translate to the real world, and what guardrails game companies will need to put in place, he added. The U.K.’s Online Safety Bill is also likely to pass over the summer; it asks platforms to put measures in place to protect users from illegal content, and to let adults choose what kinds of content they want to see. Non-compliance will bring substantial fines: the California law, for example, starts at $2,500 per child.

What game companies can do now

The unfortunate fact is that it’s never been easier to start a community, and unofficial third-party communities are flourishing. That’s what you want, of course, said Warren. But it’s also a curse, because moderating these communities is completely untenable.

“All you can really do as a game company is understand the culture you want to establish around your player base,” he said. “You want to design a game that reinforces that culture and doesn’t lead to the negative events where users get very, very angry with each other, and try to reduce the kind of hateful content or hateful discussion points that users create in the game and contribute to the community.”

A culture of moderation, whether human or AI, is essential to the task of creating safe spaces, Crevoshay added, as are consequences for harmful behavior.

“You need both a carrot and a stick,” she said. “Good design is very useful, both in a community and in the game itself, for increasing prosocial behavior and fostering shared positive norms and aspirational ideas. But if you don’t also have the stick, it can easily become a problem space.”

“The days of anything goes and turning a blind eye, that’s not going to work anymore, even in the United States, and certainly not in Europe,” Hoppe said. “Take a territorial approach first and assess, based on the budget you can allocate at this stage, where those funds should be spent. The California law actually lays out very precisely what steps you need to take in terms of assessing the current situation and determining where you need to focus.”

There are also game design tools available today that help developers create safe spaces. The Fair Play Alliance offers the Disruption and Harms in Online Gaming Framework, a detailed and comprehensive catalog of what we know about problematic in-game behavior today, with the goal of empowering the games industry with the knowledge and tools to support player well-being and foster healthier, more welcoming gaming spaces around the world.

“If you build from the ground up with the intention of creating spaces that are more welcoming to everyone, it’s really possible to do that,” Crevoshay said. “It just has to be integrated early in the process into the design of spaces.”

And regulations aside, “you can do it simply because it’s the right thing to do,” Howie said.

Don’t miss the full discussion: Watch the full session here.

