At GamesBeat Summit 2023, trust and safety issues, especially for diverse gamer populations, were top of mind, and getting them right was the focus of the panel, “How to do trust and safety right before you’re forced to do so.”
“The game industry has come of age,” said moderator Hank Howie, game industry evangelist at Modulate. “We are no longer this ancillary form of entertainment — we are the 800-pound gorilla of entertainment. It’s time to fully take on the mantle of leadership in the arena of trust and safety, at the CEO level of every company. To do anything less risks putting your company in financial peril, in addition to being in a morally bankrupt position.”
He was joined by leaders from Take This, a mental health advocacy nonprofit; Windwalk, which focuses on building online communities; and web3 law firm Gamma Law, to discuss the state of trust and safety, regulatory changes bearing down on game companies, and what developers can do now to put guardrails in place for their communities.
Here’s a look at the highlights of the discussion — and don’t miss the full panel, available free on demand here.
A small but violent faction
“It is, frankly, really, really difficult to moderate a third-party platform, especially a pseudo-anonymous one,” said Richard Warren, partner at Windwalk. “What’s working really well is self-moderation, but also culture setting.”
Being intentional about your moderation programs and establishing a standard of behavior, especially among diehard fans, is what sets the tone of any tight-knit community.
But the challenge, said Eve Crevoshay, executive director at Take This, is that while we know how to create good spaces, some ugly norms, behaviors and ideologies have become incredibly common in these spaces. It’s a small but very loud problem — and that loudness means that the behavior has become normalized.
“When I say toxic, I mean specifically misogynist, white supremacist, neo-Nazi and other xenophobic language, along with harassment and mean behavior,” she said. “We haven’t yet seen a space where that stuff is actually actively prohibited or actively pushed out of a community. We are identifying solutions for how we address that, but right now, we see really high incidences.”
It’s driving away not only gamers who are uncomfortable in those spaces, but also industry professionals who don’t feel safe in their own game’s community. And there’s evidence that kids in these spaces are learning toxic behaviors, because the environment is so choked with it, she added.
“Every young white man, a boy in the U.S., is on an explicit path to radicalization unless they are taken off it,” she said. “And so I want to be really clear. It’s not just games. We do have solutions, but we have to use them. We have to implement them. We have to think about this. And that’s why we do the work that we do, and that’s why we’re getting regulatory attention.”
What you need to know about upcoming legislation
In April the EU Digital Safety Act came into effect, and California’s Age-Appropriate Design Code Act passed in September and will be effective July 1, 2023. It’s crucial for developers to take notice, because other states will not be far behind.
“I think the regulatory landscape not just in California, but at the federal level in the U.S. is heating up substantially,” Crevoshay said. “We’ve been speaking with the Senate Judiciary Committee, with Representative Lori Trahan from Massachusetts. They’re all barking up this tree around not just child protection, but around the larger issue of extremist behavior in online spaces.”
Both the EU and California laws introduce new privacy restrictions and rules around information gathering, targeted advertising and dark patterns, meaning a business cannot take any action it knows, or has reason to know, is “materially detrimental” to the physical health, mental health or well-being of a child. Second, they regulate the kind of content that appears on a platform.
“Not only are we as game platforms to follow these procedures with respect to information collection and so forth, but we also have to take steps to protect children from harmful content and contacts,” said David Hoppe, managing partner at Gamma Law.
But it’s not clear exactly how that will transfer to the real world, and what guardrails game companies will need to put in place, he added. The EU Digital Services Act, also likely to be passed over the summer, asks platforms to put measures in place to protect users from illegal content, and to let adults choose what types of content they want to see. Failure to comply will see companies hit with substantial fines; the California act, for instance, starts at $2,500 per child.
What game companies can do now
The simple fact is that it’s easy to start a community today, and unofficial, third-party communities are flourishing. That’s what you want, of course, Warren said. But it’s also a curse, in that moderating these communities is completely untenable.
“All that you can really do as a first party is understand the culture that we want to set around our player base,” he said. “We want to design a game that reinforces this culture and doesn’t lead to these negative occurrences where users can get really, really pissed off at each other — and try to reduce the kind of hateful content that people will make, or the hateful discussion points that users have in game and bring to the community.”
A culture of regulation and requirements for moderation, whether human or AI-driven, is essential to creating safe spaces, Crevoshay added, along with consequences for harmful behavior.
“You need a carrot and stick approach,” she said. “Good design goes a really long way, both in a community and in the game itself in increasing pro-social behavior, increasing shared positive norms and aspirational ideas. But if you don’t also have the stick, it can very easily devolve into a problematic space.”
“The days of anything goes and turning a blind eye, that is not going to fly even in the United States anymore, and certainly not in Europe,” Hoppe said. “First take a territorial approach, and evaluate, based on the budget that you’re able to allocate at this stage, where those funds should be spent. The California law actually lays out very precisely what steps you are to take in terms of evaluating the current situation and determining the points that need to be focused on.”
There are also game design tools currently available that help developers create safe spaces. The Fair Play Alliance offers the Disruption and Harms in Online Gaming Framework, a detailed and comprehensive catalogue of what we know about problematic in-game conduct today, with the goal of empowering the game industry with the knowledge and tools to support player well-being and foster healthier, more welcoming gaming spaces around the world.
“If you build from the ground up with the intention of creating spaces that are more welcoming to everyone, it’s really possible to do it,” Crevoshay said. “It just has to be baked in from the very beginning of the process in designing spaces.”
And despite the fact that there are regulations bearing down on developers, “you can do it just because it’s the right thing to do,” Howie said.
Don’t miss the full discussion — watch the entire session here.