From Bloomberg News and iHeartRadio, it's The Big Take. I'm Wes Kasova. Today: confronting the rise of extremism in the world of online video games. One study found around four in ten Americans personally experience some sort of harassment online, and it's particularly bad for gamers of color. They say this is typical of what happens in the online gaming community.
They call it a racist, toxic environment, and they're dealing with it all while they're just trying to do the simplest of things: have a good time and play a video game. I listened in on a gaming session, and a Black gamer was told by someone he was playing with online, you need handcuffs and a knee on your neck. In December, the Anti-Defamation League released a study that found white supremacist ideas have gained significant exposure in the past year
through online video games. One in five adult game players in the ADL survey reported encountering other players who promoted the superiority of white people over people of other races, and the report also found fifteen percent of teen and preteen gamers said they interacted with
white supremacists. The ADL study and other recent research about extremists using video games to spread their ideology have led to calls for video game makers to do more to stop hate on their platforms, and it's sparked concerns that a rise in hateful ideology online will lead to greater acceptance of racism and extremism in the real world. These questions have even caught the attention of the U.S. government. Cecilia D'Anastasio covers the video game industry for Bloomberg,
and she joins me now from New York to explain. Cecilia, over the years, we've seen video games blamed for any number of social ills. Politicians have said violence in video games leads to school shootings and other violent crimes, even though there's not a lot of evidence that it's true. So now we're seeing a sharp increase in people saying gamers are spreading white supremacist ideology. Do we know if hate groups that spread their views in gaming communities are
having success in actually turning other players into white supremacists? Yeah, I mean, there's been such a long history of this with video games. There was the Satanic panic, even with Dungeons and Dragons, which is a pen-and-paper game, in the seventies. There's this idea that video games inspire people to commit violent acts in real life, which has been disproven over and over and over again in decades
of studies, including amazing longitudinal studies by amazing researchers. And now there's this idea that there are people who are getting radicalized through video games. And I don't think there's anyone out there who studies this who would say that's not happening at all. The question is, what is the scale? And the reason why it's so hard to measure is because video game companies really don't share their
data with a lot of people on this topic. There's only one video game, Roblox, that has a really specific anti-white-supremacist, anti-neo-Nazi policy on its platform. And that's not to say gaming companies don't take this seriously. They do. Activision Blizzard, Riot Games, all of these top gaming companies have issued statements to Bloomberg saying that this
is something that they are paying attention to. But when it comes to handing that data off to outside researchers who really do have expertise in this topic, it's just a black box. So let's say they were to hand over this data. What is it? I'm playing a game, and what happens to me in the scenario where someone is trying to fill my mind with these ideas? What does that look like? First, I want to tell you what it probably doesn't look like, and then I'll
tell you what it does look like. So say I'm playing Fortnite and I am in a game with a random person. We both drop down onto the map, we get our guns, and we go out and we try and find people to shoot. Maybe we're on voice chat, and if I'm good at that game, I'm going to be playing it for maybe like ten minutes before I'm
in a new game. But if I'm the average person, this stranger that I meet in the video game has like three to five minutes to pitch me on their views before, you know, we are just kind of gone forever into the abyss and we never encounter each other. So as you're playing the game, you're carrying on a conversation. That's part of the appeal, is that you're actually reaching out to real people as
you're playing this game. Video games are communication channels, that's correct, and it's a great way to meet people online. But at the same time, it's unlikely that you are going to, in the span of three to five minutes, be converted to have these really heinous and dangerous beliefs. What's more likely is that you do meet people through a video game who also have these beliefs, and over time,
being exposed to them, they become normalized for you. And depending on your background, depending on your beliefs, your family situation, these beliefs might appeal to you and you might adopt them yourself. And that's something that can happen through playing a video game with someone, through talking to them on Discord, through communicating with them over Reddit or Facebook or Instagram
or anything. The question that researchers are trying to grapple with right now is to what extent is this a video game-specific issue versus a social media issue that's more generalized. Have you found in your reporting that the kind of approaches that you describe can draw people in? Absolutely. I published a feature with Wired magazine in two thousand twenty one about an individual who goes by Ferguson, and he was playing Roblox, which is one of the most
popular games around right now. Millions of people play Roblox every day. And when he was eleven (he's now in his twenties), he encountered somebody on Roblox who was running these military drills, where he would just sort of run around in circles for hours and hours and hours and he could
only say "yes, sir" or "no, sir." And he became connected to this individual, who in the story is called Malcolm, and over time they began to kind of engage in these role-playing scenarios through avatars in these video games, and they
really liked playing the bad guys. This individual, Malcolm, who Ferguson was introduced to, was also someone who was very much an Internet edgelord, somebody who generally is very much online, who kind of trades in inappropriate memes and has a really edgy sense of humor and is very intentionally irreverent, who spends a lot of time on 4chan trading in very inappropriate and offensive memes. 4chan, of course, is this online chat group which
is sort of overrun with some pretty nasty stuff, extremely nasty stuff, really, really heinous stuff in some instances. Ferguson, Malcolm, and several other people ended up being a part of this group that included at some point twenty thousand people role-playing ancient Rome, and they had all of these laws that kind of in the end amounted to fascism. They had degeneracy laws that outlawed homosexual relationships. They were
antisemitic in a lot of ways as well. And I interviewed a lot of people who were involved in this group about the impact that participating in it had on them, and you know, some of them look back on it and say, wow, that was quite literally us role-playing fascism as children. And other people look back on that group and are actually in law enforcement now and are in the TSA. And some people also took something different from that and did become acclimated
to those beliefs later on. So, Cecilia, one of the problems here, and you're grappling with it in your reporting, is how concerned we should be about this. Years ago, there were the Gamergate revelations, where women gamers, women game developers were under just absolutely vicious attack from certain male gamers who didn't want them in their world. That had real-world consequences, and there was a backlash
from it. But as you also say, sometimes the amount to which what happens in the gaming world then comes into real life is overblown. So when you look at this, as you do all the time, how do you weigh and measure how concerned people should be about white supremacy in the online gaming world? I think it's hugely alarming that teen and preteen gamers have been exposed to white supremacist views over video games. And by the way, it's one in five adults, and that was double
the rate in two thousand twenty one. So this is clearly an issue, and it's something that a lot of researchers
are begging gaming companies to take more seriously. I think that parents should be extremely concerned about the people their kids are associating with over the internet in online games, because online games are communication channels, and you're not just playing an online game with someone. When you really click with them, you're also friending them on Instagram, communicating with them over the gaming chat app Discord, maybe you're following
them on Twitter. You know, it's not limited to the video game. So I think we should take it seriously. At the same time, I think it's really important to understand exactly what we're talking about, which is not an issue that's localized to video games, but an issue that
is integral to the nature of social media. There are a lot of companies out there that have huge populations of users and don't think about moderation upfront, don't think about how to cultivate communities that are inclusive and not harmful. And that's a major design problem and a major philosophy problem that we've seen on Twitter, on Facebook, everywhere on the internet, and gaming companies are not at all excluded from that. Cecilia, you don't just cover the gaming industry.
You play games a lot yourself. Have you encountered this sort of thing just in your playing of these games? Yeah, I mean you can tell that I have like a feminine voice, and so when I play video games over voice chat, and I really like playing competitive games in particular, I have received a lot of comments that I would describe as heinous and unwelcome when I'm just trying to sort of hang out after work. And a lot of people experience that based on the way that their voice
sounds or details that they share over video games. And it's on gaming companies to create more inclusive environments where people can feel safer just enjoying their hobby after work. It's something that's pushed a lot of women and minorities out of gaming culture, and that's not fair, because half of gamers are women, and a lot of us will just sort of do it in private, playing single-
player games. I have talked to a lot of women who don't play multiplayer games, even though they enjoy them, because in order to be good at the game, a lot of the time you do have to be communicating with your voice, and they just don't want to deal with it. And so how do you design a game to do that? Is it that it shuts you down if you use certain language or behavior? It's kind of the million-dollar question. It's probably a multimillion-dollar
question at this point. But what the scientists who study this say is that if you make it really easy to report people in video games and make the report forms extremely easy to decipher, both for the person who is reporting and the person who takes the report, more people
will actually participate in that system. Plus, if you give people positive reinforcement for doing a report, so, for example, if you tell them, hey, that person you reported was actually banned later, they're going to feel like the loop was closed and they were rewarded for doing a good thing.
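To make that report-and-feedback loop concrete, here is a minimal sketch in Python of the kind of flow the researchers describe: an easy, short report form plus a notification that closes the loop once action is taken. It is purely illustrative and not based on any particular game's moderation system; all of the names here (Report, ModerationQueue, notify_reporter) are hypothetical.

```python
# A minimal, hypothetical sketch of the report-and-feedback loop described above.
# It is not based on any game's actual moderation system; every name here is invented.
from dataclasses import dataclass

@dataclass
class Report:
    reporter_id: str
    target_id: str
    reason: str
    resolved: bool = False

class ModerationQueue:
    # An easy-to-decipher form: one required field, drawn from a short list of plain-language reasons.
    REASONS = ("hate speech", "harassment", "cheating", "other")

    def __init__(self):
        self.reports = []

    def file_report(self, reporter_id, target_id, reason):
        if reason not in self.REASONS:
            reason = "other"
        report = Report(reporter_id, target_id, reason)
        self.reports.append(report)
        return report

    def resolve(self, report, action_taken):
        # Closing the loop: tell the reporter what happened so the report feels worthwhile.
        report.resolved = True
        self.notify_reporter(
            report.reporter_id,
            f"Thanks for your report. The player you reported received: {action_taken}.",
        )

    def notify_reporter(self, reporter_id, message):
        # Stand-in for an in-game notification; a real system would route this through the game client.
        print(f"[to {reporter_id}] {message}")

# Example: a player files a report, a moderator resolves it, and the reporter learns the outcome.
queue = ModerationQueue()
report = queue.file_report("player123", "player456", "hate speech")
queue.resolve(report, "a temporary ban")
```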
Where do you see this heading? Do you think that gaming becomes a kinder, gentler place in the next three years, five years, or do you think that things become more abrasive? I think it's going to go in both directions, because, as we saw with this Anti-Defamation League survey, exposure to white supremacist beliefs inside of online games doubled between two thousand twenty one and two thousand twenty two. That is a huge issue, it's alarming, and it definitely points to the need for gaming companies and potentially even politicians to address this. At the same time,
never before in history have gaming companies taken so seriously the need to moderate their platforms on the level that social media companies moderate theirs. Gaming companies have hired cognitive scientists, sociologists, experts in extremism, to keep their platforms inclusive, so everybody can enjoy their games and everyone can
also buy their games. It doesn't seem, though, according to this Anti-Defamation League survey, that gaming companies are doing anywhere near enough to mitigate the exposure of white supremacist beliefs to gamers and young teens. Cecilia D'Anastasio, thanks so much for talking with me today. Thank you for having me. When we come back: showing game makers how to stamp out extremism. Let's continue this conversation with someone who studies
online extremism for a living. Alex Newhouse is deputy director of the Middlebury Institute's Center on Terrorism, Extremism, and Counterterrorism, and he joins me now. Can you tell me first, what does the Center on Terrorism, Extremism, and Counterterrorism do? So our sort of top-level elevator pitch is that we're a mixed-methods research center that's operating out of sort of the emerging tech, emerging trends space, especially in right-wing extremism, but also extremism broadly.
And what that means is that we specialize primarily in the study of online extremism on things like social media platforms, alternative tech platforms, and also video games. So today we're of course talking about extremism in online gaming. And last September you got a grant from the Department of Homeland Security to actually study this. What is that money
being used for? Yeah, we got awarded almost seven hundred thousand dollars to run a project that is actually going to game developers across the country and also in Canada to talk with them about their processes and knowledge and
awareness of the issues of extremist exploitation in games. So broadly, what we're trying to do is create basically a baseline for understanding, one, how prevalent is extremism in games, because that's still a pretty open question, and two, how prevalent is knowledge of these things at the game developer level. How do they think about the issue of extremism in games? And is there some way that we can actually help them get better and build capacity in understanding how to
resist that. So let's take those two things apart, because they're kind of the twin problems here. In your research, how prevalent do you find this online extremism,
these ideologies are in gaming. So one thing we absolutely know for sure is that white supremacists and far-right extremists in general, especially the very, very extreme ones, we're talking like accelerationist actors, the ones who literally believe in bringing about the apocalypse, are actively using all sorts of different games for the purposes of identity creation, communication, organization, recruitment.
Those kinds of things. But one of the things that we're trying to do in this project is just figure out, like, what is the extent of that? Because right now we know that they exist, we do not know the actual sort of extent of it, the volume of it, and the prevalence of it in these games. So when you say they are participating in these games, using these games, what exactly are they doing when they're on the gaming platforms?
It really depends on the type of game. So a lot of games, especially the creation ones, will have some types of social networking features in them. For instance, the use of groups, which are like clans, or, you know, like small sets of individuals. The names of those groups are references to white supremacist terrorist organizations. And these are the sorts of characters that you're role-playing in the game, right, in the types of, like, organizations that the players are creating and coming together to form.
So we've seen everything from, like, an individual user's username that references the Christchurch, New Zealand mass shooter all the way up to, like, entire groups of hundreds of individuals that are references to, like, nineteen seventies white supremacist terrorist organizations or designated far-right terrorist organizations throughout
the world. So that's obviously like the big one, this sort of use of all of the different features of a game for the purpose of promoting and organizing their activities. But we also see some, like, more minor indicators. For instance, I've found references on, like, the Call of Duty leaderboards, just in usernames on the leaderboard, to, like, Heinrich Himmler, Adolf Hitler, the Nazi regime. So it
really runs the gamut. One of the big challenges for me in tracking this is, like, there are so many different vectors of attack, vectors of misuse, in these games that it's actually hard to kind of get a handle on. So that's what they're doing online and how they're using these games. And the second part you mentioned was how aware the gaming platforms themselves are of it and what they're doing to counteract it. Absolutely, and this
also is a huge spectrum. There are many different types of game developers that range in size from a single person all the way up to the big ones that have tens of thousands of employees across the world, and so, sort of correspondingly, there's going to be a huge range in awareness of these issues. But overall, what we find is that video game developers as a whole aren't thinking about these issues in the same way that, like,
Facebook and Google and Microsoft are in general. And that's because the social networking aspects of games, the vectors that they exploit in games, are sort of secondary to the central aspect of games, which is actually like the playing of it and the mechanics of the game that you're
interacting with. And so as a result, the moderation of those social network features, the awareness, and the policies that are in place to mitigate exploitation of them have been sort of secondary compared to the big social media companies. A lot of this project is filling in those gaps, is trying to connect different companies together and then also show them that, like, hey, this is a huge problem. It's probably increasing in prevalence, and this isn't going to
go away on its own. And what's the response from the gaming industry when you go to them and say, hey, we're doing this project, we're going to try and come up with best practices to get rid of this, help you out? Are they very eager to work with you? It's a mixed bag. We have found some developers are very, very eager to work with us. Others sometimes have to be a little bit scared into it, and this is something that my research center works on in the
tech sector as well, as you know. There are some companies that are fully aware of the problem and actually actively come to us and ask us for help in getting a handle on it. And there are others for whom we have to put together a suite of examples and say, hey, do you know that there's a designated terrorist organization operating on your platform currently? And they say no, and then we're like, hey, we can help you deal with that.
If we don't have that sort of aspect of giving them examples and telling them we're here to help you, we're not here to tear you down, a lot of times we will get a bit of a cold shoulder. As awareness of these issues increases, and coverage of them and public pressure on these companies increase over time, that has been changing, and we have seen a lot more cooperation over the past even six months relative to the last year. But we do still find that there is a reticence on the part of some people in the games industry to address these issues. It's easier in some cases to ignore it and hope that it doesn't exist than it is to take it on and acknowledge that you have a bunch of potential terrorists using
your platform. You would think that a gaming platform would really want to stamp this out rather than allow its game to become a place where toxic culture spreads, which can't really be very good for business, right? And that's one of our big arguments, is that this is absolutely a huge negative aspect and it has a massively negative impact on the health of the community that
they're trying to build. The problem is that, you know, putting together a trust and safety team from scratch is very expensive, and building content policies to try to address these kinds of things in a more nuanced way can be very expensive. One of the things that we've noticed is that with this Homeland Security grant, with the backing of the U.S. government, and with the space to actually go to companies and say, hey, we want to help you,
you don't have to give us money, we want to help you build these best practices, there's more appetite for it. Turns out people like to get some help, especially if they don't have to do it alone and they don't have to spend a ton of money on it. The reaction has been pretty positive. We've found a lot of people in the games industry to be willing to confront that kind of process. We'll be right back. Why
do you think extremism is so common in games? There are so many different kinds of games, and yet it seems to be very widespread across genres. What is it about online gaming that makes it such an attractive place for extremism? There are two elements to this. So the first one is the more straightforward one, which is that, you know, generally most of the individuals who are participating in them are young. They're under thirty in the vast majority of cases.
A lot of the accelerationist groups that we study were founded by teenagers and recruit teenagers right out of high school. And ultimately, teenagers are spending a lot of their time on all sorts of different games, and it's just the nature of it: where mainstream youth culture goes, so too
do extremists. This idea that games provide some sort of unique social bonding opportunities is a very powerful one, and it has been used and studied for years in the positive sense, of, like, friendships are made in games more easily than they are in other types of social media and that kind of thing. But we've just recently started untangling the negative aspects of that, the potential for those really strong social bonds to actually have that sort of
darker element to it. Extremists don't have, you know, all of this social science data to back them up, but there is this sort of sense among them, and I've seen this in chat rooms, that games provide this sort of particularly amenable area for recruitment, for mobilization, for radicalization, for getting people into that frame of mind to be more amenable to carrying out violent action or to be
more dedicated to the cause of the extremist organization. You know, one of the arguments that people in the gaming community made for years, and I think it's backed up by research, is that playing video games doesn't necessarily cause violent behavior in real life. Is the idea of recruiting people into extremist groups, to bring them into extremist ideology,
different from that? The studies have backed up this idea that, like, playing violent video games, the consumption of violent content, does not have a direct link to violence in the real world. What we are currently working on right now is a more indirect, more sort of amorphous relationship. We're not focusing on content necessarily, really at all. We may in the future start looking at content a little more seriously. We think that there may be some actual,
some, like, nuanced indirect relationships with the actual content. But for the moment, what we're looking at is actually the relationships that are built in game, and that is distinct from what the type of game actually is. And what we find is that it's really sort of the nature of games broadly, the way that social interactions within games happen, that is driving this possibility
of, like, an increased vulnerability to extremism and radicalization. That is very, very different than the old-school theory of, like, you play Call of Duty, you become a mass shooter. Really, what it's talking about is, like, you are more likely to find the type of community, and to be primed mentally to be integrated into those types of communities that have a penchant for promotion of white supremacy or inciting real-world violence, than you would if you weren't playing
games to that level. We're still in the very nascent stages of untangling these types of correlations, and they're probably going to end up being very complex, but we're starting to get a sense of like, there is definitely something going on in the video games space that is unsavory and that is not something that we want to promote.
There's this whole setup, there's this whole mental state that you enter, there's this whole set of social interactions that you encounter in games that may predispose you to being more likely to be radicalized. It is estimated that women now make up about half of all gamers, and yet women and girls say they're still treated like a minority and often face harassment in addition to extremism. For many years, there's been a problem of a hostile environment to women,
to LGBTQ game players. Is that something that you're also studying as a part of this look at the culture of games? Yeah, absolutely. What's really fascinating is that we've known for decades that game culture has an unfortunately close proximity to toxicity and especially sexism, misogyny, anti-LGBTQ action, those kinds of things.
The study of extremism in games is relatively new. The study of toxicity in games has substantial history, so we can kind of use the study of toxicity as a proxy for finding indicators of extremist activity twenty years ago, thirty years ago, even though people weren't thinking about it in the same way then. And the reason for that is because toxicity is, in a lot of cases, a tactic used by extremists for a whole host of things, for encouraging participants to be more primed
to getting radicalized, for undertaking their own ideological goals. Harassment and hate are tools used by white supremacists for the purposes of intimidating their enemies and encouraging their sort of in-group to be more collaborative with each other and more loyal to the in-group rather than the out-group.
That isn't to say that all toxic players are extremists. Rather, toxicity is just one set of tools that are used by extremists pretty frequently, and often very frequently in games. In the two thousands, when coordinated harassment campaigns were first getting innovated, a lot of times they were getting innovated in games spaces, like early virtual-world sims, early social networking games like Second Life, and other types of games like that. That was where these, like, decentralized networks of toxic actors would be coordinating and carrying out their campaigns of harassment. And so a lot of what we see in toxicity broadly across the Internet was really innovated in video games. So there's a really fascinating relationship there, and a lot of those people early on who innovated those toxic actions ended up becoming right-wing extremists later on. I suppose if you're a person, especially a young person, who does not feel very powerful in your regular life, coming into an online world where you can essentially be a bully to others can make you feel powerful. Yeah, and you have all these networks, both in game and in the game-adjacent spaces, that are telling you that what you're doing is right and celebrating you and giving you friendships and giving you that sort of, like, those dopamine hits of positive social interactions.
One of the things I always remind games industry people is that this focus on toxic content is great, and we need to crack down on it. But the problem is toxicity is a negative relationship, like the person carrying it out is trying to make the target feel intimidated, to make them feel bad, and so self-reports are easier to generate and take action against than radicalization. So the person who carries out the harassment then gets radicalized by
a community who's celebrating them. All of those radicalization interactions are very positive. The target of the radicalization is supposed to feel welcomed, they're supposed to feel connected, they're supposed to feel rewarded, and that is really, really hard to detect solely from, like, self-reports or from a reactive sense, because if you feel good, you're not going to report
radicalization attempts to the game developers. So absolutely, there's this huge interplay there between toxicity on the one hand, and then the networks of people who are trying to recruit the toxic actors into their networks. You spend a lot of time looking at the darkest side of online gaming. Is there any reason for optimism? Is there anything that makes you actually feel hopeful about what can be done?
I'm a huge gamer. Like, I've been a gamer my whole life, and one of the things we've seen borne out by the research is that games are incredibly positive. Like, a lot of the interactions, a lot of the unique characteristics that I was just describing, are more generally and more often very, very positive. Extremism is a fringe thing. It is a minority of individuals who end up in extremism. So one of the optimistic sides here is that for
most people, the interactions in a game are positive. Like, you end up in good prosocial relationships and connections. And for people who might be, you know, lonely or introverted or having trouble making friends in the real world, games can be a really, really powerful tool for, you know, surmounting some of those obstacles, and they have been for, you know, tens of millions of people throughout the world.
And so the big challenge then is how do we make sure that we're maximizing that positive impact, maximizing the conversion of people into prosocial actors, and minimizing the harm to those types of people who might be more vulnerable to the darker sides of it. And I think that question
is solvable. Like, I think we are seeing a lot of companies do a lot of investment in incentivizing prosocial behavior and having some of the more, like, more types of content moderation that are designed to encourage and reward good behavior rather than just crack down on bad behavior. And I think there are some really encouraging signs from the industry that they're moving in the right direction there, and my hope is that with this project
we can help galvanize that a little bit more. Alex Newhouse, thanks so much for taking the time to talk to me. Thanks for having me on. You can read more of Cecilia D'Anastasio's reporting at Bloomberg dot com. Thanks for listening to us here at The Big Take. It's a daily podcast from Bloomberg and iHeartRadio. For more shows from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen, and we'd love to hear from you. Email us questions or comments to Big
Take at Bloomberg dot net. The supervising producer of The Big Take is Vicki Vergolina. Our senior producer is Katherine Fink. Our producers are Mo Barrow and Michael Falero. Raphael M. Seeley is our engineer. Our original music was composed by Leo Sidran. I'm Wes Kasova. We'll be back tomorrow with another Big Take.