How harmful content is evading detection on popular video gaming sites


Credit: Unsplash/CC0 Public Domain
New research published in the journal Frontiers in Psychology reveals how extremist groups exploit the popularity of video games to recruit and radicalize impressionable users.
The study shows that game-adjacent platforms, which allow users to chat and livestream while playing, are being used as “digital playgrounds” for extremist activity, and that video game players are deliberately “funneled” by extremists from traditional social media platforms to these sites, partly because of the challenges involved in moderating them.
The research was carried out by Dr. William Allchorn and Dr. Elisa Orofino, senior research fellows at Anglia Ruskin University's International Policing and Public Protection Research Institute, and includes interviews with platform content moderators, technology industry experts, and people involved in preventing and countering violent extremism.
The study found that far-right extremism is the most common ideology shared on these game-adjacent platforms. This includes content promoting white supremacy, neo-Nazism and antisemitism, often accompanied by misogyny, racism, homophobia and conspiracy theories, including references to QAnon.
Islamist extremism was also reported, although less frequently, alongside “extremist-adjacent” material such as the glorification of school shootings, all of it content that violates mainstream platforms' terms of use but often evades detection.
The study explains that hyper-masculine game titles, such as first-person shooters, hold particular appeal for extremists, and highlights how the unique nature of online gaming brings together strangers with a common interest.
After initial contact, a “funneling” takes place, in which interactions shift to less-regulated game-adjacent platforms that provide an environment where extremists can socialize, share propaganda and subtly recruit.
One person interviewed for the study explained how grooming can begin: “This is where you have matchmaking. This is where you can build a rapport with people quickly. But it moves very quickly onto the adjacent platforms, away from that kind of surveillance.”
A recurring concern among participants was the danger of young users coming under the influence of extremist influencers, who combine livestreamed gameplay with extremist narratives.
Participants stressed that police need a better understanding of how these platforms and their subcultures work, and emphasized the importance of educating parents, teachers and children about the risks of online radicalization.
Moderators who took part in the study expressed frustration at inconsistent policies on their platforms and at having to decide whether content or users should be reported to local law enforcement agencies.
In-game chat is not moderated, while the platforms' moderators report being overwhelmed by the volume and complexity of harmful content, including the use of hidden symbols to get around banned words.
AI tools are used to assist with moderation, but they struggle to interpret memes or language that is ambiguous or sarcastic. Phrases such as “I’m going to kill you” can be commonplace in gameplay but are difficult for automated systems to interpret in context.
Study co-author Dr. William Allchorn, senior research fellow at Anglia Ruskin University (ARU), said: “These game-adjacent platforms offer extremists access to a large, often young and impressionable audience, and they have become a key tool for extremist recruitment.
“Social media platforms have attracted most of the attention from legislators and regulators over the past decade, but these platforms have largely flown under the radar while becoming digital playgrounds for extremists to exploit.
“The nature of radicalization and the spread of extremist content is not confined to any single platform, and our research identified a general lack of effective detection and reporting tools.
“Many users do not know how to report extremist content, and even when they do, they often feel their concerns are not taken seriously. Strengthening moderation systems, both AI and human, is essential, as is updating platform policies to tackle harmful but technically legal content.”
More information:
Policing extremism on gaming-adjacent platforms: awful but lawful?, Frontiers in Psychology (2025). DOI: 10.3389/fpsyg.2025.1537460
Provided by Anglia Ruskin University
Citation: How harmful content is evading detection on popular video gaming sites (2025, July 31) retrieved 31 July 2025 from https://phys.org/news/2025-07-content-evadeing-populalar-video-gaming.html




