Microsoft has released its third Xbox Transparency Report, highlighting the improvements made to its internal processes to protect players from toxicity and harassment.


For the six months ending June 2023, Xbox reported a total of 19.6 million enforcements issued during the period – almost double the 10.2 million issued during the first half of 2022.


This included 84,000 proactive actions against harassment and bullying, an area strengthened by the launch of Xbox's voice reporting feature earlier this year "to capture and report in-game voice harassment," the company said.


Of the total enforcements, 17.1 million (87%) were proactive and 2.5 million (13%) stemmed from reactive player reports.


27.3 million player reports were made during H1 2023, with half concerning communications such as platform messages and comments on activity feed posts.


10.5 million (39%) of those reports concerned player conduct, and 3.1 million (11%) related to user-generated content.


Since H2 2022, Xbox has implemented a new “AI-powered and human insights-driven content moderation platform” called Community Sift, which aims to identify toxic content among players.

It has also been using a model that scans "user-generated imagery to ensure only appropriate content is shown," which has led to the blocking of over 4.7 million pieces of content.


In August, Microsoft launched its Enforcement Strike System to standardize the enforcement of community standards. Since its implementation, players have filed over 280,000 appeals against account suspensions, of which only 4% were successful.


Finally, data from Microsoft's Digital Safety Content report showed there were 766 referrals to the National Center for Missing &amp; Exploited Children, up 39.5% year-on-year.


A further 2,225 referrals were made to the Crisis Text Line, an increase of 63.4% from the 1,361 referrals made in H2 2022.
