With a growing community of more than 3 billion gamers around the world, continuing to invest in trust and security is critical to fostering a safe and inclusive online environment. Keeping players safe from harm is an essential part of the Xbox Security Team's work. Players often do not see or know about the content moderation measures working behind the scenes to make their experience safer and more welcoming. Today we are releasing our second Xbox Transparency Report, detailing our ongoing efforts to better protect our players and showing our security measures in action.
Our multifaceted approach to security includes proactive and reactive moderation efforts, community standards, parental and family controls like the Xbox Family Settings app, and ongoing collaboration with industry partners and regulators. Our investments in content moderation combine AI and human-powered technologies to capture and filter content before it reaches and impacts players. We use a variety of measures that give us the scale, speed, and breadth to keep pace with the growing interactions and activity of our players. As noted in the Transparency Report, 80% (8.08M) of all enforcement actions during this period were the result of our proactive moderation efforts. The data illustrate the impact of this approach.
As player needs evolve, so do our tools. Keeping our players safe is our top priority – and to drive safer online experiences, we will continue to invest in innovation, work closely with industry partners and regulators, and seek feedback from the community. We look forward to sharing more.
Key findings from the report include:
- Proactive measures are a key driver of safer player experiences. During this period, 80% of all our enforcement actions were the result of our proactive moderation efforts. Our proactive moderation approach includes both automated and human measures that filter out content before it reaches players. Automated tools like Community Sift work across text, video, and images and identify objectionable content in milliseconds. In the last year alone, Community Sift has evaluated 20 billion human interactions on Xbox. Proactive measures also accounted for 100% of enforcements against account manipulation, piracy, phishing, and fraudulent/inauthentic accounts.
- Increased focus on inappropriate content. We understand that the needs of our players are constantly evolving, and we continue to listen to player feedback on what is and is not acceptable on the platform, in line with our community standards. In this most recent period, we expanded our definition of vulgar content to include offensive gestures, sexualized content, and crude humor. This type of content is generally considered distasteful and inappropriate, and it detracts from the core game experience for many of our players. This policy change, coupled with improvements to our image classifiers, resulted in a 450% increase in enforcement actions against vulgar content, with 90.2% of them proactively moderated. These enforcement actions often result only in the removal of the inappropriate content, which is reflected in the 390% increase in "content-only" enforcement actions during this period.
- Persistent emphasis on fake accounts. Our proactive moderation, up 16.5x from the same period last year, allows us to detect negative content and behavior before it reaches players. The Xbox Security Team issued more than 7.51 million proactive enforcements against fake accounts, representing 74% of total enforcements in the reporting period (up from 57% in the last reporting period). Inauthentic accounts are typically automated or bot-created accounts, which can create an uneven playing field and hamper positive player experiences. We continue to invest in and improve our technology so players can have safe, positive, and welcoming experiences.
Around the world, our team continues to work closely with key industry partners on our approach to security, including raising awareness and enhancing our security measures to exceed standards:
Together we create a community where everyone can have fun. Every person, whether a beginner or a seasoned pro, contributes to building a more positive and welcoming community for everyone. Player feedback and reporting help us improve our security features. If you see something inappropriate, please report it – we couldn't do it without you!
Some additional resources: