Quick Brief
- The Partnership: Sony Interactive Entertainment, Nintendo, and Microsoft published updated shared principles for gaming safety on January 13, 2026, expanding their 2020 collaboration
- The Impact: Addresses gambling-style mechanics, third-party marketplaces, and child exploitation across PlayStation Network, Xbox Live, and Nintendo Switch Online
- The Context: Follows European regulatory pressure on loot boxes and rising concern over child exposure to skin-gambling operations
- The Tech: Partnership with the Tech Coalition’s Lantern program, which logged 296,336 new child safety signals in 2024 and has contributed to 102,082 enforcement actions
Sony Interactive Entertainment, Nintendo, and Microsoft announced an expanded commitment to player safety on January 13, 2026, updating principles first established in 2020 to address gambling-style mechanics, third-party skin markets, and child protection across console platforms. The joint statement, released simultaneously by all three companies, explicitly targets features that “can harm minors and erode player trust,” marking the first time the platform holders have publicly addressed gambling mechanics in a unified framework.
Three-Pillar Safety Architecture
The updated framework structures safety efforts under Prevention, Partnership, and Responsibility principles, each addressing specific technical and policy gaps identified since 2020.
Prevention centers on parental control systems that allow granular restrictions on purchases, social interactions, and exposure to gambling-style content. All three platforms now support restrictions on microtransactions, cosmetic bundles, and limited-time offers that mimic gambling mechanics. The companies commit to promoting these tools through platform interfaces, retail channels, and dedicated support portals.
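The article does not describe how these restrictions are enforced internally, but the granular model it outlines can be sketched. The profile fields, item flags, and `purchase_allowed` function below are hypothetical illustrations, not any platform's actual API:

```python
from dataclasses import dataclass

# Hypothetical restriction profile for a child account; real console
# parental-control APIs differ and are not public in this form.
@dataclass
class ParentalProfile:
    spending_limit_cents: int = 0      # monthly cap; 0 blocks all purchases
    allow_loot_boxes: bool = False
    allow_limited_time_offers: bool = False
    spent_this_month_cents: int = 0

@dataclass
class StoreItem:
    name: str
    price_cents: int
    randomized_reward: bool = False    # loot box-style mechanic
    limited_time: bool = False         # scarcity-driven offer

def purchase_allowed(profile: ParentalProfile, item: StoreItem) -> tuple[bool, str]:
    """Evaluate one purchase attempt against the profile's restrictions."""
    if item.randomized_reward and not profile.allow_loot_boxes:
        return False, "blocked: randomized-reward (gambling-style) item"
    if item.limited_time and not profile.allow_limited_time_offers:
        return False, "blocked: limited-time offer"
    if profile.spent_this_month_cents + item.price_cents > profile.spending_limit_cents:
        return False, "blocked: monthly spending limit exceeded"
    return True, "allowed"

if __name__ == "__main__":
    child = ParentalProfile(spending_limit_cents=2000)
    print(purchase_allowed(child, StoreItem("loot crate", 499, randomized_reward=True)))
    print(purchase_allowed(child, StoreItem("soundtrack", 999)))
```

The point of the sketch is the layering: category restrictions (randomized rewards, limited-time offers) are checked before spending limits, so gambling-style content stays blocked even when a purchase would otherwise fit the budget.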
Partnership formalizes collaboration with the Tech Coalition, Entertainment Software Association, ESRB, PEGI, and Family Online Safety Institute. The companies participate in the Lantern program, a cross-platform signal-sharing system that logged 296,336 new signals in 2024 and supported the blocking of 135,077 child sexual exploitation URLs. The framework invites other publishers and platform holders to adopt the principles, signaling a push toward industry-wide standardization.
Responsibility establishes transparency requirements for enforcement actions, reporting mechanisms, and compliance with law enforcement requests. The companies commit to “escalating restrictions for egregious or repeat violations” and prompt notification to authorities when unlawful conduct or imminent harm is detected.
Regulatory and Market Implications
The updated principles arrive as European regulators increase scrutiny of loot boxes and skin-gambling operations. Court rulings in multiple jurisdictions have classified certain microtransaction systems as gambling, forcing publishers to modify monetization models or face regional bans.
The framework directly addresses third-party skin-gambling marketplaces, an unregulated sector that operates outside platform controls. The CS2 skins market alone reached an estimated $5 billion valuation in 2025, illustrating the scale of cosmetic trading ecosystems. By coordinating enforcement with publishers and regulators, the platform holders aim to reduce minors’ access to these markets and limit cross-platform exploitation.
For publishers operating cross-platform titles with microtransactions, the commitment introduces compliance pressure. Games with “loot box-like features, cosmetic bundles and limited-time offers” will face requirements for clearer drop rates, odds disclosure, and purchase controls. Esports operators may encounter new certification requirements for events involving cosmetic prizes or third-party integrations.
Technical Implementation: The Lantern System
The companies’ partnership with Tech Coalition’s Lantern program represents the technical backbone of cross-platform enforcement. Lantern enables secure sharing of intelligence on child exploitation threats between platforms, allowing companies to identify policy violations across ecosystems.
Through 2024, Lantern had processed 1,064,380 cumulative signals, resulting in 102,082 account enforcement actions. The system flagged 81 contact offenses and 45 instances of trafficking, and led to the removal or blocking of 7,048 pieces of child sexual abuse material. Participating companies upload signals about policy violations, which other platforms can query to identify similar activity on their own services.
The program employs safety and privacy by design principles, with Tech Coalition managing eligibility vetting, legal agreements, and data audits to address concerns about cross-platform intelligence sharing.
Industry Adoption Timeline
The framework invites broader participation from publishers, indie studios, tournament operators, and community marketplaces. Implementation will be incremental and tied to ongoing regulatory developments, particularly in Europe where Digital Services Act requirements take full effect in 2026.
Players should expect platform-level feature updates in coming months, including simplified parental controls, clearer monetization disclosure, and enhanced reporting tools. Publishers will face pressure to modify storefront presentations, flag gambling-adjacent mechanics, and increase transparency around drop rates and odds.
The collaboration builds on cross-industry efforts through the Digital Trust & Safety Partnership, which unites companies across gaming, social media, streaming, and messaging around shared trust and safety practices. The gaming industry has historically lagged other digital sectors in participating in trust and safety forums, but engagement with Tech Coalition and TrustCon events has accelerated since 2023.
Frequently Asked Questions (FAQs)
What is the Sony, Nintendo, and Microsoft gaming safety partnership?
A collaboration launched in 2020 and updated on January 13, 2026, to standardize player protection, parental controls, and enforcement against gambling mechanics and exploitation across PlayStation, Xbox, and Nintendo platforms.
How does the Lantern program protect children in gaming?
Lantern allows gaming platforms to securely share intelligence on child exploitation threats. It logged 296,336 new signals in 2024, and cumulative signal sharing has resulted in 102,082 enforcement actions and the removal or blocking of 7,048 pieces of child sexual abuse material.
What are gambling-style mechanics in video games?
Features that mimic gambling, including loot boxes with randomized rewards, limited-time cosmetic offers, and third-party skin-gambling marketplaces. The partnership targets these for stricter regulation and transparency.
Do parental controls block microtransactions on consoles?
Yes. All three platforms now offer granular controls to restrict purchases, hide storefront features, limit social interactions, and block exposure to gambling-style content.

