Safety in Rec Room

One of the core pillars at Rec Room is ensuring the safety of our players. Millions of people play, create, and connect through Rec Room. We want every player to have a safe and fun experience in Rec Room. Our dedicated Trust & Safety team establishes policies, moderates content, builds tools, and partners with others to pursue this mission.

Check out the latest Rec Room Safety news and updates here.


Policies and Guidelines

Our guidelines and policies are the foundation of our work and outline our behavioral expectations for our players. We want our players and creators to be their best selves. Our Trust & Safety staff and moderation systems remove violators’ access to Rec Room, or to specific features, to keep Rec Room safe for others.

Code of Conduct

Our Code of Conduct establishes our basic requirements for players so that everyone is safe and welcome in Rec Room. All players are expected to follow it. Ignoring or violating the Code of Conduct may result in an account ban, loss of content, or the removal of access to certain features.

Creator Code of Conduct

In addition to our Code of Conduct, creators are responsible for ensuring their creations abide by our Creator Code of Conduct. We use machine learning and specialist moderators to review content created by our players, including rooms, inventions, and custom avatar items. Creators who violate the Creator Code of Conduct may have their creations removed, lose access to our creation tools, or face account bans. 


Safety Features & Tools

Voice Moderation

We use advanced voice moderation systems to detect and automatically take action against harmful speech. Check out our State of Voice Moderation blog to learn more about how we moderate voice chat. 

Text Filtering

Harmful text is automatically filtered in areas where it’s publicly visible. Users can report harmful private messages by hovering over them and clicking the report icon.

Image Moderation

We leverage image moderation AI to scan photos, drawings, and 2D textures and automatically take action against harmful imagery. You can read more about how we use this technology to moderate custom avatar items in our Tee-rific Moderation Update.

Reporting, Blocking, & Muting

Our players have access to a suite of tools they can use to report violating content or users and tailor their experience to what’s most comfortable for them by blocking or muting others. We automatically detect and escalate reports against high-risk harms for our Trust & Safety staff to review. 

In addition to reporting, blocking, and muting tools, players can adjust their direct message preferences, who they can hear, and who can hear them (e.g. Friends, Favorite Friends). 

Junior Accounts

Players under the age of 13 are provided with Junior accounts, specifically designed to enhance the privacy and safety of our youngest players. Junior accounts do not have access to voice or text chat features, among other limitations. Learn more about Junior accounts in our Guide to Rec Room for Parents and Families.


Partners

Rec Room works with organizations and authorities from around the globe to advance the safety of our platform and report illegal behavior. 

kidSAFE

Rec Room is a member of the kidSAFE Seal Program and has obtained the kidSAFE+ COPPA-CERTIFIED Seal for meeting certain privacy and safety standards. kidSAFE regularly audits Rec Room to ensure compliance. 

Tech Coalition 

Rec Room is a proud member of the Tech Coalition where we work with our partners to keep children safe online. We also joined Lantern, their program that helps social and gaming organizations share signals about accounts and behaviors that violate child safety policies.

National Center for Missing & Exploited Children (NCMEC)

We’ve developed proprietary models to detect high-risk interactions and use technology to detect potential uploads of Child Sexual Abuse Material (CSAM). We promptly report child safety concerns to NCMEC and work with authorities worldwide to combat these issues.