Our Approach to Content Moderation

We take a safety-by-design approach, putting the safety of our users at the center of the product design process. Our dedicated Trust & Safety team has deep experience creating effective and thoughtful safety policies and operational processes for emerging technologies.

Protecting Our Community Standards

Automated Tools: We use proprietary tools that seek to block certain violating content before it can be posted. Like other solutions in this space, these tools are evolving quickly and we will continue to improve them over time.

Moderation: Our Trust & Safety team includes internal personnel, contracted moderators, and vendors, as is standard in the industry. Our Trust & Safety moderators work around the clock, focused on taking appropriate action on reported content. We provide them with the technical tools necessary to keep Character.AI a positive experience and are constantly looking for ways to evolve these tools to better serve our users.

DMCA: We respect the intellectual property of others, and we ask our users to do the same. To put that policy into practice, we have a robust Digital Millennium Copyright Act (DMCA) takedown protocol for copyright infringement. That protocol is described in detail in our Terms of Service.