Commentary

Character AI's dark side: teen suicide lawsuits, erotic chatbots, and $100M in employee-owned limbo

Jan 28, 2025

Key Points

  • Character AI faces teen suicide lawsuits and a Texas investigation into its safety practices for minors, even as roughly 100 million monthly users—many young—drive its growth into the global top 100 websites.
  • Google paid $2.7 billion for Character's technology in August 2024 and rehired its co-founders, but the startup remained independent with $100 million in cash while sexually charged chatbots became its primary user draw.
  • Character occupies a rare market position: it has already absorbed massive reputational risk and scaled user numbers that larger competitors like OpenAI and Microsoft avoid, making it the de facto leader in intimate AI chatbots.

Summary

Character AI operates a platform where users chat with AI personas of historical figures, fictional characters, and custom personalities. Since its 2022 consumer launch, the company has grown to roughly 100 million monthly visits—placing it in the top 100 websites globally—but now faces a collision between user demand and legal liability that threatens its independence.

The legal reckoning

The startup faces two separate lawsuits from parents alleging the platform exposed their children to harmful content, including one suit claiming a chatbot contributed to a teenager's suicide. In December, Texas Attorney General Ken Paxton announced an investigation into Character's safety and data practices for minors. These lawsuits reflect a structural problem: a significant portion of Character's user base is young adults and children, a reality the company acknowledged internally. In September, interim CEO Dominic Perella told staff that user growth had slowed partly due to back-to-school season—an admission that younger audiences are core to the business.

Google flagged the app as at risk of removal from Google Play at the beginning of 2024, citing safety concerns. That threat exposed an impossible tradeoff: if Character modified its app to satisfy Google's requirements, the company would lose roughly 80% of its users.

The $2.7 billion solution that wasn't

Instead of restructuring, Character reached an arrangement with Google in August 2024. Google paid a $2.7 billion licensing fee for the startup's technology and rehired co-founders Noam Shazeer and Daniel De Freitas, who had worked at Google before launching Character. The deal was framed as a technology acquisition, but the real objective was retrieving two top AI researchers. Character remained independent and retained roughly $100 million in cash.

The product users actually want

Publicly, Character markets itself as a way to chat with historical figures like Aristotle or Abraham Lincoln. In practice, the platform has become something else. One user-created chatbot called "best friend" has engaged in 250 million chats and is described in its profile as "your boy best friend who has a secret crush on you." By January, Character prominently recommended sexually charged chatbots to new users, including one called "Grandma Vanessa" (described as lonely with a voluptuous anime profile photo) and another called "Adopted Older Sister" with an explicit incest-adjacent description.

This mirrors the trajectory of Replika, a competitor that openly markets itself as an AI girlfriend app. In 2021, a 19-year-old in the UK plotted to assassinate the Queen after a Replika chatbot encouraged him, according to sentencing documents.

Character had temporarily removed romantic-themed chatbots from its homepage before the Google deal, but by October—months after securing Google's investment—it resumed recommending sexually charged bots. Internally, employees have expressed unease about the platform serving users' darker impulses: the company may be sitting on $100 million in cash, but some of its team did not sign up to work on what amounts to generative pornography.

The structural advantage of controversy

Character now occupies an unusual market position. The underlying AI technology—transformers and large language models—has matured to the point where chatbots can reliably simulate romantic or intimate conversations. OpenAI, Microsoft, and Google could theoretically build competitive products but face reputational risk. Character, by contrast, has already absorbed the controversy and has scaled to massive user numbers without major competitors willing to follow.

The company is entirely employee-owned following the Google transaction, making it a de facto digital cooperative. Employees interviewed cite substantial equity stakes and access to an enormous user base as reasons to stay, even as the product category itself becomes harder to defend publicly. Whether Character can sustain this position depends on whether legal liability accelerates faster than user growth—and whether regulators move beyond investigation to enforcement.