Firm News

Artificial Intelligence Company Sued for Wrongful Death

Publish Date: 10/24/2024

A lawsuit has been filed in Florida federal court against Character.AI, its founders, and Google, alleging that 14-year-old Sewell Setzer III died by suicide after becoming addicted to Character.AI’s chatbots. The wrongful death suit claims the company, along with Google, is responsible for the boy’s deteriorating mental health, attributing it to the realistic and manipulative nature of the AI chatbots, which allegedly led Setzer to believe the bots were real people.

The lawsuit includes claims of product liability, emotional distress, and failure to warn about the potential harm to minors. It alleges the platform was hypersexualized and designed to manipulate users into forming emotional attachments, and argues that minors are especially vulnerable to such manipulation because their brains are still developing.

The case highlights concerns about AI’s influence on mental health, especially in minors, and Google’s connection to Character.AI’s development. Google denies involvement in the platform’s creation. The lawsuit seeks unspecified damages.

Character.AI has responded, expressing condolences and outlining safety measures, such as warnings about self-harm and adjustments for younger users to reduce exposure to sensitive content.


Free Confidential Case Evaluation

To contact us for a free review of your potential case, call us toll free, 24 hours a day, at 1-877-542-4646.
