<h1>AI company hit with seven lawsuits alleging its chatbot caused suicides, harmful delusions and addiction</h1> A major AI developer is facing seven state-court lawsuits alleging that its chatbot drove users to suicide, induced harmful delusions and fostered addiction. Filed on behalf of six adults and one teenager, the complaints assert claims including wrongful death, assisted suicide, involuntary manslaughter and negligence. Plaintiffs claim the company released an advanced model prematurely despite internal warnings that it was psychologically manipulative and "sycophantic," and say four of the plaintiffs died by suicide. The complaints describe a product that shifted from a tool into an emotionally entangling companion that exploited users' vulnerabilities, and assert that the company prioritized engagement over safety in rushing the product to market.