Sewell Setzer III died by suicide after a monthslong, “hypersexualized” relationship with an AI character, his mother said in a federal lawsuit.
Sewell Setzer III became obsessed with the chatbot that "abused and preyed" on the boy, according to his mother, who is suing the company behind the tech.
The mother of 14-year-old Sewell Setzer III is suing the tech company that created a 'Game of Thrones' AI chatbot she believes drove him to suicide.
A lawsuit against Character.ai has been filed in the suicide death of a Florida teenager who allegedly became emotionally attached to a Game of Thrones chatbot.
A Florida mother has sued artificial intelligence chatbot startup Character.AI accusing it of causing her 14-year-old son's suicide in February, saying he became addicted to the company's service and deeply attached to a chatbot it created.
A Florida teen named Sewell Setzer III died by suicide after developing an intense emotional connection to a Character.AI chatbot, The New York Times reports. Per the report, Setzer, who was 14, developed a close relationship with a chatbot designed to emulate the "Game of Thrones" character Daenerys Targaryen.
Florida mother Megan Garcia is suing Character.AI and Google following her 14-year-old son's death by suicide.
Megan Garcia says her son, Sewell Setzer III, became withdrawn after beginning an online relationship with a chatbot.
A lawsuit claims that Character.AI's founders launched a dangerous product that they advertised as safe for use by kids, without warning them or their parents of possible risks.