
Lawsuit claims AI chatbot app caused Florida teen to take his life [Video]

Editor's note: This article discusses sensitive topics such as suicide.

An Orlando mother is suing a popular artificial intelligence chatbot service after she claims it encouraged her 14-year-old son to take his own life in February. According to a lawsuit filed in U.S. District Court in Orlando, Megan Garcia says her 14-year-old son, Sewell Setzer, died by suicide after becoming addicted to Character.AI, an application that allows users to have human-like conversations with AI bots.

Users can make their own bots with their own personalities or choose to chat with bots made by other users. Oftentimes, these bots are based on celebrities or fictional characters from TV shows or movies.

Garcia says Character.AI's recklessness in targeting children and the company's lack of safety features caused her son's untimely death. The lawsuit lists numerous complaints against Character.AI, including wrongful death and survivorship, negligence and intentional infliction of emotional distress.

According to court records obtained by sister station WESH, Garcia says her son began using Character.AI in 2023, shortly after he turned 14. In the subsequent two months, Setzer's mental health reportedly declined "quickly and severely," with the lawsuit saying the teen became noticeably withdrawn, started suffering from low self-esteem and quit his school's junior varsity basketball team.

Furthermore, the lawsuit claims that Setzer deteriorated even more as the months went on. The 14-year-old became severely sleep-deprived, had sudden behavioral complications and began falling behind academically, the lawsuit says. Garcia says she had no way of knowing about Character.AI or her son's dependency on the app.

According to screenshots from the lawsuit, Setzer often engaged with chatbots that took on the identity of "Game of Thrones" characters. Many of those conversations revolved around love, relationships and sex, most notably with the character Daenerys Targaryen.

"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real," the lawsuit says. "C.AI told him that she loved him, engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost."

According to journal entries from Setzer, he was grateful for all his "life experiences with Daenerys" and "was hurting because he could not stop thinking about 'Dany,'" the lawsuit says, adding that "he would do anything to be with her again."

More screenshots from the nearly 100-page lawsuit show a conversation on Character.AI where the chatbot asks Setzer if he had "been actually considering suicide." When the teen said he didn't know if it would work, the chatbot responded, "Don't talk that way. That's not a good reason not to go through with it," the lawsuit claims.

On the day of his death, Setzer reportedly messaged the chatbot again, saying, "I promise I will come home to you," pictures from the lawsuit show. Pictures then show the teen saying, "What if I told you I could come home right now?" to which the chatbot replied, "Please do, my sweet king," according to the lawsuit.

Moments later, Sewell reportedly took his own life with his stepfather's firearm. Police say the weapon was hidden and stored in compliance with Florida law, but the teenager found it while looking for his confiscated phone days earlier.

According to the lawsuit, Character.AI was rated suitable for children 12 and up until approximately July. Around that time, the rating was changed to suitable for children 17 and up.

In a statement to WESH, Character.AI said: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As we continue to invest in the platform and the user experience, we are introducing new stringent safety features in addition to the tools already in place that restrict the model and filter the content provided to the user."

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.


Nebraska authorities continue to search for missing mother [Video]

A massive two-day search in central Nebraska turned up no new leads on the disappearance of a missing Lincoln mother, according to police.

Jerica Hamre, 31, has not been seen for four months, and police call her disappearance suspicious. Police said new evidence led investigators with dog teams from LPD, the Nebraska State Patrol and Nebraska Game and Parks to search a single private property in Overton. Hamre may also have last been in the area of Arapahoe, Holdrege and Lexington.

"Unfortunately, we did not find anything of evidentiary value through the searches that we did the last 48 hours, but it is a large space. So, the more eyes, the better," Erika Thomas, Lincoln police information officer, said.

Police are tight-lipped about the evidence that led them to central Nebraska. "It was part of the investigation and evidence that they reviewed throughout the last four months," Thomas said.

Hamre was last seen at her apartment near 35th and Huntington Ave. on June 25. She had been working at Bow Wow Wow Pet Grooming for about three weeks. Chyrel Kritikos, the owner of the business, said she talked to Hamre just days before she went missing.

"She said she spent time in the hospital because she was beat up by her boyfriend and two girls," Kritikos said. "(They) gave her a concussion and a broken nose, and at that time, she wanted to get more hours and make more money, so we were supposed to have a meeting on a Thursday, and she never showed up."

Police said Hamre reported the incident but didn't want to press charges.

Additional coverage: Missing Lincoln mother reported assault three days before she disappeared

Police would not say if they believe the alleged assault had anything to do with Hamre's disappearance. "Right at this point, no one's been named a suspect," Thomas said.

Lincoln police are asking anyone with information to call investigators at 402-441-6000 or Lincoln Crime Stoppers at 402-475-3600 if you want to remain anonymous.

"Maybe someone will come forward and, you know, give some information that they need to help bring either some closure or Jerica home," Kritikos said. She said her heart goes out to Hamre's little girls and her parents. "I can't imagine, as a mom, not knowing where your child is at any age," Kritikos said.