Editor's note: This article discusses sensitive topics such as suicide.

An Orlando mother is suing a popular artificial intelligence chatbot service, claiming it encouraged her 14-year-old son to take his own life in February.

According to a lawsuit filed in U.S. District Court in Orlando, Megan Garcia says her 14-year-old son, Sewell Setzer, committed suicide after becoming addicted to Character.AI, an application that allows users to have human-like conversations with AI bots.

Users can make their own bots with their own personalities or choose to chat with bots made by other users. Oftentimes, these bots are based on celebrities or fictional characters from TV shows or movies.

Garcia says Character.AI's recklessness in targeting children and the company's lack of safety features caused her son's untimely death. The lawsuit lists numerous complaints against Character.AI, including wrongful death and survivorship, negligence and intentional infliction of emotional distress.

According to court …