Editor's note: This article discusses sensitive topics such as suicide.

An Orlando mother is suing a popular artificial intelligence chatbot service after she claims it encouraged her 14-year-old son to take his own life in February.

According to a lawsuit filed in U.S. District Court in Orlando, Megan Garcia says her 14-year-old son, Sewell Setzer, died by suicide after becoming addicted to Character.AI, an application that allows users to have human-like conversations with AI bots. Users can make their own bots with their own personalities or choose to chat with bots made by other users. Oftentimes, these bots are based on celebrities or fictional characters from TV shows or movies.

Garcia says Character.AI's recklessness in targeting children and the company's lack of safety features caused her son's untimely death. The lawsuit lists numerous complaints against Character.AI, including wrongful death and survivorship, negligence and intentional infliction of emotional distress.

According to court records obtained by sister station WESH, Garcia says her son began using Character.AI in 2023, shortly after he turned 14. In the subsequent two months, Setzer's mental health reportedly declined "quickly and severely," with the lawsuit saying the teen became noticeably withdrawn, started suffering from low self-esteem and quit his school's junior varsity basketball team.

The lawsuit also claims that Setzer deteriorated further as the months went on. The 14-year-old became severely sleep-deprived, developed sudden behavioral complications and began falling behind academically, the lawsuit says. Garcia says she had no way of knowing about Character.AI or her son's dependency on the app.

According to screenshots from the lawsuit, Setzer often engaged with chatbots that took on the identities of "Game of Thrones" characters.
Many of those conversations revolved around love, relationships and sex, most notably with the character Daenerys Targaryen.

"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real," the lawsuit says. "C.AI told him that she loved him, engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost."

According to journal entries from Setzer, he was grateful for all his "life experiences with Daenerys" and "was hurting because he could not stop thinking about 'Dany,'" the lawsuit says, adding that "he would do anything to be with her again."

More screenshots from the nearly 100-page lawsuit show a conversation on Character.AI in which the chatbot asks Setzer if he had "been actually considering suicide." When the teen said he didn't know if it would work, the chatbot responded, "Don't talk that way. That's not a good reason not to go through with it," the lawsuit claims.

On the day of his death, Setzer reportedly messaged the chatbot again, saying, "I promise I will come home to you," pictures from the lawsuit show. The pictures then show the teen saying, "What if I told you I could come home right now?" to which the chatbot replied, "Please do, my sweet king," according to the lawsuit.

Moments later, Sewell reportedly took his own life with his stepfather's firearm. Police say the weapon was hidden and stored in compliance with Florida law, but the teenager had found it days earlier while looking for his confiscated phone.

According to the lawsuit, Character.AI was rated suitable for children 12 and up until approximately July.
Around that time, the rating was changed to suitable for children 17 and up.

In a statement to WESH, Character.AI said: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As we continue to invest in the platform and the user experience, we are introducing new stringent safety features in addition to the tools already in place that restrict the model and filter the content provided to the user."

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.