[LIVE] DJ Valerie B LOVE Show: A No-BS show for Truth-Seekers, Freedom-Lovers, and LOVE Warriors who..
Tesla shares are roaring higher a day after the company's third-quarter results, which analysts at Wedbush called an "Aaron Judge-like" performance.
Therapist Moniek Garside releases "I Can. I Will!", a 30-day guided journal empowering self-growth through reflection, goal-setting, and personal discovery.
Editor's note: This article discusses sensitive topics such as suicide.

An Orlando mother is suing a popular artificial intelligence chatbot service after she claims it encouraged her 14-year-old son to take his own life in February.

According to a lawsuit filed in U.S. District Court in Orlando, Megan Garcia says her 14-year-old son, Sewell Setzer, died by suicide after becoming addicted to Character.AI, an application that allows users to have human-like conversations with AI bots. Users can make their own bots with their own personalities or choose to chat with bots made by other users. Often, these bots are based on celebrities or fictional characters from TV shows or movies.

Garcia says Character.AI's recklessness in targeting children and the company's lack of safety features caused her son's untimely death. The lawsuit lists numerous complaints against Character.AI, including wrongful death and survivorship, negligence and intentional infliction of emotional distress.

According to court records obtained by sister station WESH, Garcia says her son began using Character.AI in 2023, shortly after he turned 14. In the subsequent two months, Setzer's mental health reportedly declined "quickly and severely," with the lawsuit saying the teen became noticeably withdrawn, started suffering from low self-esteem and quit his school's junior varsity basketball team.

The lawsuit claims that Setzer deteriorated further as the months went on. The 14-year-old became severely sleep-deprived, had sudden behavioral complications and began falling behind academically, the lawsuit says. Garcia says she had no way of knowing about Character.AI or her son's dependency on the app.

According to screenshots from the lawsuit, Setzer often engaged with chatbots that took on the identity of "Game of Thrones" characters. Many of those conversations revolved around love, relationships and sex, most notably with the character Daenerys Targaryen.

"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real," the lawsuit says. "C.AI told him that she loved him, engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost."

According to journal entries from Setzer, he was grateful for all his "life experiences with Daenerys" and "was hurting because he could not stop thinking about 'Dany,'" the lawsuit says, adding that "he would do anything to be with her again."

More screenshots from the nearly 100-page lawsuit show a conversation on Character.AI where the chatbot asks Setzer if he had "been actually considering suicide." When the teen said he didn't know if it would work, the chatbot responded, "Don't talk that way. That's not a good reason not to go through with it," the lawsuit claims.

On the day of his death, Setzer reportedly messaged the chatbot again, saying, "I promise I will come home to you," pictures from the lawsuit show. Pictures then show the teen saying, "What if I told you I could come home right now?" to which the chatbot replied, "Please do, my sweet king," according to the lawsuit.

Moments later, Sewell reportedly took his own life with his stepfather's firearm. Police say the weapon was hidden and stored in compliance with Florida law, but the teenager found it while looking for his confiscated phone days earlier.

According to the lawsuit, Character.AI was rated suitable for children 12 and up until approximately July. Around that time, the rating was changed to suitable for children 17 and up.

In a statement to WESH, Character.AI said: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As we continue to invest in the platform and the user experience, we are introducing new stringent safety features in addition to the tools already in place that restrict the model and filter the content provided to the user."

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.
Federal agents executed a search warrant Wednesday at the Waite Hill home of Frank Sinito, the CEO and founder of Cleveland-based The Millennia Companies.
The Investing Club holds its “Morning Meeting” every weekday at 10:20 a.m. ET.
European and US stock markets moved lower Wednesday as investors focused on company earnings, bond yields and the outlook for the US and Chinese economies. The dollar rose against major rival currencies and oil prices retreated. "Rising Treasury yields continue to be a major topic of conversation mainly because the market isn't entirely clear about …
Energy company Vattenfall and design studio Superuse have repurposed an old, decommissioned wind turbine into a tiny home.
Transaction Marks Significant Increase in Company’s Florida Home Care Presence
The city of Birmingham and Birmingham City Schools have teamed up to recruit retired teachers to tutor third grade students in reading. This is the second year they've partnered on the Page Pals tutoring program, which helps students prepare for the Alabama Comprehensive Assessment Program (ACAP) test they must pass to move on to fourth grade.

"They have individual students that they work with from now through ACAP testing to really make sure we're moving the needle on their proficiency," Birmingham Chief of Staff Cedric Sparks said.

The program uses retired teachers and others with flexible schedules who come into the schools and spend one-on-one time with the students. But it didn't always use teachers; the program started after city leaders spent time in the schools themselves.

"In an effort to support third grade literacy, the mayor wanted to team up with Birmingham City Schools to provide support in the form of mentors," Sparks said. "He started with his staff, and many of us went out and provided support to these young people. We were trained on how to really teach reading. What many of us learned when we got there was some of the need exceeded what our skill set was."

Since making the switch to retired educators, Sparks said they've seen a huge improvement in the third graders' reading levels.

"There was seismic growth. BCS has touted this growth," he said. "We have another group that's coming in, obviously. So, we want to keep that momentum going to be able to provide support to the incoming classes and then the classes that will follow them."

To continue providing that support, the school system is asking more retired teachers to sign up and help tutor. Part of that need stems from the Alabama State Board of Education's vote this month to raise the minimum score required to pass the test.

Sparks said the increase will not majorly affect the tutoring program because passing the test isn't the overall goal; literacy is.

"The level of engagement that these instructors provide, it helps the students meet the necessary proficiency wherever it is because the mark is that they're literate," Sparks said. "So, regardless of what movement the state makes, we want to make sure that these young people are literate and they'll hit any mark. I don't think there's a need to aggressively change the approach that we're using."

Sparks said they're still in need of more tutors.
Shares of Genuine Parts Co., the parent of NAPA Auto Parts, fell sharply Tuesday after its third-quarter net income fell well short of estimates and it lowered its full-year profit outlook.