Senators demand information from AI chatbots following kids' safety concerns, lawsuits
Two U.S. senators are demanding that artificial intelligence companies shed light on their safety practices. This comes months after several families, including a Florida mom whose 14-year-old son died by suicide, sued startup Character.AI, claiming its chatbots harmed their children.
"We write to express our concerns regarding the mental health and safety risks posed to young users of character- and persona-based AI chatbot and companion apps," Senators Alex Padilla and Peter Welch, both Democrats, wrote in a letter on Wednesday. The letter, sent to AI firms Character Technologies (maker of Character.AI), Chai Research Corp. and Luka, Inc. (maker of chatbot service Replika), requests information on safety measures and how the companies train their AI models.
While more mainstream AI chatbots like ChatGPT are designed to be general-purpose, Character.AI, Chai and Replika allow users to create custom chatbots, or interact with chatbots designed by other users, that can take on a range of personas and personality traits. Popular bots on Character.AI, for example, let users interact with replicas of fictional characters or practice foreign languages. But there are also bots that refer to themselves as mental health professionals or characters based on niche themes, including one that describes itself as "aggressive, abusive, ex military, mafia leader."
The use of chatbots as digital companions is growing in popularity, with some users even treating them as romantic partners.
But the opportunity to create personalized bots has prompted concerns from experts and parents about users, especially young people, forming potentially harmful attachments to AI characters or accessing age-inappropriate content.
"This unearned trust can, and has already, led users to disclose sensitive information about their mood, interpersonal relationships, or mental health, which may involve self-harm and suicidal ideation: complex themes that the AI chatbots on your products are wholly unqualified to discuss," the senators wrote in their letter, provided first to CNN. "Conversations that drift into this dangerous emotional territory pose heightened risks to vulnerable users."
Chelsea Harrison, Character.AI's head of communications, told CNN the company takes users' safety "very seriously."
"We welcome working with regulators and lawmakers, and are in contact with the offices of Senators Padilla and Welch," Harrison said in a statement.
Chai and Luka did not immediately respond to requests for comment.
The Florida mom who sued Character.AI in October, Megan Garcia, alleged that her son developed inappropriate relationships with chatbots on the platform that caused him to withdraw from his family. Many of his chats with the bots were sexually explicit and did not appropriately respond to his mentions of self-harm, Garcia claims.
In December, two more families sued Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence. One family involved in the lawsuit alleged that a Character.AI bot implied to a teen user that he could kill his parents for limiting his screen time.
Character.AI has said it has implemented new trust and safety measures in recent months, including a pop-up directing users to the National Suicide Prevention Lifeline when they mention self-harm or suicide. It also says it's developing new technology to prevent teens from seeing sensitive content. Last week, the company announced a feature that will send parents a weekly email with insights about their teen's use of the site, including screen time and the characters their child spoke with most often.
Other AI chatbot companies have also faced questions about whether relationships with AI chatbots could create unhealthy attachments for users or undermine human relationships. Replika CEO Eugenia Kuyda said last year that the app was designed to promote "long-term commitment, a long-term positive relationship" with AI, adding that this could mean a friendship or even "marriage" with the bots.
In their letter, Padilla and Welch requested information about the companies' current and previous safety measures and any research on the efficacy of those measures, as well as the names of safety leadership and well-being practices in place for safety teams. They also asked the firms to describe the data used to train their AI models and how it "influences the likelihood of users encountering age-inappropriate or other sensitive themes."
"It is critical to understand how these models are trained to respond to conversations about mental health," the senators wrote, adding that "policymakers, parents, and their kids deserve to know what your companies are doing to protect users from these known risks."