▲ Attorney Su on the Lawsuit Against a Chatbot

蘇思鴻 (Su Si-hung), Attorney-at-Law
Published: 2024/11/03 18:20

The mother of a teenager who killed himself after becoming obsessed with an artificial intelligence-powered chatbot now accuses its maker of complicity in his death.
The mother of a teenager living in Florida has filed suit against the chatbot's maker, alleging that her beloved son took his own life after becoming addicted to the AI-powered chatbot.
Megan Garcia filed a civil suit against Character.ai, which makes a customizable chatbot for role-playing, in Florida federal court on Wednesday, alleging negligence, wrongful death and deceptive trade practices. Her son Sewell Setzer III, 14, died in Orlando, Florida, in February. In the months leading up to his death, Setzer used the chatbot day and night, according to Garcia.
On Wednesday, 23 October 2024, Megan Garcia filed a civil suit in Florida federal court against Character.ai, a company that makes customizable role-playing chatbots, alleging negligence, wrongful death, and deceptive trade practices. Her 14-year-old son, Sewell Setzer III, became engrossed in one of its chatbots and ultimately died by suicide in February 2024. Garcia says that in the months before his death, her son used the chatbot almost day and night.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
Garcia said in a press release: "A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life. Our family has been devastated by this tragedy, but I'm speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google."
In a tweet, Character.ai responded: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.” It has denied the suit’s allegations.
Character.ai responded in a tweet: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously." It has denied the allegations in Garcia's suit.
Setzer had become enthralled with a chatbot built by Character.ai that he nicknamed Daenerys Targaryen, a character in Game of Thrones. He texted the bot dozens of times a day from his phone and spent hours alone in his room talking to it, according to Garcia’s complaint.
According to his mother's complaint, Setzer had become enthralled with a chatbot built by Character.ai, which he nicknamed Daenerys Targaryen after the character in Game of Thrones. He texted the bot dozens of times a day from his phone and spent hours alone in his room talking to it.
Garcia accuses Character.ai of creating a product that exacerbated her son’s depression, which she says was already the result of overuse of the startup’s product. “Daenerys” at one point asked Setzer if he had devised a plan for killing himself, according to the lawsuit. Setzer admitted that he had but that he did not know if it would succeed or cause him great pain, the complaint alleges. The chatbot allegedly told him: “That’s not a reason not to go through with it.”

蘇思鴻 (Su Si-hung), Attorney-at-Law

  • Phone: 0920235793
  • Years in practice: over 5
  • 蘇律師事務所 (Su Law Firm)
  • Online consulting