Friday, May 30, 2025

Technology gone wrong: Series: (Part 2): Sewell Setzer III: From our 'They didn't teach me how to deal with an issue like this one in law school' department: Can the mother of a 14-year-old Florida boy sue the creator of a chatbot she alleges is connected to his death by suicide? ABC7 News. (Previously reported: "Sewell Setzer III had been chatting for months with a chatbot he called "Daenerys Targaryen," after the Game of Thrones character. His mother says that although he knew he was not chatting with a real person, he became emotionally attached to the bot and sank into isolation and depression before taking his own life.")



BACKGROUND: "His mother is suing Menlo Park-based "Character Technologies, Inc" -- which created the custom chatbot service CharacterBot AI. The lawsuit claims Character Technologies was reckless by offering minors access to lifelike companions without proper safeguards."

https://abc7news.com/post/silicon-valley-based-company-character-ai-sued-florida-14-year-olds-suicide/15461471/

------------------------------------------------------------------

PASSAGE OF THE DAY: "His mother is suing Menlo Park-based Character Technologies, Inc., which created the custom chatbot service Character.AI. The lawsuit claims Character Technologies was reckless by offering minors access to lifelike companions without proper safeguards."

------------------------------------------

STORY: "Federal judge allows lawsuit that blames Bay Area company's AI chatbot for teen's death," by Reporter Tara Campbell, published by ABC7 News, on May 22, 2025. (Tara Campbell is a national Edward R. Murrow Award winning journalist currently reporting for KGO-ABC7 News Bay Area, where she focuses on lifting the voices of the marginalized and misunderstood."

SUB-HEADING: "The judge also made way for Garcia to move forward in holding Google accountable for its role in helping develop Character AI;

SUB-HEADING: "The mother of a 14-year-old Florida boy is suing a Silicon Valley-based Character.AI, saying its chatbot is connected to his death by suicide."

GIST: "SAN FRANCISCO (KGO) -- Wednesday marked a legal victory for Megan Garcia, who last year sued a Silicon Valley AI company saying its chatbot is connected to her 14-year-old son's death by suicide.

"I actually happened to be on the phone with my client when I saw the decision," said Meetali Jain, executive director of the Tech Justice Law Project. "Shock. Relief. Feeling like we were witnessing a historic moment for this particular sector."

Character Technologies, the company behind Character AI, tried to get the case dismissed but a federal judge on Wednesday rejected the company's arguments that its chatbots are protected by the First Amendment.

The lawsuit filed in Florida court claims the AI company was reckless by offering minors access to lifelike companions without proper safeguards.

"The legal arguments were hard, but that's only because they were novel, that there was very little precedent that guided us," said Jain. "On the First Amendment, you know, there hasn't been a case that looks at whether the outputs of an LLM are protected speech."

"AI is the new frontier in technology, but it's also uncharted territory in our legal system," said Steven Clark, legal analyst. "You'll see more cases like this being reviewed by courts trying to ascertain exactly what protections AI fits into."

The judge also made way for Garcia to move forward in holding Google accountable for its role in helping develop Character AI.

In a statement, a Google spokesperson wrote: "We strongly disagree with this decision. Google and Character AI are entirely separate, and Google did not create, design, or manage Character AI's app or any component part of it."

"This is a cautionary tale both for the corporations involved in producing artificial intelligence," said Clark. "And, for parents whose children are interacting with chatbots.""

--------------------------------------------------------------

The entire story can be read at:



PUBLISHER'S NOTE: I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher: The Charles Smith Blog.

SEE BREAKDOWN OF SOME OF THE ONGOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG, AT THE LINK BELOW: HL:


https://www.blogger.com/blog/post/edit/120008354894645705/4704913685758792985


———————————————————————————————


FINAL WORD: (Applicable to all of our wrongful conviction cases): "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."

Lawyer Radha Natarajan:

Executive Director: New England Innocence Project;


—————————————————————————————————


FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions. They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!


Christina Swarns: Executive Director: The Innocence Project;

-----------------------------------------------------------------