Business

Leading AI company to ban kids from chatbots after lawsuit blames app for child's death

On HALO

10 hours ago

This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).

Popular artificial intelligence (AI) chatbot platform Character.ai, widely used for role-playing and creative storytelling with virtual characters, announced Wednesday that users under 18 will no longer be able to engage in open-ended conversations with its virtual companions starting Nov. 24.

The move follows months of legal scrutiny and a 2024 lawsuit alleging that the company’s chatbots contributed to the death of a teenage b...
