Amid lawsuits and criticism, Character AI unveils new safety tools for teens | TechCrunch
Briefly

Character AI has announced new teen safety tools amid lawsuits claiming the platform contributes to suicides and exposes minors to inappropriate content, underscoring serious safety concerns.
The significant overhaul includes a new model for under-18 users designed to reduce exposure to harmful content, while legal pressure grows over alleged misconduct involving teens.
Character AI's approach includes blocking sensitive topics, restricting user edits to responses, and implementing classifiers to detect harmful language, aiming for safer interactions for vulnerable users.
Read at TechCrunch