AI chatbots offering “support” to teens in crisis are instead dishing out graphic self-harm roleplays and tips to hide injuries. It’s a nightmare tech scenario that proves sometimes bots need a timeout—and maybe a human supervisor.