AI Chatbots in Schools: A Double-Edged Sword for Teen Mental Health
In the quiet of a Florida evening, a school counselor’s phone buzzed with an alert. Brittani Phillips, a middle school counselor in Putnam County, received a “severe” notification from an AI-powered therapy platform used by students after school hours. The system had flagged a student’s chat as potentially dangerous. Phillips immediately sprang into action, contacting the student’s mother and even calling the police. The next day, she learned the student was safe—now a thriving ninth grader who greets her in the hallways. For Phillips, this incident underscored the potential of AI to intervene in crises, but it also raised questions about the role of technology in teen mental health.
As schools across the U.S. grapple with budget shortfalls and a shortage of mental health professionals, AI-powered tools like Alongside are stepping in. Alongside, used by over 200 schools, offers students a chatbot named Kiwi to talk through their problems and build emotional resilience. For rural schools, it’s a lifeline, providing access to mental health resources that would otherwise be out of reach. But as AI becomes a cornerstone of the Trump administration’s education agenda, concerns are mounting.
The Allure of AI for Teens
Why are students so comfortable confiding in AI? Experts say it’s a mix of familiarity and convenience. Teens, who’ve grown up with chat interfaces on social media, find AI less intimidating than talking to a human counselor. They can text instead of call, avoid judgment, and access help anytime without scheduling an appointment. “It’s almost more natural than interacting with another human being,” says Sarah Caliboso-Soto, a licensed clinical social worker.
But this comfort comes with risks. AI lacks the discernment of a human clinician—it can’t read body language, hear tone, or catch subtle cues. While it can be a “first line of defense,” experts warn it shouldn’t replace human connection. “You can’t replace human judgment,” Caliboso-Soto adds.
The Privacy and Ethical Dilemma
Privacy experts are also sounding the alarm. Unlike conversations with licensed therapists, AI chats don’t carry the same legal protections. In an era of heightened concerns about student privacy and police involvement, the use of these tools raises “messy” ethical questions. Even with human oversight, the potential for misuse or overreach is significant.
Sam Hiner, executive director of Young People’s Alliance, argues that AI can’t fill the void of human connection. “Can you think of another time in history when people have been so lonely?” he asks. His organization advocates for rebuilding human communities rather than relying on AI as a crutch. They’re particularly concerned about “parasocial relationships,” where students develop one-sided emotional attachments to chatbots. Hiner warns that AI should never convey emotions like “I’m proud of you,” as this encourages unhealthy attachment.
The Reality of AI in Schools
For Phillips, the AI tool has been a game-changer. It frees her to focus on students in crisis while the chatbot handles routine issues like breakups or homework stress. But it’s not without its quirks. Some students test the system, typing provocative statements to see if anyone is listening. Phillips has learned to read body language and context to determine whether a comment is serious or just teenage humor.
Despite these challenges, Phillips sees the value in the tool. “The number of boys who test the system goes down every year,” she says, suggesting that students are learning to trust the process.
The Future of AI in Education
As AI continues to evolve, its role in education remains contentious. While it offers unprecedented access to mental health resources, it also raises questions about privacy, ethics, and the nature of human connection. For now, schools like Phillips’ are navigating this new frontier, balancing the benefits of AI with the irreplaceable value of human interaction.
Tags:
AI in schools, mental health, teen therapy, chatbots, Alongside, Kiwi, Putnam County, Brittani Phillips, privacy concerns, parasocial relationships, rural education, emotional resilience, crisis intervention, EdSurge, Young People’s Alliance, Sam Hiner, Sarah Caliboso-Soto, Linda Charmaraman, Trump education agenda, telehealth, social-emotional learning.