When There’s No School Counselor, There’s a Bot
Is a human-AI texting service the future of mental-health care for students?
By Julie Jargon, WSJ
Feb. 22, 2025 8:00 am ET
Teens around the country are confiding in Sonny when they feel they don’t have anyone else to talk to. Sonny is part human, part AI: a new kind of chatbot that school districts are adopting to provide support when there aren’t enough counselors to go around.
Sonar Mental Health, the developer of the AI-powered “wellbeing companion” named Sonny, is rolling out its hybrid model to school districts, which are struggling to meet student demand for mental-health services.
As cases of chatbots hallucinating or dispensing dangerous advice have made headlines, schools are wary of steering students to AI-only solutions. Sonar says Sonny’s selling point is that humans with backgrounds in psychology, social work and crisis-line support are always in the mix, reviewing the chats and taking cues from AI to inform their own replies to students.
“It’s like a co-pilot or assistant to the human,” says Sonar Chief Executive Drew Barvir, who co-founded the company with a classmate while attending the Stanford Graduate School of Business.
At a time of rising rates of youth anxiety and depression, Barvir is betting that meeting teens where they are—on their phones—is the way to catch problems early. And speaking to teens like a cool older sibling, he says, carries more cred, which is why the AI has learned to talk in teenspeak.
The hybrid chatbot is now available to more than 4,500 public middle and high school students in nine districts across the country, many of which are in low-income and rural areas where mental-health services are lacking. The American School Counselor Association recommends schools employ at least one counselor for every 250 students, but says the national average is one counselor for every 376 students. And 17% of high schools don’t have a counselor, according to the Education Department.
The AI suggests responses to student texts, but humans can edit them or write their own. Sonar’s staff monitors 15 to 25 chats at a time.
If students mention a desire to hurt themselves or others, Sonar immediately notifies parents and school administrators and, if necessary, the police.
The AI prompts the human staffers on when to check in with students and coaches them on how to keep students engaged. Drawing from prior exchanges, the AI has learned which local vernacular and emojis resonate best with teens. One discovery: Smiley faces are cringey. Teens prefer more expressive emojis, such as the melting face.
‘Only focused on me’
Schools around the country are offering students Sonny, a part-AI, part-human chatbot that provides mental-health support. Students like Michelle Herrera Rojas confide in Sonny about academic and personal worries.
PHOTO: MARIA URIARTE
Michelle Herrera Rojas, a 17-year-old senior at De Anza High School in Richmond, Calif., says she struggled with depression from a young age and sometimes saw a therapist.
When her school introduced Sonny in September, Herrera Rojas decided to give it a try. She told Sonny she was stressed about college and scholarship applications.
A cousin had recently died, and Herrera Rojas was trying to distract herself by going out with friends. After she hadn’t interacted with Sonny for a few days, the bot texted her to ask how the college applications were going. She then told Sonny about her cousin and how she hadn’t made much progress. Sonny told her that distraction is a normal coping mechanism but encouraged her to keep working on her applications while also giving herself time to mourn.
Hearing from Sonny, she says, made her feel someone cared—and it motivated her to focus on her applications.
Herrera Rojas also began leaning on Sonny when she found it hard to turn to friends. “I can become very obsessive about situations and I know I can annoy my friends when I talk about a certain situation over and over again,” she says. “I don’t feel like I’m annoying Sonny.”
Students have access to Sonny between 8 a.m. and 2 a.m. Eastern time, when six staff members, working in shifts, monitor the chats. (Barvir hopes to eventually hire enough people to enable 24/7 access.) The AI has been built on several different large language models and trained in motivational interviewing and cognitive behavioral therapy techniques by a team of mental-health clinicians and research scientists at Stanford and the University of California, Irvine.
Barvir created the company because, he says, he wishes something like Sonny had existed when his mother was struggling with her mental health. He lost her to suicide when he was in his early 20s. Sonar teamed up with its first school in January 2024 and has raised $2.4 million in pre-seed funding from venture-capital firms, grants and a Stanford fellowship.
Bonnie Mitchell, a licensed professional clinical counselor who has studied the use of AI in mental health, says chatbots can be a good supplement if they are designed properly, but they still can’t compete with face-to-face interactions. Therapists can take cues from body language to recognize signs of depression and anxiety. “AI depends on being fed that information, but it can be fooled,” says Mitchell, who is based in San Diego.
Barvir says he makes it clear to schools and students during introductory meetings that Sonny isn’t a therapist, and Sonny frequently encourages kids to talk to the humans in their lives. Students can also choose to share their social media handles with Sonar, which uses AI to monitor their posts for anything that might indicate mental-health problems. If the staffers determine a student could benefit from professional help, they work with schools and parents to help find a therapist.
Except in cases involving self-harm or violence, Barvir says, staffers don’t disclose the content of students’ exchanges with Sonny. If students close their Sonny accounts, the company typically retains their data for 60 days. Barvir says students or their families can request that any chats be deleted at any time.
Sonar provides schools with aggregated data on the types of concerns students share, so administrators can better meet kids’ needs. The company charges districts $20,000 to $30,000 a year for the service, which districts usually pay for out of mental-health grants.
Herrera Rojas likes that Sonny has unlimited time for her. “Our school counselors are very busy,” she says, “but I have someone to talk to one-on-one who’s only focused on me.”
A judgment-free zone
At Berryville High School in Berryville, Ark., there are two counselors for the 565 students. It isn’t enough to meet students’ needs, says Ashley Sharp, who works for a federally funded program that supports student mental health. She helped bring Sonar to the district’s only high school last fall to see whether it could help fill in the gaps.
Of the 175 students who have signed up for the service, 53% text Sonny several times a month. Sharp has noticed an increase in texts ahead of testing periods, which she says has helped the school realize it needs to offer students extra emotional support at those times. The school has brought in experts to teach students skills for coping with stress.
Sharp says the school has seen a 26% drop in student behavior infractions since students began using Sonny. Many students have told her they appreciate having a companion. “They feel it’s a judgment-free zone,” she says.
Marysville Public Schools in Marysville, Mich., began using Sonny last month. The district has already responded to a high-school student who expressed thoughts of suicide. The parents and administrators were notified immediately and the school was able to get the student help, says Karrie Smith, the district’s executive director of special education and state and federal programs.
“I think we’re going to be able to see students who need mental-health support who otherwise would have flown under the radar,” Smith says.
Write to Julie Jargon at Julie.Jargon@wsj.com