


If you or someone you know may be considering suicide, contact the 988 Suicide & Crisis Lifeline by calling or texting 988, or the Crisis Text Line by texting HOME to 741741.
Text messages, Instagram posts and TikTok profiles. Parents often warn against sharing too much information online, worried about how it all might be used. But one Texas high school student wants to use that digital footprint to save lives.
Siddhu Pachipala is a senior at The Woodlands College Park High School, in a suburb outside Houston. He’s been thinking about psychology since seventh grade, when he read Thinking, Fast and Slow by psychologist Daniel Kahneman.
Concerned about teen suicide, Pachipala saw a role for artificial intelligence in catching warning signs before it’s too late. In his view, it takes too long for kids to get help when they’re suffering.
Early warning signs of suicide, such as persistent despair, mood swings and changes in sleep patterns, are often missed by loved ones. “So it’s hard to get people to show up,” Pachipala says.
For a local science fair, he designed an app that uses AI to scan text for signs of suicide risk. He thinks it could one day help replace outdated diagnostic methods.
“Our writing can reflect what we’re thinking, but that hasn’t really been extended to this extent,” he said.
The app has won him national recognition, a trip to D.C. and a speech on behalf of his peers. It’s one of many efforts underway to use AI to help young people with their mental health and to better identify when they’re at risk.
This type of AI, called natural language processing, has been around since the mid-1990s. And it’s not a panacea. “Machine learning is helping us get better. As we get more and more data, we’re able to improve the system,” said Matt Nock, a professor of psychology at Harvard University who studies self-harm in young people. But chatbots may not be a silver bullet.
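To make the general technique concrete: natural language processing for risk screening typically means turning text into word-level features and scoring it against labeled examples. The sketch below is not Pachipala’s model or code — it is a toy bag-of-words Naive Bayes classifier on made-up sentences, included only to illustrate the basic idea of scoring text for risk-related language.

```python
# Toy illustration of NLP text classification (NOT any real clinical model).
import math
from collections import Counter

def train(samples):
    """samples: list of (text, label) pairs. Returns per-label word and token counts."""
    counts = {"risk": Counter(), "neutral": Counter()}
    totals = Counter()
    for text, label in samples:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def score(text, counts, totals):
    """Log-probability ratio of 'risk' vs 'neutral', with add-one smoothing."""
    vocab = set(counts["risk"]) | set(counts["neutral"])
    log_ratio = 0.0
    for word in text.lower().split():
        p_risk = (counts["risk"][word] + 1) / (totals["risk"] + len(vocab))
        p_neut = (counts["neutral"][word] + 1) / (totals["neutral"] + len(vocab))
        log_ratio += math.log(p_risk / p_neut)
    return log_ratio  # > 0 means the text reads closer to the "risk" examples

# Tiny hypothetical training set, purely for illustration
samples = [
    ("i feel hopeless and alone", "risk"),
    ("nothing matters anymore", "risk"),
    ("had a great day with friends", "neutral"),
    ("excited about the science fair", "neutral"),
]
counts, totals = train(samples)
print(score("i feel so hopeless", counts, totals) > 0)
```

Real systems use far larger datasets and far richer models, but the core loop — features from words, probabilities from labeled text — is the same.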
Personalized tools like Pachipala’s could help fill the gap, says Nathan Demers, a Colorado-based psychologist who oversees mental health websites and apps. “When you walk into CVS, there’s that blood pressure cuff,” Demers said. “And that might be the first time someone realizes, ‘Oh, I have high blood pressure. I had no idea.’”
He hasn’t seen Pachipala’s app, but he theorizes that inventions like it can raise self-awareness about underlying mental health issues that might otherwise go unrecognized.
Building a SuiSensor
Pachipala set out to design an app that someone could download to self-assess their suicide risk. They could use the results to advocate for their care needs and to connect with providers. After many late nights of coding, he had SuiSensor.

[Photo: Siddhu Pachipala. Chris Ayers Photography/Society for Science]
Using sample data from a medical study, based on journal entries by adults, Pachipala said SuiSensor predicted suicide risk with 98% accuracy. Although only a prototype, the app could also generate a contact list of local clinicians.
In the fall of his senior year, Pachipala entered his research in the Regeneron Science Talent Search, the 81st annual national science and math competition.
There, panels of judges grilled him on his knowledge of psychology and general science with questions like: “Tell me how you boil pasta... OK, now let’s say we brought that to space. Now what?” Pachipala recalled. “You come out of those panels battered and bruised, but better for it.”
He placed ninth overall at the competition and took home a $50,000 prize.
The judges said his work “shows that the semantics in an individual’s writing can be related to their psychological health and suicide risk.” Although the app isn’t currently downloadable, Pachipala hopes to keep working on it as an undergraduate at MIT.
“I think we don’t do that enough: trying to solve [suicide intervention] from a creative standpoint,” he said. “I think we’ve waited too long.”
Current AI mental health applications
How does his invention fit into broader efforts to use AI in mental health? Experts note that many such efforts are underway, and Matt Nock, for one, expressed concern about false alarms. He applies machine learning to electronic health records to identify people at risk of suicide.
“The majority of our predictions are false positives,” he said. “Is there a cost there? Does it do harm to tell someone that they’re at risk of suicide when really they’re not?”
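Nock’s concern follows from base rates: when the condition being screened for is rare, even a very accurate test produces mostly false positives. A back-of-the-envelope sketch, using entirely hypothetical numbers (not figures from any study cited here):

```python
# Why "the majority of our predictions are false positives": a base-rate
# illustration with assumed, hypothetical numbers.
population = 10_000
prevalence = 0.01      # assumed: 1% of students truly at risk
sensitivity = 0.98     # assumed: the screen flags 98% of at-risk students
specificity = 0.98     # assumed: the screen clears 98% of everyone else

at_risk = population * prevalence                              # 100 students
true_positives = at_risk * sensitivity                         # ~98 correct flags
false_positives = (population - at_risk) * (1 - specificity)   # ~198 wrong flags

precision = true_positives / (true_positives + false_positives)
print(f"{false_positives:.0f} of {true_positives + false_positives:.0f} "
      f"flags are false positives (precision {precision:.0%})")
```

Under these assumptions, roughly two out of every three students the screen flags are not actually at risk — exactly the trade-off Nock is weighing.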
And data privacy expert Elizabeth Laird has particular concerns about implementing such approaches in schools, given the lack of research. She directs the Equity in Civic Technology Project at the Center for Democracy & Technology (CDT).
“We have a mental health crisis and we need to do everything we can to prevent students from harming themselves,” she said, adding, “There is no independent evidence that these tools do that.”
All this focus on AI comes as youth suicide rates (and risk) continue to rise. Although the data lag, the Centers for Disease Control and Prevention (CDC) reports that suicide is the second leading cause of death for youth and young adults ages 10 to 24 in the U.S.
Efforts like Pachipala’s fit into a broad range of AI-backed tools available to monitor youth mental health, for clinicians and nonprofessionals alike. Some schools use activity-monitoring software that scans devices for warning signs that a student may harm themselves or others. One concern, though, is that once these red flags surface, that information can be used to discipline students rather than support them, and that discipline falls unevenly along racial lines, Laird said.

According to a survey Laird shared, 70% of teachers whose schools use data-tracking software said it is used to discipline students. Schools can stay within the bounds of student privacy law yet still fail to implement safeguards against unintended consequences, Laird said.
“The conversation around privacy has shifted from one of legal compliance to what is ethical and right,” she says. She pointed to survey data showing that nearly 1 in 3 LGBTQ+ students report being outed, or knowing someone who was outed, as a consequence of activity-monitoring software.
Harvard researcher Matt Nock sees a place for AI in crunching numbers. He uses machine learning, similar in kind to Pachipala’s, to analyze medical records. But he stresses that much more experimentation is needed to vet computational assessments.
“A lot of this work is well thought out, trying to use machine learning, artificial intelligence to improve people’s mental health ... but until we do the research, we don’t know if this is the right solution,” he said.
Many students and families turn to schools for mental health support. Software that scans young people’s words, and by extension their thoughts, is one approach to taking the pulse of youth mental health. But it can’t take the place of human interaction, Nock said.
“Technology is going to help us, we hope, get better at knowing who is at risk and knowing when,” he said. “But people want to see humans; they want to talk to humans.”