AI Can’t Replace Real Support for Teens
On September 2, 2025, OpenAI, the company behind ChatGPT, announced new safety protections for teenagers using its chatbot. The move came after the tragic story of a 16-year-old boy in California who discussed his plans to end his life with ChatGPT. Millions of people use ChatGPT every day, and for some teens it can feel like a friend who will always listen. But at the end of the day, ChatGPT is not a real person; it's a computer program that predicts words, not feelings. That's where OpenAI's plan comes in. The plan will allow parents to view their teen's chat history and see what they are using the chatbot for. I think this is a great idea, but it will never be perfect, because AI can't truly understand our feelings and intentions. To improve on the plan, OpenAI could make it easier to connect with people who understand how we feel.
OpenAI's new plan will let parents see how their kids use ChatGPT and get alerts if their teen seems upset. The company also says it will make it easier for users in distress to reach emergency services, and it will use a special version of its chatbot that is trained to be safer and respond more carefully when someone seems to need help. While these changes might help, experts say that parental controls are easy for teens to bypass, and simply watching for "distress" doesn't fix the underlying problem. As Robbie Torney of Common Sense Media put it, these controls are more like a Band-Aid than a real solution.
I think AI can help young people, but only if it's truly safe and if we remember its limits. AI can help us research topics we want to learn more about, and it's good that OpenAI is trying to protect teens, but talking to a chatbot is not the same as talking to a real person. Teens who are struggling need to be able to reach real counselors or hotlines, not just get computer-generated advice. I believe OpenAI should make it much easier to connect with real people from inside the app. AI can be a helpful tool, but it should never be the only support for kids who need help.
If we want to keep teens safe, we need more than just new features. We need real people ready to listen and help. AI is powerful, but it can’t replace human understanding and care.