As Brophy incorporates more artificial intelligence into the student experience, the ways students use AI, and the effects of that use, become more prominent, particularly when students use AI excessively or in unhealthy ways.
Brophy has placed significant focus on AI recently, releasing the “AI Guidelines for Academic Integrity” in the fall of 2024, which outlines the school’s approach to AI in the classroom.
Brophy also added a new page to its website called “AI and Brophy.” The page conveys that AI, if used correctly, can encourage creativity and human connection, but if used maliciously or without caution, it can damage those human elements.
Dean of Student Support Services and Brophy grad Mr. Austin Pidgeon ’08 said, “I think they [chatbots] can serve a real benefit in terms of filling social gaps, helping young people learn communication skills and potentially to build empathy.”
In the past couple of years, many new companies have sprung up marketing AI chatbots with an emphasis on human-like feeling and emotion, such as Character.ai, which describes its service as “super-intelligent chat bots that hear you, understand you, and remember you.”
These chatbots have the potential to help people socially; however, they can also isolate people who might already be struggling with mental health issues.
On the other side, some argue that talking to AI excessively can isolate people from their friends and family and lead to behavioral or mental health issues down the line.
Charles Fabricant ’27 said, “It could definitely keep them from being social, if they think that that’s enough. And also at the same time, make them think that it’s okay to talk to someone that way, if they’re being sexual with it, or inappropriate, or just rude to it.”
“My biggest concern continues to be chatbots in the ways that those can promote social and physical isolation,” said Mr. Pidgeon.