This school year, the district updated its Artificial Intelligence (AI) policies, detailing responsible use by teachers and students and advising that AI be restricted within classrooms.
The district provided an “AI Best Practices” document to help teachers navigate using AI to enhance teaching and learning. On the learning side, the document emphasizes academic honesty so that students continue to learn effectively.
“As an English teacher, [I realize] students need to learn how to write and we can’t hand that over to AI because writing involves so much more than grammar,” Lisa Callender, an AP Seminar and AP English Literature teacher, said. “It involves critical thinking and organization of thoughts and effective communication. So it’s not a task that we can just hand over to AI.”
Nevertheless, AI is a useful tool — improving classroom efficiency, reducing administrative tasks and helping students learn responsibly.
“In my seminar class, you know, there is a little bit of giving them independence, which is frightening,” Callender said. “But, at the same time, it’s kind of arming them, showing them how AI can be really effective in their research, how it can help them outline, and how it can help them find sources.”
Richard Robinette, who teaches AP Computer Science A, Computer Programming and Machine Learning, expands on the benefits and drawbacks of AI.
“[AI is] such an excellent education tool,” Robinette said. “I mean, it’s as effective as an education tool as it is a cheating tool, which is unfortunately what it gets used for the most. But if you’re trying to learn something without formal instruction, it’s a new world. It saves me days of work.”
After the rise of COVID-19 in 2020, teachers noticed students’ continued reliance on technology as well as a surge in cheating on both homework and assessments.
“In 2020 with COVID, cheating just really blossomed and burgeoned,” Callender said. “I mean, we’ve always had cheating, but I think in 2020 with kids sitting at home on the computer and having access to everything, you know, there was this temptation to cheat more often. We’re kind of bearing the fruits of that unfortunately.”
In response, teaching styles have changed, with Robinette and Callender “moving away from homework-based tasks” and moving towards “independent, on-paper, in-class assignments.”
“AI is only as good as what’s out there, right?” Callender said. “So, you know, there is a class that kind of teaches [about] it, and I think maybe in the future that’s really what we need. We need a course that will show us how to use AI effectively without giving over our brain … without giving over [our] skills.”
Aaron Nayki (12), a student currently taking the Machine Learning class, commented on AI usage within the class.
“When it comes to AI, I believe that there’s like ethical and unethical parts of it when it comes to the academic environment,” Nayki said. “For the unethical, it’s like using AI to cheat on your tests or using it to write your essays or even in Robinette’s, like Robinette deals with this a lot where students use AI to write their code.”
In previous years, the 2022 GPT-3.5 model showed hallucination rates as high as 40 percent on certain benchmarks, while by mid-2025, general-purpose models such as GPT-4o had rates near 1 percent, according to the Journal of Medical Internet Research. Thus, although AI hallucinations still occur, more advanced-reasoning chatbots like GPT-4o, Gemini 2.5 Pro and Claude 3.5 Sonnet provide more accurate information, making it easier for students to avoid doing their own work.
Robinette explains that we “live in a world where computers and robots” are performing tasks that used to be jobs. In the computer science (CS) realm especially, the U.S. tech sector has seen over 89,000 layoffs in 2025, with over 10,000 U.S. job cuts directly linked to generative AI and around 20,000 blamed on “technological updates,” according to Fortune Magazine.
“It’s a massive problem, honestly,” Robinette said. “I have a lot of friends in higher levels of industry where they’re hiring recent college graduates that have really good credentials and don’t actually know things and it’s horrifying … I tell my students: ‘I don’t care if you ever write another line of code when we leave here, but if your brain works better, it doesn’t matter what happens with the job market. If your brain works better, you’re going to be better off.’”
Aspiring computer science students worry about their future job prospects, but Nayki has a different perspective.
“A lot of my friends, they always tell me, ‘Oh, Aaron, why are you going into [computer science]? AI is going to take your job,’ right?” Nayki said. “What I think of it is that AI is not going to take our job. Rather [we] as employees, as workers, have to find a way where you can understand AI. Where we can learn to train AI. There’s always going to be jobs at the end of the day.”
Along with its effects on STEM subjects, generative AI also finds its way into creative subjects like art.
“[AI] can generate ideas so fast it’s kind of scary,” artist Lauren Gong (11) said. “Sometimes it feels like this temptation, because it’s right there, it can do so much, and it almost takes over your mind. But at the same time, that’s not really what art is about. AI should help you, but not become you … We can’t not use it. It’s the time that we live in, but it all matters how we use it.”
Thus, the changes in the district policy reflect not only the potential drawbacks and benefits AI brings to an academic environment, but also AI’s role in the larger scheme of students’ futures: employment.
“You hear some of the [creators] are already sending out warnings, the dangers [of AI],” Callender said. “In the past, the people who invented the atomic bomb spent their life trying to rally for peace, nuclear weapon disabling. Now, we have released Pandora’s Box, what can we do?”