Artificial intelligence (AI) is already reshaping classrooms, and school leaders are beginning to grapple with how to integrate it responsibly. At the AFSA Convention, a lively panel brought together principals, educators, and technology experts to explore AI’s potential, ethical implications, and the need to keep human-centered leadership at the forefront.
Moderated by Craig DiFolco, communications director for the Council of School Supervisors and Administrators (CSA), AFSA Local 1, the session opened with DiFolco acknowledging that AI may help educators in powerful ways, but that nothing can replace the role of principals and teachers in connecting with students.
DiFolco emphasized that the goal of the session was dialogue, not monologue, and introduced five panelists who shared insights across different areas of AI in schools.
Gavin Craig, assistant principal at Torrington Intermediate School in Connecticut and a former school psychologist, described AI as a “thought partner” rather than a replacement for educators. “The danger is when students see AI as an answer machine,” Craig said. “The opportunity is when they use it as a thought partner. AI can help students brainstorm, generate options, and then push them to refine their own thinking.”
Craig also described how AI can assist educators with routine tasks while deepening engagement with students and staff. “It’s not something that replaces me, but actually augments my thought process,” he explained. “It frees up time so I can focus on what matters most—relationships and student learning.”
Kenneth Shelton, a lifelong educator and technology strategist, framed AI as a form of literacy, not just a passing trend. Drawing on his work, including his book The Promise and Perils of AI in Education, Shelton stressed that AI has always been part of our lives—from spellcheck tools to email filters—and urged school leaders to consider the social and psychological dynamics behind resistance.
He coined the term “tequity” to describe the combination of effective educational technology with culturally responsive learning. “Equitable AI use requires attention to access, academic rigor, and student agency,” Shelton said. He emphasized that leaders must cultivate an ethical mindset, not just rules, when integrating AI in schools, warning that “AI can inadvertently reinforce inequities if we’re not thoughtful.”
Peter Michelson, an elementary school principal also from Torrington Intermediate School, shared how AI has helped him refine communication with families and staff. “AI helps me make my communication clearer and more palatable,” Michelson said, giving examples from everyday school life—from responding to concerned parents to creating multilingual video updates for families. He also highlighted the value of designing AI-proof assignments for younger learners and of teaching students to critically review AI-generated content.
James Ulrich, principal of Argyle Middle School, a tech magnet in Montgomery County, Maryland, likened AI to an electric bike: students need to know how to ride a regular bike before leveraging AI’s power. “AI is everywhere, but it’s a tool. Students need to know when to use it, and when to go without it,” he said. Ulrich emphasized that AI should amplify existing instructional pathways, such as digital media and cybersecurity, rather than replace foundational learning.
The panel explored how bias can manifest in AI and the importance of ethical integration. Craig and Ulrich described examples of bias, such as AI models making assumptions based on students’ names or language proficiency. DiFolco shared a social experiment in which a language model assumed students struggling with English had ethnic-sounding names—a reminder that AI reflects the biases baked into its training.
Ulrich stressed that relationships are the best safeguard against misuse: “The best AI detector is a great relationship. You have to know your kids.” He added that punitive approaches, like giving a zero for AI use, miss opportunities to teach AI literacy and ethics.
Panelists also addressed the natural resistance to AI adoption. Shelton encouraged leaders to guide communities from skepticism to understanding, acting as “torchbearers” for responsible AI integration. “We can’t let AI do the thinking for us,” Shelton warned. “We need to ensure we’re using these tools to expand critical thinking, not diminish it.”
Erlenwein highlighted how AI can free leaders from routine digital tasks, allowing them to focus on meaningful human interactions. “We didn’t get hired to sit behind a desk,” he said. “AI gives me more time to interact with students, faculty, and staff—the true sources of educational magic.”
Panelists agreed that school leaders must be proactive in creating policies, professional development, and ethical frameworks to guide AI use. Erlenwein summarized:
“Leaders need to help shape how AI is used in schools so that it supports—not undermines—our mission to serve students.” Shelton added, “The tools are here. The question is whether we’ll use them to deepen learning and equity, or whether we’ll let them become another distraction. That choice is up to us.”
DiFolco closed the session by returning to the human-centered focus of leadership: AI is powerful, but it’s not magic. It doesn’t replace the relationships we build or the judgment calls we make as educators. That’s where our humanity matters most.
