by David Krupski
By November 2025, our school’s AI policy already felt dated.
I remember looking back at our school’s AI policy, a document we had taken seriously only months earlier, and realizing that it was no longer strong enough for the moment we were in. We had tried to be thoughtful. We had set boundaries, named risks, and written what we believed was a responsible first response to a technology moving faster than most schools were prepared for. But the pace of change was unforgiving. The tools had improved. Teacher use had widened. Student awareness had grown. The questions coming from staff were more sophisticated than when we first drafted the policy.
As a whole-school principal at a bilingual school in Shenzhen, I no longer see AI as something schools can afford to observe from a distance. In my professional judgment, it is now a live leadership matter. Now, my question is whether we, as leaders, are prepared to govern it with enough clarity and discipline that it strengthens professional practice rather than weakens it.
The broader evidence points in the same direction. In CoSN’s 2025 survey, 80% of respondents said their districts had generative AI initiatives underway, while only 1% reported a complete ban. The RAND Corporation, a nonprofit research organization, also found that 25% of surveyed teachers used AI tools for instructional planning or teaching during the 2023-2024 school year, and nearly 60% of surveyed principals reported using AI tools for their own work. These are U.S.-based findings, but they still suggest that AI has moved well beyond novelty.(1,2)
Student use reinforces the point. RAND reported in March 2026 that the share of middle school, high school, and college students using AI for homework rose from 48% in May 2025 to 62% in December 2025.
Those figures make evasion harder. Leaders can no longer speak about AI as though it sits outside the operational and academic life of the school. The leadership challenge, in my experience, is that its use inside schools is often uneven, underdeveloped, and poorly governed.
The Real Problem
In my view, the central problem is not AI itself. It is unmanaged use.
Schools are already living with uneven practice. Some teachers use AI intelligently. They question the output, refine it, adapt it, and subject it to professional judgment. Others use it too casually. They accept the first response because it sounds polished. Often, the weakness begins earlier: they do not know how to prompt with enough precision, they do not push the tool to be more critical, and they do not go back to check whether it has actually done what they asked. The result can appear efficient while being educationally thin, poorly matched to students’ needs, or simply not good enough.
A resource that looks efficient is not necessarily a resource that improves learning. A polished paragraph is not proof of good thinking. A generated lesson is not automatically a well-designed lesson. The danger for school leaders is that poor use can hide behind the appearance of competence.
The question I believe school leaders need to ask is this: ‘Are teachers using AI well enough to justify its place in professional practice?’
What We Are Doing Now
I am currently holding two weekly AI training sessions with teachers. In these, we address practical use, ethical judgment, safeguarding risks, weak prompting, overreliance, and the habit of accepting the first output too quickly. A major part of the work is geared towards helping teachers learn how to direct the tool with precision, ask it to be more critical, and then review whether the result is accurate and fit for purpose.
In addition, I speak regularly with our faculty about what must not be shared. Confidential student information, sensitive family matters, internal school documentation, and anything that compromises trust or privacy cannot be treated casually for the sake of convenience. If we fail there, speed benefits no one and becomes a failure of leadership.
The same RAND study also found that only 18% of surveyed principals said their schools or districts had provided guidance on AI use for staff, teachers, or students in 2023-2024. That figure resonates because it reflects a wider reality: in many settings, adoption has outpaced governance.
We are now working on an updated AI policy, as well as something larger: a whole-school AI integration plan for 2026. That matters more than a stand-alone policy. A serious integration plan has to clarify where AI belongs and where it does not, what staff need to know, what students need to learn, how parents will be informed, and how the school will review whether its practice is improving or drifting.
That, to me, is the difference between reacting to AI and leading it. Here are some principles guiding my thinking and the integration plan as it moves forward:
Treat Policy as a Living Document
If AI policy is too broad, it will not guide practice. If it is too rigid, it will age badly. If it is drafted once and then placed on a shelf, it will quickly become performative rather than useful. Revision is evidence that the school is paying attention.
Train Adults First
Schools cannot credibly talk about student responsibility if teachers themselves have had little guidance in effective use, ethical limits, prompting, verification, and data caution.
Too often, in my experience, schools are quick to jump on student misuse before dealing with adult capability. Yet if the adults are unclear, underprepared, or inconsistent, the rest of the conversation becomes much harder. Students notice mixed messages quickly. So do parents.
Teach Students Explicitly
Warning students about misuse is not enough. They need to learn what responsible use looks like, where support ends and dishonesty begins, and why thinking becomes even more important when tools become more powerful.
If schools do not teach this clearly, students will still learn about AI, just not from us. They will learn through peers, social media, and trial and error. That to me is not a serious strategy.
Communicate with Families
Parents deserve to know not only that a school is aware of AI, but how it is being addressed. If schools do not publicize their thinking clearly, the vacuum will be filled by rumor, anxiety, or inflated claims.
Our international school families need clarity about the school’s values, its boundaries, and how students are being prepared to use these tools responsibly, none of which is found in the slogans we sometimes seek to adopt.
Build a Coherent Plan
Schools now need more than isolated AI activity; they need a coherent plan that turns interest into responsible practice. Without that, what appears to be innovation can quickly become fragmentation dressed up as progress.
For us, that means far more than simply updating a policy. It means connecting staff training, student guidance, parent communication, and practical implementation so that the work holds together across the whole school.
A Leadership Test
This is where AI becomes a leadership issue.
In my school, the work is no longer theoretical. It is revising policy when it no longer matches reality. It is training teachers twice each week, so they learn to use these tools with more precision and more scepticism. It is deliberately teaching students that responsible use matters as much as technical skill. It is communicating clearly with parents so they understand both the opportunities and the boundaries. And it is building a whole-school plan for 2026-27 that is purposeful enough to guide practice and flexible enough to keep up.
That is what this work looks like now.
The schools best placed to handle AI well will not get there by accident. They will do it well because leaders choose to take it seriously, put clear structures in place, keep adjusting as the technology changes, and review their actions and policies consistently.
That is the work in front of us.
A question for readers: What actions has your school taken to move from AI awareness to responsible implementation? Do share your thoughts.
David Krupski, Whole School Foreign Principal, Victoria Park Academy, Shenzhen, China
LYIS is proud to partner with WildChina Education
