5 Questions Every School Leader Should Ask Before Buying AI Tools

As artificial intelligence floods classrooms with bold claims, leaders must ask the right questions to ensure tools align with the school’s vision, equity, and teacher needs.


A lot of new ed tech products have come out over the years. Some are genuinely helpful. Others are more hype than substance. Artificial intelligence (AI) is simply the newest wave, and it comes with no shortage of promises. From adaptive platforms to lesson generators, educators are flooded with pitches that claim to save time, personalize learning, and boost outcomes.

There is also growing urgency. When some schools integrate AI effectively and others do not, a digital divide grows. Those with access and support accelerate. Those without fall behind. The question is no longer whether to use AI. The real question is how to use it wisely, sustainably, and in alignment with your instructional goals.

Educators know that new tools often arrive with bold marketing but uneven impact. What determines success is not the technology itself but the clarity of leadership decisions that surround it. Schools that pause to ask the right questions are far more likely to choose tools that strengthen teaching, improve equity, and last beyond the first wave of enthusiasm. Before making a purchase, school leaders should ask five critical questions rooted in purpose, not hype.

1. What Problem Are We Trying to Solve?

Every strong technology adoption begins with the problem, not the tool. Clarity at the front end prevents wasted dollars and frustration later. Before watching demos or taking sales calls, gather your leadership team and ask:

  • What specific instructional or operational issues are we trying to address?
  • Is this a priority for our teachers or students?
  • Does the tool clearly support that need?
  • Have we asked teachers what would help?

Example: A Mismatch of Needs
One school adopted an AI-powered instructional coach that analyzed classroom lessons and generated detailed reports for teachers. The technology was impressive, but it didn’t match the actual needs teachers had surfaced. They weren’t asking for exhaustive lesson breakdowns; they were asking for help designing engaging tasks for multilingual learners. The tool solved a problem no one prioritized. Within months, usage dropped, and the funds tied to the contract couldn’t be reallocated.

Success Story: A Direct Match
In contrast, another school identified a clear need: reducing grading time for writing tasks while maintaining strong feedback. They piloted a platform that suggested feedback aligned to rubrics, and teachers reported recovering two hours a week. Because the tool addressed a real pain point, adoption stuck.

As the leader, resist the urge to chase novelty. Anchor purchases in needs surfaced by teachers and students. When leaders frame tool selection around real problems, they send a powerful message: technology is in service of learning, not the other way around.

2. Is the Tool Usable, and Will It Be Supported?

Even the right tool can fail if it is too complex or poorly supported. Teachers already juggle heavy loads. If a platform requires hours of training or constant troubleshooting, it will not gain traction. Ask:

  • Is it intuitive for teachers and leaders?
  • How steep is the learning curve?
  • Is there clear support for onboarding and troubleshooting?
  • Do we have staff capacity to manage rollout?

Example: A Steep Learning Curve
One school adopted an AI-enhanced learning management system with powerful analytics. But teachers needed four hours of training just to log in and create classes. Many never used it with students. The problem was not the AI itself but the barriers to getting started.

Success Story: Seamless Integration
Another district chose a simpler tool that plugged directly into Google Docs. Teachers could highlight student writing and request AI-generated suggestions in seconds. No extra logins, no new system, just an extension of the platform they already used daily. Adoption soared.

Usability is not a soft factor; it is the make-or-break condition for adoption. If the tool is not simple enough to fit into a teacher’s Tuesday morning, it will remain underutilized, no matter how innovative.

3. Will People Actually Use It?

A purchase is only worthwhile if the tool becomes part of the rhythm of planning, instruction, and feedback. Adoption requires alignment with teachers’ workflows and a clear demonstration of value. Ask:

  • How will teachers, coaches, or leaders use it during the week?
  • Is it replacing a clunky task or creating a new one?
  • Is the value clear enough to drive regular use?

Example: Misaligned Planning
One district rolled out a lesson-planning assistant. At first, teachers were curious. But the tool did not align with the pacing calendars and unit structures already in use. Within months, log-ins dropped sharply.

Success Story: Embedded Impact
Conversely, another district piloted an AI feedback coach that allowed teachers to upload short clips of classroom instruction. The tool suggested feedback aligned to the evaluation indicators teachers already used in coaching cycles. Because the platform spoke the same “language” as their instructional systems, teachers integrated it into their regular reflection and growth cycles.

Usage is not just about log-ins. Look for evidence that the tool drives instructional shifts: more targeted lessons, improved student work, stronger professional learning conversations.

4. Can We Try It First? (And If Not, Why?)

Never commit to a long-term contract without a real trial. A pilot surfaces what brochures and vendor promises cannot: how the tool behaves in real classrooms. Ask vendors:

  • Can we test it with a small team or grade level?
  • What features are available during the trial?
  • What feedback and usage data will help us decide?

Example: The Cost of Skipping Pilots
One district signed a contract for an AI grading tool without piloting. Teachers soon found it misunderstood rubrics and created time-consuming workarounds. Instead of saving time, it created more work.

Success Story: Pilot With Feedback Loops
Another school ran a six-week pilot of an AI reading platform. They gathered weekly teacher feedback, student engagement data, and usage logs. By the end, leaders had a clear sense of strengths, gaps, and next steps. The pilot informed not only purchase decisions but also professional development planning.

Keep in mind that a vendor who resists pilots or restricts trial access is raising a red flag. Transparent partners welcome scrutiny.

5. Does It Align With Our Vision for Teaching and Learning?

This is the most important and often the most overlooked question. Every tool reflects values and assumptions. Some prioritize speed. Others promote inquiry, creativity, or collaboration. The choice signals what your school system values most. Ask:

  • Does this tool support deeper learning, not just faster tasks?
  • Does it strengthen student agency, collaboration, or thinking?
  • Would we feel proud to put it in front of our students?

Example: A Tool That Undermined Learning
A school rolled out an AI math app that provided students with step-by-step solutions. At first, students were excited by how quickly they could get correct answers. But teachers soon noticed that many skipped showing their own work and stopped practicing problem-solving strategies. Instead of building mathematical reasoning, the app trained students to rely on shortcuts. It was eventually removed, not because it was inaccurate, but because it weakened the habits of thinking that the math team valued most.

Success Story: A Tool That Extended Vision
Another school used an AI tool that generated debate prompts and guided students through evidence analysis. Rather than replacing critical thinking, it scaffolded it. The tool aligned with the school’s emphasis on civic discourse and inquiry. Teachers reported students were more engaged in constructing arguments and evaluating sources.

A tool that saves time but weakens learning is not worth keeping. Purchases should be filtered through your core vision for student learning, not just operational convenience.

Additional Considerations Leaders Should Weigh

  • Equity and Access: Will this tool widen or narrow gaps? Consider whether all students, including those with disabilities, can access it; whether it reflects cultural and linguistic diversity; and whether it requires high-end devices or home internet access that not all families have.
  • Data Privacy and Security: Ask hard questions about student data: who owns it, how it is stored, shared, and used to train models, and whether the vendor complies with FERPA, COPPA, and local data policies.
  • Sustainability: Beyond first-year funding, can you sustain licenses, training, and updates? Many districts have bought tools with stimulus dollars, only to face difficult cuts later.
  • Teacher Voice: Involve teachers not only in feedback but also in co-design. A top-down rollout often breeds resistance, while teacher-led pilots create champions who model use for peers.

Lead with Purpose, Not Pressure

AI tools can help schools do better, but only when they align with real needs, thoughtful values, and instructional systems that already support strong teaching. Marketing fades. Good leadership lasts.

The best leaders resist pressure and lead with purpose. They choose tools that serve students, honor teacher expertise, and align with instructional vision. AI can enhance teaching, but only if it reflects your values and priorities. Choose tools that support learning, not just logistics.

Andy Szeto is an education leadership professor, district leader, former school leader, and president of A3 of CSA, with extensive experience in instructional leadership, AI in education, and professional development.
