What Educators Really Want From AI Tools

Key Points (for readers in a rush)

  • Based on 1,350 Ohio school health providers (May 2025) + feedback from hundreds of educators nationwide

  • Educators don’t want AI to replace them—they want AI that reduces paperwork

  • Most providers have no formal AI training; most learn informally from colleagues

  • Privacy and FERPA concerns are the biggest barriers

  • Practitioners want profession-specific, workflow-aligned AI tools

  • Districts should prioritize clear policies, role-specific training, safe workflows, and human oversight

Conversations about artificial intelligence in schools are accelerating. Districts are testing new tools, state agencies are drafting policies, and practitioners are trying to figure out what’s useful, what’s overhyped, and what’s risky.

To cut through the noise, I’m going to discuss two sources of information I’ve collected:

  1. A statewide Ohio study of 1,350 school health and related service providers—including school psychologists, nurses, SLPs, OTs, PTs, counselors, and others—conducted in May 2025, and

  2. Feedback from hundreds of practitioners across dozens of AI trainings I’ve provided to thousands of educators and school-based clinicians nationwide.

This post synthesizes insights from both perspectives.

It reflects not only what Ohio educators reported in the survey, but also what school psychologists, SLPs, OTs, PTs, counselors, administrators, special educators, and mental health providers across the U.S. consistently voice during workshops, PD sessions, and implementation consultations.

Preprints from this broader effort are already available.

This post offers a wide-angle overview across professions.

1. Educators Want AI to Lighten the Load

Across roles and disciplines, practitioners emphasized the same theme:

“Help me with the tedious parts of my job so I can focus on students.”

The most requested AI functions were:

  • Drafting or organizing documentation

  • Summarizing observations, interviews, and session notes

  • Simplifying complex, technical policy requirements

  • Generating intervention ideas

  • Creating lesson plans or structured activities

  • Helping structure longer written content

Notably, practitioners did not ask for:

  • Automated diagnosis

  • Automated eligibility decisions

  • Predictive risk algorithms

  • Fully automated psychoeducational or therapy reports

Across my national trainings, I hear this repeatedly:

Educators don’t want AI to think for them. They want AI to give them time back.

2. Formal Training Is Rare — and Most People Are Learning AI From Colleagues

One of the clearest insights from the Ohio dataset of 1,350 respondents:

  • Only a small fraction had received formal, district-approved AI training

  • Most training came informally from colleagues, not structured PD

  • Many practitioners reported learning AI through experimentation, social media, or word of mouth

This is echoed in PD sessions nationwide:

Educators repeatedly say their barriers aren’t lack of interest—they are lack of structure, guidance, and clarity.

Practitioners are using AI, but they’re doing it quietly (or secretly), cautiously, and often without clear workflows or guidance.

3. Privacy and FERPA Concerns Are the #1 Reason People Hesitate

Interest in AI is extremely high.

Actual use is much lower.

This gap is almost entirely driven by privacy and policy uncertainty.

From both the survey and national trainings, practitioners consistently express concerns about:

  • Whether district policies authorize AI at all

  • Whether “de-identification” is actually enough

  • What counts as student data

  • How to stay compliant when using chatbots

  • Whether administrators fully understand the legal landscape

  • Whether AI systems hallucinate or oversimplify student profiles

The most common sentiment I hear:

“I want to use AI, but I don’t want to get in trouble.”

Until districts provide clear, usable guidance, AI adoption will remain slow and uneven.

4. What Educators Wish They Could Use AI For

The Ohio data shows a striking difference between:

  • What practitioners currently use AI for

  • What they wish they could use it for if privacy, policy, and workflow issues were addressed

Across professions, the aspirational uses include:

Documentation Support

  • Organizing raw notes

  • Drafting progress summaries

  • Helping structure psychoeducational or therapy documentation

  • Ensuring required elements aren’t missed

Decision-Support (But Not Decision-Making)

  • Flagging inconsistencies

  • Summarizing complex data

  • Offering evidence-based intervention ideas

  • Suggesting areas that need clarification

Collaboration and Communication

  • Drafting clear email summaries

  • Creating student-friendly explanations

  • Supporting multi-agency or multi-disciplinary teamwork

Profession-Specific Expertise

Practitioners repeatedly say they want AI that “gets” their field.

The biggest message: Generic AI tools don’t understand school-based practice.

People want context-aware, profession-specific, ethically aligned assistants.

5. Insights for Schools: What Districts Should Do Now

Based on the Ohio dataset and what I see across national trainings, districts that want to responsibly adopt AI should focus on four immediate steps:

1. Provide Clear, Usable AI Policies

Educators need to know:

  • What tools are approved

  • What data can/cannot be used

  • When to de-identify

  • When AI is prohibited

  • A list of safe workflows
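
As a minimal sketch of what one item on a “safe workflows” list might look like in practice, here is a basic redaction step that strips known student names and ID-like numbers from text before it is pasted into an AI tool. This is purely illustrative (the function and placeholder names are hypothetical); simple redaction like this does not by itself guarantee FERPA compliance and would need district review.

```python
import re

def redact(text, student_names):
    """Replace known student names and long digit runs with placeholders
    before text is shared with an external AI tool. Illustrative only."""
    for i, name in enumerate(student_names, start=1):
        # Replace each known name with a numbered placeholder
        text = re.sub(re.escape(name), f"[STUDENT_{i}]", text, flags=re.IGNORECASE)
    # Mask ID-like digit runs (e.g., student or case numbers)
    text = re.sub(r"\b\d{5,}\b", "[ID]", text)
    return text
```

A district-approved workflow would pair a step like this with clear rules about which tools are authorized and when AI use is prohibited outright.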

2. Offer Profession-Specific Training

SLPs, OTs, school psychologists, nurses, and counselors do very different work.

Training must reflect those differences.

3. Start With Low-Risk, High-Impact Use Cases

The data suggests ideal starting points:

  • Document editing and summarization

  • Planning and brainstorming

  • Administrative communication

  • Drafting low-stakes content

  • Clarifying policy requirements

4. Ensure Human-in-the-Loop by Design

Practitioners want human oversight.

AI should support professional judgment, not substitute for it.

6. Why These Findings Matter

There’s sometimes a narrative that educators are resistant to AI. The truth is very different.

Educators are:

  • Curious

  • Hopeful

  • Overloaded

  • Undertrained

  • Concerned about privacy

  • Eager for practical, ethical guidance

This broad analysis—across 1,350 Ohio practitioners and hundreds of professionals I’ve trained nationwide—shows that educators are ready to use AI meaningfully.

They just need the policies, tools, and training to do it safely.

AI Use Disclosure

Portions of this post were drafted with the assistance of an AI writing tool and revised by the author for accuracy, clarity, and professional judgment.
