AI Use Among Health Service Psychologists: Current Use and Changes from 2024 to 2025
Key Points (For Readers on the Go)
APA data suggest that AI use among psychologists nearly doubled from 2024 to 2025, with far fewer reporting “never” using AI at work.
Routine use (monthly, weekly, or daily) more than doubled in one year.
Most AI use remains non-clinical, focused on writing, documentation support, psychoeducation, and administrative tasks.
New NRHSP committee data confirm APA’s trend, showing a sizable group of frequent AI users alongside persistent non-use.
Governance and ethics are lagging behind adoption, particularly around consent, disclosure, and handling of sensitive information.
AI use among health service psychologists (referred to simply as “psychologists” from here on) is no longer speculative or marginal. National data suggest that, in just one year, the profession moved from predominantly non-use to widespread experimentation and growing routine use. At the same time, new data collected in October 2025 through the National Register of Health Service Psychologists (NRHSP) AI Committee provide a more recent picture of how psychologists are actually using AI in practice.
This post compares APA’s 2024 and 2025 Practitioner Pulse surveys and then situates those findings alongside the most recent NRHSP committee data, offering both a national trend view and a practice-level perspective.
What Changed from APA 2024 to APA 2025: A Step-Change in Adoption
In 2024, the dominant narrative was non-use. Approximately 71% of psychologists reported not using AI tools for work, while only 11% reported using AI at least monthly. Daily use was rare, estimated at roughly 2–3%.
By 2025, that picture changed substantially.
APA data indicate that the proportion of psychologists using AI for work increased from approximately 29% in 2024 to 56% in 2025, representing nearly a doubling of adoption in a single year.
Growth was especially pronounced among routine users:
At least monthly AI use increased from 11% to 29% (a 2.6× increase).
Daily AI use rose from approximately 2–3% to 8% (roughly a threefold increase).
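For readers who want to see where those multiples come from, here is a minimal sketch in Python using the rounded percentages reported above; the 2.5% figure is an assumed midpoint of the “approximately 2–3%” 2024 daily-use estimate, not a number reported by APA.

```python
# Rounded adoption figures from the APA Practitioner Pulse surveys cited above.
monthly_plus = {"2024": 0.11, "2025": 0.29}   # AI use at least monthly
daily = {"2024": 0.025, "2025": 0.08}         # daily AI use; 0.025 = assumed midpoint of ~2-3%

monthly_growth = monthly_plus["2025"] / monthly_plus["2024"]  # ~2.6x
daily_growth = daily["2025"] / daily["2024"]                  # ~3.2x, i.e. roughly threefold

print(f"At-least-monthly use grew about {monthly_growth:.1f}x")
print(f"Daily use grew about {daily_growth:.1f}x")
```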
In practical terms, AI crossed an important threshold—from occasional experimentation to regular workflow support for a substantial segment of the profession.
A note on timing and sample size
APA Practitioner Pulse surveys are large national surveys with thousands of respondents.
The 2025 APA AI findings reflect use over the prior 12 months, capturing behavior largely from late 2024 through mid-2025.
What Psychologists Are Using AI For (APA Perspective)
Across both APA survey years, the pattern of use is remarkably consistent.
The most common applications include:
Writing assistance (emails, drafts, general text)
Summarizing articles or notes
Creating presentation or educational materials
Note-taking or dictation support
Importantly, use of AI for direct clinical decision-making remains uncommon. Most psychologists appear to be positioning AI as a supportive tool, rather than as a substitute for professional judgment.
What the NRHSP Committee Data Add
To complement APA’s national snapshot, the NRHSP AI Committee conducted a focused survey in October 2025, with 528 licensed psychologists responding.
Sample context
Predominantly licensed psychologists
Strong representation from private practice
Largely mid- to late-career clinicians
Despite differences in sample composition and time frame, the overall adoption pattern closely mirrors APA’s 2025 findings.
Frequency of AI use (past 6 months)
43.1% reported no AI use for work
Nearly 30% reported weekly or daily use
The remainder (roughly a quarter) reported occasional or monthly use
Taken together, the APA and NRHSP data converge on the same conclusion: frequent AI users are now a stable and meaningful subgroup within psychology, not a fringe group.
What Psychologists Are Actually Doing with AI
The NRHSP data offer more granular insight into specific workflows. Among psychologists using AI for work, common activities include:
Answering work-related questions
Generating accessible explanations of complex topics
Creating presentation materials
Drafting session or progress notes
Developing psychoeducational content
Writing emails and professional recommendations
Assisting with treatment plans and goals
Drafting reports or interpreting test results (less common, but present)
While many of these uses are supportive, several—particularly documentation, recommendations, and planning—sit close to clinical judgment, increasing the importance of clear safeguards.
Adoption and Anxiety Are Rising Together
APA data show that as AI use increased from 2024 to 2025, concerns increased as well, particularly regarding:
Privacy and data security
Bias in AI outputs
Accuracy and reliability
Transparency and testing
Broader professional and social implications
The NRHSP data help explain why many psychologists still report non-use. Among non-users, the most frequently cited reasons were ethical concerns, confidentiality, accuracy, and legal uncertainty.
This pattern suggests that hesitation reflects unresolved governance questions, not indifference or lack of relevance.
Practical Examples That Reflect Current Practice
Based on both datasets, common and relatively lower-risk uses include:
Example 1: Psychoeducation support
Drafting plain-language explanations of topics such as anxiety, trauma, or sleep, followed by clinician review and tailoring.
Example 2: Documentation scaffolding
Using AI to generate structured templates or prompts for notes or reports, while writing all substantive content independently.
Example 3: Professional communication
Assisting with emails, training materials, or research summaries for supervision, consultation, or presentations.
In all cases, psychologists retain responsibility for accuracy, interpretation, and ethical use.
Ethical Considerations: Governance Is the Real Gap
One of the most important findings from the NRHSP committee work is the wide variability in practice, even among conscientious clinicians.
Key patterns in how psychologists use AI and handle protected health information (PHI) include:
Nearly all AI users report reviewing and editing AI-generated content.
Disclosure practices range from always disclosing to never disclosing.
A minority report entering sensitive or identifiable information into AI tools.
Consent practices are inconsistent and often conditional.
AI adoption is moving faster than shared professional norms around consent, disclosure, and data protection.
For Readers Who Want to Go Deeper
For those who want to explore the data in more detail, I created an interactive dashboard that allows users to examine AI use patterns, frequencies, and ethical practices:
Interactive dashboard:
https://adamblockwood.github.io/psych-ai-dashboard/
The full NRHSP AI Committee report is publicly available here:
https://www.nationalregister.org/committees/committee-reports/
A Brief Acknowledgment
This work reflects a collaborative committee effort. I had the privilege of chairing the NRHSP AI Committee, analyzing the data, writing the first report, and building the interactive dashboard. I’m grateful to the other committee members for their thoughtful input, review, and ongoing collaboration throughout this process.
Final Takeaways
AI use among psychologists nearly doubled from 2024 to 2025, increasing from ~29% to ~56%.
Routine use surged, with monthly use rising from 11% to 29% and daily use increasing from ~2–3% to 8%.
NRHSP data collected in October 2025 suggest that this trend is continuing, not leveling off.
Most AI use remains supportive, but overlaps with clinically meaningful workflows.
The next phase for the profession is clear and practical governance, not continued debate over whether AI belongs in psychology at all.
Call to Action
If AI use is emerging unevenly within your practice, clinic, or organization, this is an opportunity to move from informal experimentation to shared expectations that protect both clinicians and clients.
AI Use Disclosure
Portions of this post were drafted with the assistance of an AI writing tool and revised by the author for accuracy, clarity, and professional judgment.