Say Hi to AI
First, we want to acknowledge that the social work profession is genuinely noble and grounded in service, integrity, and clinical expertise. Although the work can be stressful and even dangerous at times, the nation is grateful for what you do, and so are we. Thank you!
Pro-Tip:
Don’t rely on the convenient, repeated use of artificial intelligence (AI) in your therapy documentation.
AI has been receiving a lot of publicity lately. While it offers benefits in therapy, behavior prediction, and support functions, it also comes with risks.
A first step in managing that risk is to avoid the convenience of repeated, AI-generated statements in your therapy documentation. We have seen a significant increase in negligence lawsuits against practitioners due to poor documentation and an overreliance on AI-generated, generalized statements based on client data patterns.
Plaintiffs’ attorneys and courts increasingly find that practitioners fail to meet their duty of care when they defer to AI-generated predictive or repetitive statements in therapy notes.
Risk assessments and outcome forecasts are not substitutes for personalized therapy and human-generated diagnosis and treatment.
In court, the conclusion is often that the practitioner failed to apply the necessary human judgment and professional expertise; the therapy is deemed substandard and below the required duty of care, resulting in a finding of negligence.
AI tools can enhance traditional therapy as a supplement, not a substitute; they should not replace the human element. Predictive analytics are neither diagnosis nor cure, and AI tools can be biased, which is why human oversight is essential. AI should complement the data gathered through therapy, supporting rather than replacing the practitioner’s judgment.
AI can assist by recognizing patterns in client data, which can inform treatment and support proactive measures. These predictions can help practitioners intervene earlier in potential crises. Research shows that AI can manage large datasets effectively, identify behavior patterns, and help improve therapy outcomes.
However, AI remains a probabilistic tool: it offers predictions based on past data but cannot replace a human therapist’s personal touch and expertise. While sometimes emotionally aware, its responses lack the depth of understanding a trained practitioner brings to therapy (Z. Zhang, “Can AI Replace Psychotherapists? Exploring the Future of Mental Health Care,” Psychiatry, Vol. 15, 2024).
This distinction is critical.
From an insurance and liability standpoint, AI may pose minimal risk in certain areas, such as scheduling, reminders, and speeding up documentation processes. However, AI should be treated as a complement to traditional therapy, not a substitute, particularly in behavioral diagnosis. It should guide practitioners, not dictate outcomes.
“Personalization in therapy is crucial for its effectiveness.” (Ibid.)
Practitioners should be cautious when using automatically generated AI statements in session documentation. Repeated AI-generated phrases across multiple sessions can quickly signal a breach of duty, a key element in proving negligence. Poor documentation, including boilerplate AI statements, can lead to lawsuits, adverse verdicts, settlements, and even the loss of a license to practice.
In closing, we understand the pressures of growing caseloads and the need to work efficiently. However, practitioners must resist the temptation to rely exclusively on AI to write documentation and diagnose clients.