AI Therapy?
Mickey Skidmore, AMHSW, ACSW, FAASW
In December 2023, The Evolution of Psychotherapy Conference was held in Anaheim, California. The week-long conference was graced by numerous luminaries in the mental health field. Several Ericksonian mainstays were present (Zeig, Gilligan, and Yapko, to name a few); the Gottmans and several other couples therapists attended; and even the founder of the popular Internal Family Systems (IFS) model, which continues to gain popularity as it spreads around the world, was represented.
While I was not fortunate enough to attend this event, I spoke recently with a colleague who did. What an amazing opportunity to learn from the current leaders and pioneers of the mental health field.
Interestingly enough, my co-worker noted that a recurrent theme during the week buzzed around the notion of AI and its future role in therapy. He related various conversations about AI bots that had been developed to review the professional literature and/or research of a particular therapy approach and were then incorporated into an AI-driven therapy application. When clients engaged with these applications, the responses were remarkably consistent with the overall tenets of the model. It remains to be seen whether clients found this effective or useful within a treatment context.
My initial reaction to this was alarm and concern. I found myself taken aback. After dedicating decades to developing and fine-tuning the blending of the art and science of my craft, will I be replaced by AI? Is this a good thing? Will it be helpful and effective for the general population? Even if these other questions are relevant and important, I acknowledge that the initial focus of my concerns may be self-serving. The previous version of this debate was man versus machine; the current incarnation has expanded to human versus technology.
Perhaps less threatening were other conversations about how to utilize similar technology to aid therapists in writing their progress notes. While at face value this possibility seems milder, I remain concerned about the encroachment of AI into this domain, even if it does make documentation less of a chore.
I recall a rudimentary computer program from my graduate school days that would simulate an interview in accordance with Carl Rogers' humanistic approach. The crude, simulated computer voice would often reply: “… so it sounds like it may be important to you how I might feel about that …” Never in my wildest dreams would I have imagined having conversations about AI technology engaging in psychotherapy.
Regardless of your views on this issue, it appears the floodgates have already opened. Whether we like it or not, and whether we are ready for it or not, this seems inevitable. Engaging with the issue to help shape what ethical guardrails might include is something worth reflecting upon, and perhaps worth becoming actively involved in. I suspect the more advocacy perspectives that can be brought to bear on this issue, the better.
