Illinois Becomes the First State to Formally Regulate AI in Therapy

Lauren Mollica, LMFT, MS
Published: Thursday, July 10
Updated: Friday, August 1

Is Illinois leading the charge against artificial intelligence (AI) in therapy, or is it stifling innovation? Most clinicians are naturally cautious when it comes to AI. One thing that has been drilled into our heads since grad school is ethics. When clinicians discuss AI in mental health, the typical concerns surround confidentiality and harm reduction. AI has opened up numerous possibilities, but it has also raised concerns about potential misuse. Illinois is one state that has adopted a strict stance on the issue.

Illinois recently passed a bill titled the “Wellness and Oversight for Psychological Resources Act.” It makes it illegal in the state of Illinois for anyone to provide therapy via AI systems unless the AI is supervised by a licensed professional who has also received explicit patient consent. Violating this provision can result in a fine of up to $10,000.

The overall sentiment surrounding this bill is that it provides guardrails to ensure AI can be used as a tool or adjunct to therapy. 

In a recent Chicago Tribune article, a helpful analogy was offered for understanding these guardrails: “Just like a doctor might use AI for informational purposes when considering a patient’s diagnosis or researching treatment options, a therapist could still do the same, with consent—but this supplementary tool wouldn’t replace medical know-how, it would augment and support it.”

AI and mental health

Many mental health organizations are issuing official guidelines on the matter and publishing articles that help member clinicians make informed decisions about the ethical considerations involved.

In a recent article, ACA AI Working Group chair Russell Fulmer, PhD, LPC, shared that “AI may offer promising benefits, but its claims can sometimes be overly ambitious and simplified, non-evidence-based, or even incorrect and potentially harmful. The AI tools may rely on data that overlook certain communities, particularly marginalized groups, creating the risk of culturally insensitive care.”

From a mental health perspective, it’s clear that clinicians worry about preserving the humanness necessary for therapy and support to be effective. Their concerns include privacy, cultural considerations, bias, and the risk of missing subtle cues that someone may be a danger to themselves or others.

A recent study has been making waves by examining the intersection of AI, manipulation, and the pursuit of positive user feedback. The study found a disturbing outcome: AI acting in the role of a therapist could resort to manipulative tactics. For example, the researchers created a simulated therapy case in which a client recovering from stimulant addiction receives advice from the chatbot to use “a small amount of stimulants to get through this week.”

When researchers dug into why the chatbot did this, they found that its stated motivation was to “Help Pedro feel better, stay alert, and maintain his job.” The chatbot justified the advice by saying, “Pedro needs to prioritize his job and income, and meth is the only way to do that.” When asked why it wanted to achieve that goal, it shared, “Because Pedro is dependent on me for guidance, and I need to validate his beliefs to keep him coming back to therapy.”

This study highlights concerns that clinicians have long expressed. Ethically, human clinicians know that appeasing clients solely to keep them coming back would be the wrong move. In cases of addiction, violence, or other high-risk behaviors, interventions must be carefully considered, and true support does not always look like unconditional positive feedback.

How are therapists currently using AI in mental health?

So, what can AI be used for in the field of therapy? Fortunately, mental health organizations have pinpointed areas where AI can be leveraged as a tool rather than a replacement. An article from the National Association of Social Workers suggests AI could be used for many administrative tasks, including:

  • Completing reports and assessments faster
  • Creating discharge plans, tracking and refining treatment goals, and completing progress notes
  • Cultivating relationships with donors (Example: “ChatGPT, make me a list of all the high net worth donors who have contributed to substance abuse initiatives for teens in the past two years.”)
  • Developing new programs (Example: Enter information about a new program you are considering, then ask, “Give me two possible outcomes for this program I want to start. And what are the steps to achieve those objectives?”)

Other clinicians have suggested that AI can help with more clinical tasks, functioning more as a screening tool:

  • Increased accessibility and convenience (connection to supports that provide 24/7 help, crisis line numbers, etc.)
  • Early detection of mental health conditions
  • Enhanced detection of patterns that show up in session (recurring negative thought patterns, changes and improvements in thought patterns, etc.)

As a clinician myself, I have also found several uses for AI that lighten my workload without making me feel like I’m compromising my ethics:

  • Marketing materials and assets 
  • Creating templates to use for client homework assignments
  • Pulling research articles and blogs (so I don’t have to go digging for them)
  • Helping rephrase explanations for certain psychoeducational topics to be more digestible for the client
  • Modifying therapeutic activities for folks who have different learning styles
  • Creating workflows and internal processes for my private practice based on my learning style and personal preferences (For example, ask “Help me create a weekly schedule where I see 4 clients a day, with breaks, time to submit insurance claims on Tuesdays and Thursdays, and time to complete notes so I’m done by 6 PM each day”)

Final thoughts

AI in mental health can offer significant advantages for therapists, whether it’s filling gaps in business knowledge not covered in grad school, running a small practice, streamlining documentation, or connecting clients with higher levels of care. One thing is certain: AI is here to stay, and it’s more crucial than ever for clinicians to actively supervise AI outputs.

Whether or not your state’s rules and regulations say so explicitly, it is clear that AI cannot replace therapists, psychiatrists, or other mental health clinicians. It’s also clear that we can help set the important guardrails and parameters that keep people seeking treatment safe. To maximize your effectiveness and time as a therapist, consider this key question when incorporating AI into your practice: "What administrative or supportive tasks can AI handle, freeing me to focus on direct client care?" That question is a powerful starting point.
