
Practical Guidance for Ethical AI Use in Psychiatry


While artificial intelligence (AI) tools are quickly being adopted to complement both virtual and in-person medical practice, psychiatric clinicians should “tread with caution,” says Desiree Matthews, PMHNP-BC, a member of the Psych Congress Steering Committee.

In this video, Matthews offers several practical considerations for mental health care providers contemplating the use of AI in psychiatric practice. Although emerging tools are progressing in clinical utility, Matthews emphasizes that technology is no replacement for clinician expertise. She also underscores the importance of considering patient preferences, protecting sensitive data, and providing informed consent in ethically leveraging these technologies.

For more expert insights, visit the Telehealth Excellence Forum.

Key Takeaways for Clinical Practice: 

  • Emerging artificial intelligence (AI) tools in psychiatry can identify emotions (eg, sadness, happiness, agitation) from video recordings and screen for possible tardive dyskinesia (TD) using guided exercise videos, but clinician review and in-person assessment remain necessary for accurate diagnosis.
  • AI-assisted tools, including AI scribes, may improve efficiency by summarizing large data sets and reducing documentation burden, supporting actionable care planning.
  • Privacy practices and patient consent are essential when implementing AI in mental health care, and clinicians should respect patient preferences, including opt-out options for AI-based documentation.

Read the Transcript:

Desiree Matthews, PMHNP-BC: Hi, my name is Desiree Matthews. I'm a board-certified psychiatric nurse practitioner with Different Mental Health Program, a telepsychiatry practice based out of North Carolina.

Psych Congress Network: What are some best practices that clinicians should be mindful of before using AI for mental health care?

Matthews: AI has really lit up since ChatGPT came into our daily lives. But if you think about medicine and psychiatry, we are also seeing an uptick in AI. I think this is a world where it can be very useful, but I think we should tread with caution.

AI doesn't mean that humans or providers are out of the loop. There's AI that can help identify certain emotions, so we can actually capture a video recording and identify emotions like sadness, happiness, and agitation, and it can pull from that video a synopsis or summary of what the AI recorded.

There's another AI technology for identifying potential tardive dyskinesia (TD). This is really cool: it walks patients through a series of exercises and records a video. It's actually been shown to be fairly reliable and accurate in identifying possible TD. But I think with a lot of this AI, it's important to remember that even with this TD screen, which is AI-driven, captures video, and is quite helpful, I still need to review the data. I still need to do an assessment to make an accurate diagnosis.

At this point, I think there's a lot of hope for AI as an assistant, as a collaborator in our care, so we can potentially be a bit more efficient, see our clients, and take large amounts of data and turn it into an actionable plan for our patients.

So AI is great, but I think treading with caution, especially with privacy practices [is important]. I've dabbled in AI scribes, for instance, to help me with my documentation and cut down on my hours of typing—and hopefully reduce carpal tunnel. But some patients are not comfortable with that. Even if we describe our privacy practices or are able to show them how their data is protected, some people still really want to opt out. That is absolutely something in our consent forms that we have designed so that if patients don't want us to use an AI scribe, they don't have to. It's something that we discuss with our individuals upon intake now.

It’s an interesting field, but there are certain legal [issues] and patient preferences that you really need to be considerate of.


Desiree Matthews, PMHNP-BC, CEO, is a board-certified psychiatric nurse practitioner with expertise in treating patients living with severe mental illness. Beyond clinical practice, Desiree has provided leadership in advocating for optimal patient outcomes and elevating health care provider education. Desiree is the founder and owner of Different MHP, a telepsychiatry practice founded with the mission of providing affordable, accessible, precision-focused, integrative psychiatry to patients through rich and comprehensive mentorship of the health care providers within the company.


© 2026 HMP Global. All Rights Reserved.

Any views and opinions expressed are those of the author(s) and/or participants and do not necessarily reflect the views, policy, or position of Psych Congress Network or HMP Global, their employees, and affiliates.