Patient Experience and AI

Sep 30 1:42pm | MeekaC | @meekaclayton | Comments (3)

Using Artificial Intelligence to Enhance the Experience of Patients with Hypermobility

The application of Artificial Intelligence (AI) to understanding individual patient experiences with chronic illness or pain promises to give medical providers deeper insight into each patient’s struggles and to help craft individualized treatment plans.

Despite the healthcare industry’s slow adoption of AI, Mayo Clinic continues to lead the way in incorporating AI into healthcare, with numerous advances and successes. On the back end, AI can improve the patient experience by returning the results of cancer screenings faster or by assisting nurses in responding to patient messages in the portal. On the patient side, AI can offer scheduling reminders and tailored management of treatment plans.

Additionally, AI can be particularly useful to patients with Ehlers-Danlos syndrome (EDS) and hypermobility spectrum disorder (HSD) who are navigating the journey from symptom onset to comprehensive care. Two potential applications of AI in the care of these patients are outlined below.

Digitizing health records and utilizing synoptic natural language processing:

Due to natural variation in the symptoms of chronic pain conditions, it can be difficult for patients to receive a timely diagnosis. Patients seen at Mayo Clinic’s comprehensive EDS Clinic often have an extensive medical history. By digitizing Electronic Health Records (EHRs), providers can seamlessly access patient information of various origins and use natural language processing (NLP) tools to collate it. NLP tools can read EHRs and create reports or summaries that are ready for holistic provider review, enhancing both the provider’s understanding and the patient’s experience; a brief sketch of this idea follows. In the future, research could aim to develop a predictive risk analysis tool that assesses a patient’s likelihood of EDS or HSD and acts as a decision aid for providers.
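To make the summarization idea concrete, here is a minimal, hypothetical sketch in Python using an off-the-shelf Hugging Face summarization model. The model choice, function name, and note snippets are illustrative assumptions, not a description of any tool actually in use at Mayo Clinic; a real clinical pipeline would also require de-identification, validation, and human oversight.

```python
# Minimal sketch: collating free-text EHR notes into one summary for review.
# The model choice and the fabricated note snippets are illustrative
# assumptions only; clinical deployment would need far more safeguards.
from transformers import pipeline  # Hugging Face Transformers

# A general-purpose summarizer (hypothetical choice, not a clinical model).
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

def summarize_notes(notes: list[str]) -> str:
    """Join free-text notes and produce a short synoptic summary."""
    combined = " ".join(notes)
    result = summarizer(combined, max_length=120, min_length=30,
                        do_sample=False)
    return result[0]["summary_text"]

# Fabricated, non-identifying example notes:
notes = [
    "Patient reports recurrent joint subluxations since adolescence.",
    "Skin hyperextensibility noted on exam; Beighton score 7/9.",
    "Chronic widespread pain; prior rheumatologic workups were negative.",
]
print(summarize_notes(notes))
```

In practice, the value would come less from the summarization call itself than from pairing it with careful retrieval of records from multiple source systems, with a clinician verifying every generated summary.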

Qualitative analysis of patient/provider encounters:

NVivo 14 and ChatGPT-4o are powerful tools for in-depth, nuanced qualitative analysis, and they can be harnessed to analyze patient-clinician encounters. By tasking NLP tools with qualitatively analyzing transcripts of patient encounters, providers can gain insight into the trends and patterns of hypermobile conditions; a sketch of this kind of analysis follows. Although each patient’s experience is different, identifying common experiences across the population of patients with EDS or HSD helps providers better tailor care, thereby enhancing the standard of treatment for all patients.
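For illustration, here is a hedged sketch of how ChatGPT-4o might be asked to apply a small qualitative codebook to transcript excerpts via the OpenAI API. The theme labels, prompt wording, and excerpt are invented for demonstration; real research use would require de-identified data, appropriate approvals, and a human coder reviewing every assignment.

```python
# Hedged sketch: LLM-assisted qualitative coding of encounter transcripts.
# The codebook, prompt, and excerpt below are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical codebook of themes a human analyst might define in NVivo:
THEMES = ["diagnostic delay", "pain management", "joint instability",
          "provider communication"]

def code_excerpt(excerpt: str) -> str:
    """Ask the model to assign exactly one theme from the codebook."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": ("You are a qualitative coder. Reply with exactly "
                         "one theme from this list: " + ", ".join(THEMES))},
            {"role": "user", "content": excerpt},
        ],
    )
    return response.choices[0].message.content.strip()

# Fabricated, non-identifying example excerpt:
print(code_excerpt("It took eight years and five specialists before anyone "
                   "mentioned hypermobility."))
```

Tallying theme assignments across many excerpts is one simple way such tooling could surface the trends and patterns described above, provided human coders audit the model’s labels.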


Through these examples and others being explored at Mayo Clinic, specifically in the context of the EDS Clinic, AI tools promise to aid providers in their continued goal of delivering the best patient outcomes and experiences. As research and development in AI technology continue, the potential for innovation only increases. Have you seen benefits from these technologies?


Author: Kaitlyn Pak

Interested in more newsfeed posts like this? Go to the Ehlers-Danlos Syndrome blog.

Possibly I’m in the minority here, and I’m aware the EDS Clinic seems to have a special interest in AI, but I find this very troubling, especially with at least two of the examples cited here: assisting nurses in responding to patient messages, and collating electronic health records.

It is true that patients with HSD/EDS, especially those who make it to the EDS Clinic (I’m one of them), have extensive medical records. But it’s also true that there’s a high incidence (borne out in the research) of experiences with medical gaslighting and trauma, which I’m certain makes its way, along with all of those provider biases, into my electronic medical record, because I’ve seen it. My records have errors in them too; even my Mayo record lists me as having a completely incorrect rheumatological diagnosis that I don’t have. Despite my appeal to Medical Records, which included documents from my local provider, my request to correct it was inexplicably denied.

I specifically shut off Epic’s Care Everywhere feature, which shares medical records, before I went to the EDS Clinic for this reason: I wanted my provider to see and hear me, without forming a preconceived notion from what other providers said about me. I believe the vast majority of the problems I had during my EDS Clinic experience had to do with providers relying on the chart and what others said about me, and not listening to and hearing me.

My point is that it’s problematic until we can somehow resolve the flaws in how we train AI. As my sister, who is a biomedical engineer, says, “People don’t realize that with AI, there’s a huge risk of garbage in, garbage out.” It would be great if/when we can somehow address those things.

But it’s one thing for Google or Amazon to provide an AI summary of product reviews and a whole other matter to do that with my health records in a professional setting. But then again, maybe it’s already happening, and maybe that explains the sometimes nonsensical messages I’ve received from nurses in response to my questions…

I’m not anti-technology at all, but I worry that instead of holding ourselves accountable for improving awareness of these conditions across the healthcare system and allocating the proper resources and staff to provide the care we deserve, we’re looking for a shortcut. I’ll never understand why it’s okay that medical providers don’t have to be knowledgeable about HSD/EDS and other chronic pain syndromes.

In reply to @emo:


Or Long Covid. I'm with you on the risks (to patients) and rewards (to time-pressed medical folk) of AI that can get things wrong, misinterpret, or hallucinate when stumped. I've had the same experience trying to make corrections to my medical record. I recently read a report (notes from a visit for a test) that sounded like it was written by someone with a limited grasp of English and no awareness (!) that it just doesn't make sense. I do see great potential for AI in diagnosis using all available data, but human expertise and judgment must be in place to have the final word. O, brave new world...


Some applications of AI, such as machine learning models that assist humans in image analysis, are potentially valuable in detecting cancer. However, using generative AI, which has a high probability of producing literally meaningless text, to evaluate and respond to verbal descriptions provided by patients is an extremely bad idea, especially given the high level of ignorance about Ehlers-Danlos in the medical profession. Please don’t.
