It’s hard not to notice the buzz over the past few years around the growing use of artificial intelligence (AI). Many industries, from finance to entertainment, have taken the leap to harness the power of AI, but healthcare has been slower to adopt it. There are several reasons why it is harder to make AI work for healthcare.

First, for those who have only taken a glancing look at the AI headlines, it may be helpful to explain what exactly AI is. The simplest definition of AI is a computer system capable of performing tasks that would otherwise require human intelligence. And just as human intelligence is quite broad, so is AI: it can handle anything from very simple calculations to very complex predictions. So there is clearly potential for AI in healthcare. Fields like radiology, dermatology, pathology, and cardiology have all made great strides in the use of AI. But there are a few reasons why healthcare as a whole has been slow to adopt it.
- Regulatory Hurdles and Compliance Concerns:
Healthcare operates within a highly regulated environment, with strict requirements to ensure patient safety, privacy, and security. Regulatory bodies such as the FDA impose rigorous standards for the approval of AI-driven medical programs. Compliance with these regulations requires extensive testing and validation, which delays the rollout of AI solutions in healthcare.
- Data Access and Quality:
AI algorithms depend on vast amounts of high-quality data to train effectively and ensure accurate predictions. However, healthcare data is often fragmented, siloed, and of varying quality across different systems and institutions. Integrating disparate data sources, such as electronic health records (EHRs), medical imaging archives, and genomic databases, poses significant technical and logistical challenges. Privacy regulations such as HIPAA add further constraints on how that data can be shared and used.
- Resistance to Change and Cultural Factors:
Healthcare is a risk-averse industry, characterized by complex workflows and entrenched practices. Clinicians and healthcare professionals may resist adopting AI technologies due to concerns about loss of autonomy, or skepticism about the reliability and interpretability of AI-driven insights. Overcoming these cultural barriers requires concerted efforts to promote collaboration, foster digital literacy, and provide ongoing education for healthcare personnel.
So far, AI has found its most effective use in medicine in assessing images such as microscope slides, CT scans, MRIs, EKGs, and others. However, conditions like EDS and HSD are more complicated than image interpretation alone: there is medical history to include, as well as family history, physical exams, laboratory data, imaging, and possibly even genomic data. Fortunately, Mayo Clinic has a vast repository of data on EDS and HSD that will be very useful in future AI applications. So stay tuned here to learn about the progress we make with these exciting new tools at the EDS Clinic. If you have feedback on these or other topics, please share in the comments!
Author: Dacre Knight, MD, MS, FACP
If only we had access to Drs that were more knowledgeable on EDS and HSD here at the MCHS system and Rochester.
I think AI has its place, but there’s a danger of turning to it as a replacement for holding healthcare providers to a higher standard of accountability to be aware of these conditions. If more providers were willing to see hypermobility spectrum disorders as within their scope of practice—obviously one can’t know everything, but if primary providers, PTs and OTs, and specialties like chronic pain, rheumatology, and physiatry could be more informed, or even just willing to partner with informed patients (because many of us self-diagnose and are very resourceful and educated about our conditions because we have to be)—then there might be more access to care all around, and the EDS Clinics that exist, such as at Jacksonville, wouldn’t be stretched so thin, sometimes providing the bare minimum of care with limited ability to offer us even the standard follow-up appointments.
I have lost count of the number of times I’ve been told HSD and EDS are “not within scope of practice” for a provider or just plain gaslit. Or a provider knows of it but sees no responsibility for partnering with me to factor it into my care or help me access treatment. And I know the research shows I’m not alone.
Rather than investing in AI, I’d much prefer for healthcare to be more inclusive, not just of hypermobility disorders but for anyone who falls outside what’s considered the “norm.”