News You Need to Know Today

CHIME’s 10 AI principles | AI reporter’s notebook | Partner news

Thursday, August 1, 2024


artificial intelligence in healthcare

10 aspects of healthcare AI to which attention must be paid

Patient safety should be the No. 1 consideration for healthcare organizations adopting, planning to adopt or hoping to adopt AI. At the same time, nine other concerns are similarly crucial to the success of the industry-wide endeavor.

That’s according to CHIME, the College of Healthcare Information Management Executives. The association built its list after receiving input from a sizeable segment of its 5,000 members in 60 countries.

“CHIME members embrace AI as a force for good,” says the org’s board chair, Scott MacLean, chief information officer at MedStar Health, in introducing the report containing and explaining the 10 points to which, in CHIME’s view, attention must be paid. “By considering and using our AI principles, we can enhance patient care, empower clinicians and contribute to healthier communities.”

CHIME released the report this week. Here are excerpts.

1. Patient safety.

Everyone is a patient, and every patient deserves to receive care that best supports their individual healthcare needs. A one-size-fits-all policy approach to AI that is entirely sector-agnostic is unlikely to best support patient needs.

AI’s use in healthcare settings requires an added level of expertise and consideration, as patient care outcomes and lives are at stake.

2. Administrative efficiencies.

Whereas clinical tools can take years for clinicians to adopt into practice, uptake of administrative AI tools often doesn’t require clinical acceptance. This suggests a shorter path to these tools being widely used by healthcare delivery organizations.

It stands to reason that investments in AI tools designed to improve our sector’s efficiency could result in significant savings in time and cost.

3. Regulatory oversight.

As AI evolves in the way it is applied in healthcare, liability concerns are growing. Not least is the question of how much responsibility providers and clinicians should have to shoulder when they augment care delivery with the use of AI tools and there is an adverse patient outcome.

Regulatory oversight is needed, but it should not result in duplicative mandates or unnecessarily increase administrative burdens on providers and clinicians.

4. Innovation and research.

AI technology is rapidly evolving, and the clinical evidence base is still emerging. Comprehensive research—shared via peer-reviewed journals and made widely available to providers—is needed.

Common definitions are needed to foster a shared understanding across diverse applications within the healthcare sector to support research and promote consistency.

5. Discrimination, bias and equity.

Ongoing efforts are needed to identify and address bias in AI algorithms and ensure that these systems are trained on diverse and representative datasets.

The implementation of AI in healthcare should not be a one-time event but rather an iterative process.

6. Affordability.

Most providers lack the resources needed to purchase and deploy cutting-edge AI tools and applications. Many remain financially stretched and are still wrestling with workforce shortages, especially for healthcare IT employees.

Small and under-resourced providers will need additional support in adopting AI to keep the digital divide from widening.

7. Privacy.

When aggregated with other data, de-identified patient data can be re-identified.

AI products that use de-identified data in an attempt to reduce bias can inadvertently create privacy risks, such as the data being reverse-engineered to re-identify individuals.
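
To make the re-identification risk concrete, here is a minimal, hypothetical sketch of a linkage attack. It is not drawn from the CHIME report; the datasets, column names and values are invented for illustration. It joins a “de-identified” clinical extract with an outside list that shares quasi-identifiers (ZIP code, birth year, sex), re-attaching names to diagnoses.

    # Hypothetical linkage-attack illustration; all data invented.
    import pandas as pd

    # "De-identified" clinical extract: names removed, quasi-identifiers retained.
    deidentified = pd.DataFrame({
        "zip": ["60611", "60611", "90024"],
        "birth_year": [1958, 1972, 1958],
        "sex": ["F", "M", "F"],
        "diagnosis": ["type 2 diabetes", "hypertension", "atrial fibrillation"],
    })

    # Outside dataset with names attached (e.g., a voter roll or marketing list).
    public_list = pd.DataFrame({
        "name": ["A. Rivera", "B. Chen", "C. Okafor"],
        "zip": ["60611", "60611", "90024"],
        "birth_year": [1958, 1972, 1958],
        "sex": ["F", "M", "F"],
    })

    # Joining on the shared quasi-identifiers re-attaches identities to diagnoses.
    reidentified = deidentified.merge(public_list, on=["zip", "birth_year", "sex"])
    print(reidentified[["name", "diagnosis"]])

A handful of such attributes can be enough to single out an individual, which is the risk CHIME is flagging here.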

8. Cybersecurity.

Third parties that store, process and/or transmit protected health information on behalf of HIPAA-covered entities are critical to the healthcare sector. However, they routinely shift millions of dollars of liability for a cybersecurity breach back to healthcare delivery organizations during contract negotiations.

If we are to make meaningful improvements in our sector, this responsibility must be equally shared and cannot be borne by providers alone.

9. High-speed broadband.

Continuing to expand nationwide high-speed broadband is crucial for harnessing AI tools and reducing healthcare disparities.

To the degree that gaps persist in broadband access, widespread AI implementation will be impeded, worsening the digital divide.

10. Education and workforce.

Positioning the healthcare sector as an AI leader requires a well-trained and upskilled workforce. However, getting there must be balanced with maintaining a strong employment rate, avoiding significant job displacement and mitigating existing inequities.

Support by large technology companies, educators and policymakers is needed to manage the use of these new tools and the changing labor demand.

Download the full report.

 


The Latest from our Partners

What are the keys to a successful ambient AI pilot? - If you’re looking to pilot an ambient AI tool for clinical documentation, Nabla has identified 7 strategies for effective assessment programs that pave the way for successful deployments. Read the article here.

 

artificial intelligence in healthcare

Industry Watcher’s Digest

Buzzworthy developments of the past few days.

  • The Federal Trade Commission isn’t messing around. The FTC is prepared to use every tool at its disposal to stop crooks from using AI to hoodwink consumers and businesses. The agency spells out what it means in a comment submitted July 31 to an FCC notice of inquiry anticipating the benefits and harms to come from emerging uses of AI. The FTC says it will “consistently remind industry that there is no AI exception to consumer protection or antitrust laws. We stand ready to work with the FCC and other agencies—both state and federal—to advance the critical goal” of holding bad AI actors accountable. Context here, comment here.
     
  • Pretty close to 100% of business C-suiters expect AI to boost productivity. But there’s a disconnect in the house. More than three-quarters of workers feel the tools are only adding to their workloads. The findings are from a global survey by the staffing firm Upwork, which queried 1,250 executives, 625 salaried full-time employees and 625 gig workers. The non-leaders told the surveyors they spend considerable time reviewing or moderating AI-generated content (39%), learning how to use these tools (23%) and, crucially, being asked to do more work as a direct result of AI (21%). Results summary here.
     
  • The above will sound familiar to healthcare professionals. After all, it’s not news that digitizing health records was supposed to relieve clinicians of administrative duties but has, in fact, done the opposite. Still, some 65% of providers tell eClinicalWorks they believe AI will finally reduce the burden of clinical documentation. What’s more, when asked how much time AI medical scribes can help save, 51% replied two hours or more per day. More results here.
     
  • Detailed federal regulation for healthcare AI is on the way. That’s what we keep hearing. But lawmakers at the state level aren’t waiting. As noted by the National Conference of State Legislatures, in 2023, 11 states introduced legislation specifically related to healthcare AI. That was a bump from just three states the previous year. And the momentum has carried into 2024. “State legislators can play an important role in harnessing the positive capabilities of AI,” NCSL reminds, “while serving as a safeguard for the public against some of the potential negatives.”
     
  • Patients are largely untroubled by the thought of AI helping doctors make diagnoses. Just don’t ask those same healthcare consumers to interact with AI for themselves. The mixed feelings reflect the newness of the AI journey, especially with Gen AI in the news, according to Bain & Company, which is out with findings and analysis from the firm’s latest Frontline of Healthcare survey. “Clinician involvement,” the authors write, “will go a long way in addressing both patients’ and clinicians’ concerns and scaling winning applications.”
     
  • Who’s right? In this corner, those who believe AI represents an investment goldmine. In that corner, those who believe AI is overhyped and may be approaching bubble status. Looking on are the professional market watchers of Fisher Investments, who figure the truth lies somewhere near the middle of the mat. “We believe a disciplined approach to investing within AI-related sectors and companies is likely the prudent choice for most investors,” Fisher writes in a July 31 web post, “as the long-term winners in the AI technology race are nearly impossible to know today.”
     
  • Here might be the real final countdown. Two “alien-curious” scientists think it would be a good idea to send large language AI models into space. If an intelligent life form out there happens upon it, the being might “learn one of our languages, ask the LLM questions about us and receive replies that are representative of humanity.” These are not crackpot “scientists.” One is an astronomer with the Carl Sagan-founded SETI Institute. The other is with NASA. They make their case in an opinion piece published by Scientific American. It’s behind a paywall, but Futurism.com has posted a summary. (Sorry, can’t resist: “We’re heading for Venus and still we stand tall / ’Cause maybe they’ve seen us and welcome us all …”)
     
  • Recent research in the news:
     
  • From AIin.Healthcare’s news partners:
     

 


Innovate Healthcare thanks our partners for supporting our newsletters.
Sponsorship has no influence on editorial content.

Interested in reaching our audiences? Contact our team.


You received this email because you signed up for newsletters from Innovate Healthcare.
Change your preferences or unsubscribe here

Contact Us  |  Unsubscribe from all  |  Privacy Policy

© Innovate Healthcare, a TriMed Media brand