News You Need to Know Today

AI vs. substance abuse | Partner voice | AI-enabled layoffs, following the AI money, AI to unscrew healthcare, more

Wednesday, September 10, 2025

In cooperation with Nabla


Healthcare AI today: AI-enabled layoffs, following the AI money, AI to unscrew healthcare, more

 

News and views you ought to know about:

  • A network of more than 100 clinics is leveraging AI to make deep cuts to its nonclinical workforce. Provo, Utah-based Revere Health will lay off more than 175 of its roughly 2,500 nonclinical staff by early November. Revere, the biggest physician-led healthcare organization in the Beehive State, tells a local TV news station it’s able to make the move thanks to its work with Texas-based tech company IKS Health. The partnership will “introduce advanced technological capabilities including machine learning and automated claims processing to optimize billing, collections and denial prevention,” Revere said. “[W]e’re implementing a comprehensive suite of technology solutions that will allow us to operate more efficiently and effectively in response to increasing industry demands.”
     
    • Revere Health says it will send off the newly jobless with transition assistance, dedicated support and generous severance packages. “These are valued members of our team, and we recognize this change affects not just employees but their families as well,” the org tells the TV station. “We will help each affected team member navigate this transition with dignity and care.” 
       
    • The organization counts approximately 400 physicians and advanced practice providers in its workforce, though nothing in The Salt Lake Tribune’s coverage of the layoffs suggests any clinicians will face pink slips.
       
  • Want to know where AI is going to make the biggest difference in healthcare? Follow the money. Unless philanthropists fund models aimed at broadening access, products that “push frontiers” will race ahead. And those tend to make already-excellent clinicians even better for the well-covered rather than “democratizing expertise” across the U.S. population. That’s the prediction—or warning—from I. Glenn Cohen, JD, a Harvard Law professor widely regarded as one of the top minds on the legal questions raised where biotechnology meets medical ethics. Cohen offers the view in an interview with The Regulatory Review.
     
    • The outlet also asks Cohen who he believes should be held liable when medical AI has a hand in harming patients. “It’s complicated,” he responds, acknowledging the lawyerly bent of that disclaimer. “[T]he answer will be different for different kinds of AI. Physicians ultimately are responsible for a medical decision at the end of the day, and there is a school of thought that treats AI as just another tool, such as an MRI machine, and suggests that physicians are responsible even if the AI is faulty.” But that’s only one school of thought. Read the full Q&A.
       
  • U.S. healthcare has been trying for years to wring valuable insights out of its massive data stores. From here on out, today’s AI will do the heavy twisting. That’s because generative AI models that can meaningfully guide users—patients and providers alike—are not only ubiquitous but also easy to use. In a sense, they’re “making really non-linear improvements in terms of what’s possible,” says Mary Varghese Presti, corporate vice president at Microsoft Health & Life Sciences. Presti made the observation at a Microsoft alumni event Sept. 9 in a panel discussion with former Microsoft exec Terry Myerson. Aided by AI, patients will soon review their own medical records and comprehend the gist of the gobbledygook, Myerson suggested. They’ll “have a conversation that includes all of the knowledge of the published literature,” he added. “That empowerment of the individual just seems so righteous to me.” 
     
  • Around the world, AI is changing healthcare from a diagnose-and-treat endeavor to a predict-and-prevent one. At a healthcare AI event in the United Arab Emirates this week, one eminent speaker said the “story of healthcare” is being “rewritten before our eyes.” The speaker was Mubaraka Ibrahim, the CIO of Emirates Health Services. “Ambient AI is already reducing documentation by 41%, giving clinicians 22% more time with their patients,” she enthused. “This is not just efficiency—it is humanity restored to the heart of care.”
     
  • Not everyone is convinced the AI revolution has arrived. “It’s still very common for patients to get shut down if they bring forward data themselves, be it from ChatGPT, Dr. Google [or] Dr. AI,” says Jen Horonjeff, founder and CEO of the patient advocacy group Savvy Cooperative. “If I’m trying to share my Oura ring data or Apple Watch data, I know that with [one] provider I can talk about it [while another] provider will roll their eyes,” Horonjeff said at Health Datapalooza in the nation’s capital last week. “I literally had somebody tell me I’d be better off spending that money on a pair of shoes than trusting any of that kind of stuff.” Fierce Healthcare has a quote roundup from the event. 
     
  • Nobody’s perfect. And neither is any AI model. These facts are unlikely to change anytime soon. So the question that wielders of AI in healthcare should ask themselves is: How much imperfection—or outright error—are we willing to tolerate? The riddle is raised by a medical ethicist who focuses on how AI, among other forces, is changing the specialty furthest ahead with the technology: radiology. “One day, it might indeed be the case that AI technologies will largely replace human clinicians,” muses the expert, John Banja, PhD, in a Sept. 2 blog post. “In the meantime, let us hope that we will have the best science, the best boots-on-the-ground opinion, and strong and Hippocratically-committed leaders to guide us so as not to sacrifice either Hippocratic ideals or radiologist well-being for the Mammonic temptations of the healthcare marketplace.” The post is the finale in a series of eight Banja dedicated to radiologist well-being. His thoughts zeroing in on AI in that context are here and here.
     
  • Anyone can say our healthcare system is f**ked. But not everyone who would put it that way comes from a family of physicians, has made a billion bucks as an entrepreneur and is ponying up $200M to build a medical outfit that won’t charge patients who can’t afford to pay. The Wyoming facility’s secret sauce? AI mixed with blockchain. “Let’s build a clinic where we put the patient at the center,” says the potty-mouthed but high-minded investor, Charles Hoskinson, founder of the cryptocurrency company Cardano. “We build care teams, we use AI and we do everything in our power to try to just make it patient-centered care that’s affordable.” CoinDesk has an interview.
     

 


The Latest from our Partners

The ambient AI playbook: Lessons from two leading health systems

At the recent CompassionIT Summit, leaders from Akron Children’s Hospital and Denver Health shared powerful lessons from rolling out ambient documentation to over 1,500 clinicians. Their biggest takeaway? Stories, not stats, drive adoption. Whether it was a heartfelt testimonial that swayed an entire department or a 60-second Nabla demo that eliminated training anxiety, the common thread was simplicity, authenticity, and clinician-centered design. Read more about the way these health systems are navigating ambient AI implementation: https://dhinsights.org/news/the-ambient-ai-playbook-lessons-from-two-leading-health-systems


Addiction medicine eager to embrace AI

People with substance use disorders stand to benefit from healthcare AI just like any other patients. But they may have to wait a bit longer than most others since AI has only just begun to emerge in addiction medicine. 

An addiction specialist looks at the technology across healthcare, including his own professional backyard, in a paper surveying AI’s advance into clinical diagnostics and decision-making. Primary Care: Clinics in Office Practice published the work this month. 

Within addiction care, AI is being used to “predict relapse risk, identify patients who may benefit from specific treatments and improve patient outcomes in addiction recovery,” writes Nicholas Conley, MD, medical director at Apex Recovery Rehab in Franklin, Tennessee. 

Here’s a sampling of what else Conley observes about AI in his field and beyond. 

1. AI systems in addiction medicine focus on predicting relapse and tailoring interventions. 

For example, Conley notes, AI models that analyze patient behaviors, including social media activity and mobile phone usage, can detect early warning signs of relapse in patients with substance use disorders.

‘These systems provide real-time feedback to clinicians, enabling early interventions that can prevent relapse and improve treatment adherence.’

2. AI tools are being used to personalize addiction treatment. 

Machine learning models analyze patient data to recommend the most effective therapeutic interventions, such as cognitive-behavioral therapy, medication-assisted treatment or group therapy, Conley points out. 

‘By integrating data on past treatments, psychiatric conditions and social factors, AI models can optimize treatment plans for individuals, improving the chances of long-term recovery.’

3. As AI technologies mature, their role will expand across multiple specialties, offering diagnostic support, predictive analytics and personalized care. 

“However,” Conley writes, “realizing this potential requires careful attention to the development of AI systems that are transparent, explainable and aligned with the needs of healthcare providers.”

‘By fostering a collaborative relationship between AI and clinicians, AI-driven decision support will enhance, rather than replace, the role of healthcare professionals, creating a future where both technology and human expertise work together to provide better patient care.’

The paper is posted here.

 

 

 


Innovate Healthcare thanks our partners for supporting our newsletters.
Sponsorship has no influence on editorial content.

Interested in reaching our audiences? Contact our team.


You received this email because you signed up for newsletters from Innovate Healthcare.

© Innovate Healthcare, a TriMed Media brand