Industry Watcher’s Digest

Buzzworthy developments of the past few days.

  • AI isn’t a cure for everything that ails healthcare. At least, not in the sense implied by snake oil salesmen pushing suspect wares. However, healthcare AI is becoming a critical investment for countries that lack qualified clinicians and need assistance with medical diagnostics and decision-making at scale. Julian Jacobs, a PhD candidate focused on comparative political economics, makes the point in a piece published April 25 by the Center for Data Innovation. Healthcare AI is no slouch at making a difference here in the U.S. either, Jacobs suggests. “As demographic change and aging populations in many Western countries entail higher relative healthcare burdens,” he writes, “AI’s support in diagnosis, drug development and healthcare operations may serve as a much-needed remedy.” Read the piece.
     
  • How do healthcare AI developers (and buyers) stay ahead of the regulatory curve? Attorneys at the Nixon Gwilt law firm in the D.C. area pose the question and answer it in a helpful blog post. While healthcare stakeholders wait for formal legislative and agency action—meaning new laws and regs—“we can draw inferences about what to expect” from past precedents and other clues, write founding partner Rebecca Gwilt, Esq., and staff counsel Samuel Pinson, Esq. They call their guidelines the “Sharp-ENF” principles because, once adopted, they can make adopters “sharp enough” to navigate today’s regulatory pitfalls. Check it out.
     
  • Healthcare providers are feeling the pinch of inflation. It’s forcing many to wring yet more efficiencies from their payments and receivables operations. An exec with Bank of America tells Pymnts.com why digitization, presumably with AI where feasible, is increasingly important to maintaining healthy cash flows. “When you sit down with the directors of payments at large healthcare institutions,” says the banker, Galen Robbins, “they are asking the same things that their counterparts in consumer and retail [industries] are asking.” Interview video here.
     
  • GenAI resembles blockchain like this: Both do a lousy job at “much of what people try to do with them, they can’t do the things their creators claim they one day might, and many of the things they are well-suited to do may not be altogether that beneficial.” Hooboy. Anything else? Well, “AI tools are more broadly useful than blockchains—[but] they also come with similarly monstrous costs.” The opinion is solely that of software engineer and tech critic Molly White. Writing in her newsletter Citation Needed, White puts a fine point on her central point: The benefits of GenAI, worthwhile as some of them are, “pale in comparison to the costs.” Read it all.
     
  • Not everyone sees things that way. Take Aissa Khelifa, CEO of AI software supplier Milvue. “It is highly likely that, in the near future, healthcare will undergo a major transformation due to the use of generative AI,” Khelifa said at a recent roundtable discussion in Europe. More quotes from roundtable speakers here.
     
  • Generalizability—or, more specifically, the frustrating lack thereof. That’s the answer to a hard question a lot of people have about AI in healthcare. The question: What persistent concern keeps darkening AI’s rising star across medical science? The disappointment surfaces even in clinical studies proving the technology’s prowess at various discrete clinical aims. The dilemma is brought to light in a new review of the literature led by Harvard researchers and published in The Lancet Digital Health. “[T]he generalizability of AI applications remains uncertain,” the researchers remark in their discussion section. These almost-there results are a challenging fact to reckon with, as the “true success of AI applications ultimately depends on their generalizability to their target patient populations and settings.” The study is available in full for free.
     
  • Developers and users of healthcare AI should know about this. HHS’s newly revised rule covering “health equity” takes a couple of things further than before. For one, it applies the nondiscrimination principles under the relevant section of the Affordable Care Act, Section 1557, to healthcare workers who use AI and other decision-support tools for clinical decision-making. It also codifies that Section 1557’s prohibition against discrimination based on sex includes LGBTQI+ patients. HHS general heads-up here, Section 1557 particulars here, soon-to-be published final rule available for downloading here.
     
  • Steve Jobs had his parents’ garage. Jensen Huang had his neighborhood Denny’s. The comparisons are inevitable now that 60 Minutes gave the star treatment to the Nvidia co-founder and leader who’s become an AI superstar of sorts. “We came here, right here to this Denny’s, sat right back there, and the three of us decided to start the company,” Huang tells reporter Bill Whitaker. “Frankly, none of us knew how to do anything.” View the segment.
     
  • Recent research news of note:
     
  • From AIin.Healthcare’s news partners:
     
Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
