Industry Watcher’s Digest

Buzzworthy developments of the past few days. 

  • If you’re not yet waist-deep in development of GenAI products, wait no longer to wade in deeper. That’s a bit of complimentary advice for software vendors, and it comes from Bain & Company. In an extensive new report, the big business consultancy notes how easy it is to get lost in the hype around generative AI. But look at the private equity investors who are excelling in the category. They’re doubling down on plans for GenAI tools known to bring measurable benefits. And they’re envisioning ways to “enhance or reimagine” product offerings without getting carried away by big dreams of lucrative—and quick—ROI. “AI needs to be part of long-term strategic planning for any software business, both in terms of offensive and defensive moves,” the authors write. “Right now, though, it is critical to get moving on piloting and deploying these technologies in the areas that will pay off today.” Bain has posted excerpts here and the full report here.
     
  • There’s no shame in admitting you don’t have a good answer. Try telling that to the latest generation of chatbots built on large language models. A new study shows this crop tends to give incorrect answers when it would do better to confess ignorance. Worse, the same research found people a little too eager to accept iffy answers as authoritative. The bots’ proclivity for giving opinions beyond the scope of their abilities “looks to me like what we would call bullshitting,” Mike Hicks, a philosopher of science and technology at the University of Glasgow, tells Nature. GenAI, he adds, “is getting better at pretending to be knowledgeable.”
     
  • Doctors using GenAI to draft messages for patients can be a pretty good thing. Doctors sending these messages without checking for accuracy can be a Very Bad Thing. Athmeya Jayaram, PhD, a researcher at the Hastings Center, a bioethics research institute in Garrison, N.Y., nails the nub of the problem. “When you read a doctor’s note, you read it in the voice of your doctor,” he tells the New York Times. “If a patient were to know that, in fact, the message that they’re exchanging with their doctor is generated by AI, I think they would feel rightly betrayed.” And if the message includes errors, inaccuracies or misleading advice—see item immediately above—the ultimate outcome could be a lot worse than feelings of betrayal. 
     
  • It’s been said before and will be said again: Data for training AI is a finite resource. That may not seem possible, given the mountains of multimodal content being created and posted digitally every day. But when it comes to training AI, quality matters as much as—if not more than—quantity. “Access to quality data is the lifeblood of AI innovation,” Lisa Loud, executive director of the privacy and open-source advocacy group Secret Network Foundation, tells The Street. “Better data doesn’t just enhance AI, it ensures its relevance and fairness.” Read the article.
     
  • A database is a database. Unless it’s a vector database. In which case it can handle generative AI tasks with particular aplomb. That’s because vector databases “focus on the unstructured, feature-rich vectors that AI systems feed off,” a contributing writer at InfoWorld explains in a feature posted Sep. 23. “Driven by the growing importance of vector search and similarity matching in AI applications, many traditional database vendors are adding vector search capabilities to their offerings.” Read the whole thing. (For a bare-bones sense of what that similarity matching boils down to, see the short sketch following the items below.)
     
  • HHS is adding a division to offer technical expertise with special focus on AI. Micky Tripathi, PhD, whose HHS titles include acting chief AI officer, announced the change at a health IT summit this month. “We will have teams that will provide digital services and technical assistance to all of our operating and staffing divisions so that they don’t have to worry about going out and hiring teams for that kind of expertise,” Tripathi, who will oversee the new division, told attendees, according to GovCIO. “They will be on demand and will help with consulting and the enablement of technologies.”
     
  • The generative AI vendor whose feet are being held to the proverbial fire by Texas’s attorney general is speaking out. AG Ken Paxton alleged Pieces Technologies unlawfully exaggerated its algorithm’s accuracy at writing clinical notes and documentation, misleading several hospitals in the Lone Star State. Pieces agreed to settlement terms that included no punitive measures but must have smarted anyway. At the time, a Texas TV station called the case a “first-of-its-kind investigation into AI in healthcare.” Now comes a prepared statement from Pieces claiming that a press release from the AG’s office “misrepresents the Assurance of Voluntary Compliance (AVC) into which Pieces entered.” Pieces adds: “The AVC makes no mention of the safety of Pieces products, nor is there evidence indicating that the public interest has ever been at risk.” HIPAA Journal has more.
     
  • What a week for OpenAI. The company announced it will partially restructure as a for-profit corporation and, in the process, extend equity to CEO Sam Altman. Maybe relatedly, or maybe not, one of Altman’s top lieutenants, CTO Mira Murati, resigned. Partially is a hedge word: the plan appears to keep a nonprofit arm in place, one that will hold a minority ownership stake in the for-profit corporation. OpenAI said in prepared remarks that it “remain[s] focused on building AI that benefits everyone, and we’re working with our board to ensure that we’re best positioned to succeed in our mission. The non-profit is core to our mission and will continue to exist.” Press coverage is everywhere you’d want it to be.
     
  • Recent research in the news: 
     
  • Notable FDA approvals:
     
  • Funding news of note:
     
  • From AIin.Healthcare’s news partners:
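
A quick aside on the vector database item above: the core operation the InfoWorld piece describes, similarity matching over feature vectors, can be illustrated in a few lines. What follows is a toy, in-memory sketch of that idea only; the class name ToyVectorStore, the sample item IDs and the three-dimensional vectors are all made up for illustration and are not drawn from the article or any vendor’s product. Real vector databases (and the traditional databases bolting on vector search) add indexing, persistence and approximate search at far larger scale.

# Toy illustration of vector similarity search: store feature vectors,
# then rank them by cosine similarity to a query vector.
import numpy as np

class ToyVectorStore:
    """Minimal in-memory store doing exact cosine-similarity search."""

    def __init__(self):
        self.ids = []
        self.vectors = []

    def add(self, item_id: str, vector: list[float]) -> None:
        # In practice these vectors would be embeddings produced by a model.
        self.ids.append(item_id)
        self.vectors.append(np.asarray(vector, dtype=float))

    def search(self, query: list[float], top_k: int = 3) -> list[tuple[str, float]]:
        q = np.asarray(query, dtype=float)
        matrix = np.vstack(self.vectors)
        # Cosine similarity: dot product divided by the vectors' norms.
        sims = matrix @ q / (np.linalg.norm(matrix, axis=1) * np.linalg.norm(q))
        best = np.argsort(sims)[::-1][:top_k]
        return [(self.ids[i], float(sims[i])) for i in best]

# Hypothetical items with made-up three-dimensional stand-in embeddings.
store = ToyVectorStore()
store.add("discharge-note", [0.9, 0.1, 0.0])
store.add("radiology-report", [0.2, 0.8, 0.1])
store.add("billing-record", [0.1, 0.2, 0.9])

print(store.search([0.85, 0.15, 0.05], top_k=2))

In a production system the query vector would come from the same embedding model used to encode the stored items, and exact search like this would give way to approximate nearest-neighbor indexing as the collection grows.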
     

 

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
