Industry Watcher’s Digest
Buzzworthy developments of the past few days.
- What is Stargate and why is it here? That would be the multibillion-dollar AI joint venture announced from the Oval Office Tuesday. It’s here to do everything from beating China in the AI race to finding a cure for cancer. The initiative will start with massive data centers in Texas on the strength of $100 billion in investments from OpenAI, SoftBank and Oracle. The hope is that other deep-pocketed private companies will join before long, geographically dispersing the project and hugely boosting the infusion of capital, ideally to something like $500 billion. “Put that name down in your books,” President Trump said of Stargate at the announcement, “because I think you’re going to hear a lot about it in the future.” Blanket media coverage.
- Conduct that is illegal without AI is just as unlawful when AI is involved. And the fact that AI is involved is not a defense against liability under any law. The reminder comes courtesy of the California Attorney General’s Office. While the communiqué is aimed at Californians, it’s healthcare-specific and thus worth a look for healthcare stakeholders living in the other 49 states. “Healthcare entities that develop or use AI should not wait to ensure that they comply with all state, federal and local laws that may apply to their use of AI,” the office states in a legal advisory issued Jan. 13. “That is particularly so when AI is used or developed for applications that carry a potential risk of harm to patients, healthcare systems or the public health writ large.”
- The notion of AI curing cancer might not be just for pipe dreamers. A recent survey of 2,000 people found that 53% believe the dread disease will finally meet its match as medical AI gets smarter. “Now we can predict protein structures with unprecedented accuracy, reducing the need for physical experiments,” pharmacist/entrepreneur Max Votek tells Digital Journal. “What once seemed impossible is now just one step in the broader process of AI-driven cancer therapies—an extraordinary leap forward in the fight against cancer.”
- Virtual staining. Synthetic data. Quality control. Reflex testing. Just a few of the good things anatomic pathologists are banking on as GenAI continues seeping into their specialty. “Pathology is entering a new era, where generative AI doesn’t just have the potential to assist pathologists,” says Victor Brodsky, MD, lead author of a study teasing the possibilities in Archives of Pathology & Laboratory Medicine. “[I]t should be able to efficiently amplify their expertise, transforming how diseases are diagnosed, treated and understood.” Journal study here, news item from the College of American Pathologists here.
- Investments in biotech AI soared to $5.6 billion in 2024. That’s almost three times as much as in 2023, and biotech is a healthcare-adjacent sector of the economy. But those figures are not the whole story. “Thin margins and an uncertain reimbursement and payment environment are always weighing on buying decisions for healthcare organizations,” says Phil Neuhart of First Citizens Wealth, who’s quoted in Silicon Valley Bank’s annual report on healthcare investments and exits. “Pharma AI is probably still the exception to every investing rule. AI has the potential to be transformative across the board in tech, but it can be hard to tell if the potential gains are worth the costs.” Preview the SVB report here.
- Is anyone immune from automation bias? Probably not. Its spread will only become more problematic as young physicians enter the healthcare workforce never having known healthcare without AI. The phenomenon can even sway those inclined to distrust an algorithm when, say, it flags a CT scan for a serious injury that the interpreting radiologist doesn’t quickly see. As one young rad puts it for Medscape: “If a tool already told you that [the scan] is positive, it’s going to kind of change the way you look at things.” Noting that AI conditions even experienced clinicians to trust machines, at least up to a point, the author of the article asks: “Is overreliance on decision-based medical technology inevitable?” Read the piece.
- Think of an AI model as a student. Its textbooks are the data. Now apply that word picture specifically to AI in the form of large language models. Pymnts does just that and ends up with an easy LLM primer for businesspeople. A large language model is an AI model trained on vast amounts of text, “such as the entire internet,” the outlet reminds. “It’s as if someone has read millions of books, articles, blogs and messages in a dataset. The AI model learns to find statistical relationships between words and phrases through this training.” Hey, everyone could use a refresher on the basics every now and then. Get this one here. (For a tiny, concrete illustration of those statistical relationships, see the sketch at the bottom of this digest.)
- In Kenya, just 200 or so radiologists serve a population of more than 55 million people. Those 200 are understandably looking to AI for help. Happily, they’re finding it. “AI is helping us bridge the gap in diagnostic services, especially in areas with limited specialists,” Nairobi radiologist Peter Njoroge tells Business Daily. “This technology is a game-changer for rural healthcare.” Read the rest.
- Watch for enterprise AI applications to pick up the pace in healthcare. And for AI-powered digital pathology to step up its game in personalized medicine. Those are two of four insights the American Hospital Association picked up at last week’s J.P. Morgan Healthcare Conference in San Francisco. AHA’s recap of the high points makes for a fast but rewarding read. Have at it.
- From AIin.Healthcare’s news partners:
- Cardiovascular Business: FDA regulator examines AI's growing influence in cardiology
- Radiology Business: AI can help radiology standardize image exam data labeling
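A quick aside for the LLM primer above: here is a minimal, hypothetical Python sketch of the “statistical relationships between words” idea. It is not how real LLMs are built (those train transformer neural networks on enormous datasets); it simply counts, from a tiny made-up corpus, how often each word follows another, which is the most stripped-down version of the concept.

```python
from collections import defaultdict, Counter

# A tiny made-up corpus standing in for the "textbooks" an AI model reads.
corpus = (
    "the patient was given aspirin . "
    "the patient was discharged . "
    "the doctor was paged ."
)

# Count how often each word is followed by each other word.
following = defaultdict(Counter)
words = corpus.split()
for current_word, next_word in zip(words, words[1:]):
    following[current_word][next_word] += 1

# Turn the counts into probabilities: P(next word | current word).
def next_word_probabilities(word):
    counts = following[word]
    total = sum(counts.values())
    return {w: n / total for w, n in counts.items()}

print(next_word_probabilities("the"))  # {'patient': ~0.67, 'doctor': ~0.33}
print(next_word_probabilities("was"))  # 'given', 'discharged', 'paged' each at ~0.33
```

A real LLM does this kind of thing at vastly greater scale and with far richer context than a single preceding word, which is why it can produce fluent prose rather than word salad.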