In 2018, almost three-quarters of Americans believed AI would eliminate more jobs than it stood to create. Almost a quarter expected healthcare to be among the earliest and hardest hit of all employment sectors. However, in 2023, McKinsey & Co. projected overall demand for healthcare workers to grow by 30% by 2030.
That’s one of a number of points on healthcare AI that can be viewed from two conflicting perspectives: glass half-full and glass half-empty.
The points are fleshed out in commentary posted by staff analysts at Health IT Analytics. Here are five more.
1. As recently as 2021, radiologists were being told their days were numbered: AI was still coming for their jobs.
However, it’s now apparent that there aren’t enough radiologists—or pathologists, surgeons or PCPs. In fact, name the specialty and it’s probably facing a physician shortage. Meanwhile, AI can help retain clinicians as it helps “alleviate the stresses of burnout that drive healthcare workers to resign,” the Health IT Analytics authors write.
2. Concerns persist that clinicians may become de-skilled by relying on AI and related technologies for various clinical tasks.
However, this scenario is unlikely to materialize on a broad scale since automation bias isn’t new to healthcare—and since time-tested strategies exist to ward it off.
3. Healthcare consumers are increasingly comfortable with the notion of AI as a tool for improving care delivery.
However, a research letter published in JAMA Network Open in 2022 showed that two-thirds of the 1,000 adults surveyed considered it “very important” for providers to inform patients when AI is used in their care for any reason.
4. Recent research shows patients strongly prefer to have consequential care tasks—prescribing medications, diagnosing skin conditions, those kinds of things—performed by human experts.
However, as the Health IT Analytics authors point out, “whether patients and providers are comfortable with the technology or not, AI is advancing in healthcare. Many health systems are already deploying the tools across a plethora of use cases.”
5. If health data is to be safeguarded for use in AI, privacy laws and regulations must be updated. The potential for supposedly de-identified data to be re-identified is an attention-grabbing concern, and it’s not the only thing to worry about.
However, AI “falls into a regulatory gray area, making it difficult to ensure that every user is bound to protect patient privacy and will face consequences for not doing so.”
On the latter point, the authors comment that security and privacy “will always be paramount, but this ongoing shift in perspective—as stakeholders get more familiar with the challenges and opportunities of data sharing—is vital for allowing AI to flourish in a health IT ecosystem where data is siloed and access to quality information is one of the industry’s biggest obstacles.”
The article also looks at light and shade in the areas of ethics, responsibility and oversight with regard to healthcare AI.
Read the whole thing.