Patients are strongly inclined to follow treatment instructions that combine innovative AI recommendations with a physician’s reassuring presence.
Then again, they’re no less open to such direction when it comes from a physician alone.
However, directions from AI alone are far less likely to drive adherence.
It follows, then, that individuals will tend to stick with rather than slough off doctors’ orders—thereby improving odds of good outcomes—when “human expertise [is] central to the diagnosis and the treatment recommendation.”
So concluded researchers at the Technical University of Munich in Germany after working with 452 American volunteers.
Lead author Michaela Soellner, PhD, and senior author Joerg Koenigstorfer, PhD, randomly assigned the participants sets of imaginary scenarios involving skin cancer diagnoses of varying severities.
Next, they “prescribed” the subjects oral and topical drug therapies and asked how likely they’d be to follow through on directions coming from a physician, a physician using AI, or an automated AI tool.
Using regression analyses to test hypotheses and beta coefficients to characterize correlations between predictors and outcome variables, Soellner and Koenigstorfer arrived at three key conclusions:
- When a physician performs the assessment (vs. automated AI), the perception that the physician is real and present (a concept called social presence) is high, which increases intention to follow the recommendation.
- When AI performs the assessment (vs. physician only), perceived innovativeness of the method is high, which increases intention to follow the recommendation.
- When physicians use AI, social presence does not decrease and perceived innovativeness increases.
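The analysis the authors describe—regressing intention to adhere on the perceived qualities of the assessment and reading off standardized beta coefficients—can be sketched in a few lines. The data below are synthetic and the variable names and effect sizes are illustrative assumptions, not the study’s actual measures or results:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 452  # matches the study's sample size; the data themselves are simulated

# Hypothetical 1-7 scale ratings of the two proposed mediators
social_presence = rng.uniform(1, 7, n)
innovativeness = rng.uniform(1, 7, n)

# Hypothetical outcome: intention to follow the recommendation,
# driven positively by both predictors plus random noise
intention = 0.5 * social_presence + 0.3 * innovativeness + rng.normal(0, 1, n)

def standardized_betas(X, y):
    """OLS on z-scored variables; the slopes are then beta coefficients,
    i.e., expected change in y (in SDs) per 1-SD change in a predictor."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    Xz = np.column_stack([np.ones(len(yz)), Xz])  # intercept column (~0 after z-scoring)
    coef, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return coef[1:]  # drop the intercept

X = np.column_stack([social_presence, innovativeness])
betas = standardized_betas(X, intention)
print(dict(zip(["social_presence", "innovativeness"], betas.round(2))))
```

Because both predictors were built with positive effects, both betas come out positive, mirroring the direction of the associations the authors report; the magnitudes here reflect only the simulation’s chosen coefficients.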
The authors note the consistency of these findings with those in previous research showing that AI is “best used to supplement human expertise, potentially benefitting clinical skills and enriching patient-physician interactions.”
Poorer health, lower adherence
Their present work, they suggest, advances the medical AI literature in three ways.
First, it extends social presence theory by “proposing two pathways for an individual’s compliance with treatment recommendations via social presence and perceived innovativeness.”
Second, it reaffirms previous studies showing patients are more likely to comply with physicians’ directions when prognoses are relatively good. “This is in line with previous findings [showing] those who are worse in health are less likely to be adherent,” the authors point out.
Third, it buttresses the intuition that, in general, patients are skeptical about using computers to help make medical decisions.
Best of both worlds
Soellner and Koenigstorfer acknowledge several limitations inherent to their study design, including its reliance on self-reported intentions rather than verifiable actions and its lack of a mechanism to separate AI diagnostics from AI treatment guidance.
Nevertheless, they suggest, their findings can help inform healthcare providers considering AI tools for clinical practice:
“[T]he findings of the present study in combination with [other] AI research in healthcare might help develop practice guidelines for cases where AI involvement benefits outweigh risks, such as using AI in pathology and radiology, to enable augmented human intelligence and inform physicians about diagnoses and treatments. Physicians therefore have the option to integrate and utilize suitable hardware and software that combine the expertise of AI technology with the physician’s expertise.”
The study was published Aug. 6 in BMC Medical Informatics and Decision Making and is available in full for free.