Implementing point-of-care AI? 5 firm facts to keep in mind

If healthcare AI is to flourish outside of academic research settings and industry R&D departments, it will need to win over its most difficult-to-impress audience: healthcare workers in hospitals.

And that’s not going to happen unless these end users are offered three things: early exposure to algorithm development, needs-adjusted training and adequate operational infrastructure. (The latter includes IT resources, technical support, internet access and the like.)

The assertions come from a literature review conducted at Germany’s RWTH Aachen University and published this month in npj Digital Medicine.

The authors reviewed 42 peer-reviewed articles. Gauging end-user acceptability according to the Unified Theory of Acceptance and Use of Technology (UTAUT), the team identified a variety of “facilitating and hindering” factors affecting AI acceptance in the hospital setting. Among the standout themes to emerge from the exercise, along with key researcher quotes:  

  1. Patient safety is rightly critical to this crowd. “Although it can be stated that AI-based prediction systems have shown to result in lower error rates than traditional systems, it may be argued that systems taking over simple tasks are deemed more reliable and trustworthy and are therefore more widely accepted than AI-based systems operating on complex tasks such as surgical robots.”
     
  2. Human factors matter. “More experienced healthcare professionals tend to trust their knowledge and experience more than an AI system. Consequently, they might override the system’s recommendations and make their own decisions based on their personal judgement.”
     
  3. Time isn’t infinite. “Physicians might accept an AI system such as a clinical decision support mechanism if they witness that it might reduce their workload and assist them. In order to facilitate the acceptance and thus implementation of AI systems in clinical settings, it is of utmost importance to integrate these systems into clinical routines and workflows, thereby allowing the AI to reduce the workload as well as the time consumption.”
     
  4. Medical specialties are unequally inclined to embrace AI in clinical practice. “AI’s establishment in radiology and relative rareness in many other areas of medicine raises the question of whether radiologists are more technically inclined and specialize on the basis of this enhanced interest—or whether innovations of AI in radiology are more easily and better integrated into existing routines and are therefore more widely established and accepted.”
     
  5. Reasons for limited acceptance of AI among healthcare professionals are many and varied. “Personal fears related to a loss of professional autonomy, lack of integration in clinical workflow and routines and loss of patient contact are reported. Also, technical reservations such as unintuitive user interfaces and technical limitations such as the unavailability of strong internet connections impede comprehensive usage and acceptance of AI.”

The authors conclude that, to maximize acceptance of AI among hospital-based healthcare workers, leadership must acknowledge that resistance to the technology is understandable while pinpointing the specific concerns in play.

They write:

“Once the causes of hesitation are known and personal fears and concerns are recognized, appropriate interventions such as training, reliability of AI systems and their ease of use may aid in overcoming the indecisiveness to accept AI in order to allow users to be keen, satisfied and enthusiastic about the technologies.”

The study is available in full for free.

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
