RSNA 2019: How to move AI from the drawing board to clinical practice

Implementing AI solutions in clinical practice can be a long, challenging process, according to a presentation at RSNA 2019 in Chicago. When the time comes, some providers will have the patience to make it all the way to the finish line, while others won't.

“Walking around at RSNA, you see a lot of very interesting technology,” said Jeroen van Duffelen, co-founder and COO of Aidence. “But before you get that to work for your team, you have to deal with a lot of hurdles.”

Van Duffelen, speaking Dec. 4 at RSNA's AI Theater, noted that it typically takes a full year, or even longer, to get AI solutions up and running. The first crucial step involves defining your organization's needs. For instance, many healthcare providers are dealing with significant workload issues or are worried about keeping up with newly passed government regulations.

“You need to think about what you are actually hoping to solve with AI,” he said. “There are so many vendors out there, but are you wanting to try the technology because it’s new and interesting or are you truly looking for a solution?”

Securing a budget is another step providers need to think about early, one that is closely tied to defining your needs. Once you know the problem you are working to address, you can begin to estimate how much solving that problem will cost.

Choosing the right vendor is also crucial, and it's a step that should never be rushed. The selection process should include researching regulatory certifications and clearances and how each potential vendor plans to integrate its solution into your team's workflow. Van Duffelen recommended sending test scans, without risking any patient's privacy, of course, to a few different vendors to see how your needs would be met in a variety of scenarios.

“It’s easy to validate a vendor by sending some of your more challenging scans,” he said. “We’re seeing this more and more these days. Hospitals will come to us interested in our solution and they’ll provide us with five to 10 different scans—some simple ones, some very hard cases—to validate how our algorithm works.”

When it's time to assess each solution's performance, van Duffelen explained that it's more important to focus on how it might impact the radiologists' quality of life and reading times than on statistics such as accuracy or specificity. Those hard numbers will be available through the clinical studies these vendors provide, but knowing how the solution will impact your team in real time is something you'll have to determine through in-depth conversations and analysis.
