
Doctor Data: New AI Algorithm Helps Clinicians Spot Critical Conditions At Point Of Care

Anyone who has ever watched a medical show knows the word “stat” — it’s what doctors and nurses shout when a patient is in dire need of immediate care. The snappy appeal makes for good TV drama, but it has little in common with the real world, where medical professionals are fighting stat fatigue.

“When an X-ray is taken on a patient, especially a patient who’s suffering from an emergent condition or a potentially life-threatening condition, the time that it takes to process, have someone read that and have the image actually come into a queue is a really important time period where minutes and hours matter,” says Dr. Rachael Callcut, associate professor of surgery at the University of California, San Francisco Medical Center and director of data science at UCSF’s Center for Digital Health Innovation.

This is particularly a problem in radiology. Radiologists often have to review hundreds of exams per day — many of which are labeled “stat” because of something that may look out of the ordinary. This can clog the pipeline and delay diagnosis of critical conditions such as a pneumothorax, or collapsed lung.

But things are changing due to a collaboration between GE Healthcare and the Center for Digital Health Innovation. Dr. Callcut partnered with Dr. John Mongan and Dr. Andrew Taylor, radiologists at UCSF, to create the initial use case and data science approach behind a pneumothorax detection algorithm. UCSF — along with Humber River Hospital in Toronto, St. Luke’s University Hospital in Bethlehem, Pennsylvania, and Mahajan Hospital in New Delhi — worked to replicate the initial work carried out in acquiring and annotating images.

The AI algorithm was developed on Edison, a digital platform built by GE Healthcare to speed up the spread and adoption of AI tools for medical providers and help them make faster and more precise decisions. GE Healthcare brought the latest set of Edison apps to the 104th meeting of the Radiological Society of North America (RSNA), the world’s largest gathering of radiologists, which took place in Chicago in late November.


Just like it did in 2017, AI had a huge presence at the 2018 meeting. The healthcare AI market is expected to reach $6.6 billion by 2021. Nearly four in 10 industry executives say they’re investing in AI, machine learning and predictive analytics, according to a 2017 survey by PricewaterhouseCoopers.

GE Healthcare’s Optima XR240amx with Critical Care Suite* is the first X-ray imaging device with AI embedded in the system. Critical Care Suite provides intelligence to identify critical conditions like pneumothorax at the time of the exam, so clinicians can move questionable cases to the top of the priority list for radiologists.

The team trained the app by feeding it thousands of X-ray images and teaching the software to identify the specific problem in each. “The process is similar to the way you teach a kid to recognize things,” says Keith Bigelow, senior vice president of Edison Portfolio Strategy at GE Healthcare. “Every time you see a photo of a cat you point and say ‘cat.’ It’s the same process as training a deep-learning algorithm.”
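The "point and say 'cat'" loop Bigelow describes is, at its core, supervised learning: show the model a labeled example, and nudge its parameters whenever its prediction is wrong. The sketch below is a hypothetical, deliberately tiny illustration of that loop — a logistic classifier on four-pixel toy "images," not GE Healthcare's actual algorithm or training pipeline.

```python
import math

def predict(weights, pixels):
    """Return the model's probability that an image is 'abnormal'."""
    z = sum(w * p for w, p in zip(weights, pixels))
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, epochs=200, lr=0.5):
    """Repeatedly show labeled examples; nudge weights toward the label.

    This is the same idea as pointing at a photo and saying 'cat':
    each wrong answer shifts the model slightly toward the right one.
    """
    weights = [0.0] * len(examples[0][0])
    for _ in range(epochs):
        for pixels, label in examples:
            err = predict(weights, pixels) - label
            weights = [w - lr * err * p for w, p in zip(weights, pixels)]
    return weights

# Toy data: bright upper pixels mean label 1 ("abnormal"), else 0.
data = [([1.0, 1.0, 0.0, 0.0], 1), ([1.0, 0.8, 0.1, 0.0], 1),
        ([0.0, 0.0, 1.0, 1.0], 0), ([0.1, 0.0, 0.9, 1.0], 0)]
w = train(data)
print(predict(w, [0.9, 1.0, 0.0, 0.0]) > 0.5)  # flagged as abnormal
```

A production chest X-ray model uses deep convolutional networks over millions of pixels rather than a four-weight classifier, but the training signal — labeled examples correcting the model's mistakes — is the same.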

By reading images from four different hospitals around the world, GE Healthcare was able to collect a diverse variety of X-ray scans to train the algorithm. In India, for example, X-ray images are typically taken with the machine farther away from the patient’s body than in the U.S. As a result, patients’ arms are often visible in the X-ray image. “At first, the algorithm was showing errors because it read the space between the arms and the body as a collapsed lung,” Bigelow says. “With better annotation, it learned that that was just air.”

The algorithm is designed to spot pneumothorax with accuracy greater than 0.95 AUC, or area under the curve — a measure of how reliably a model separates positive cases from negative ones. Because it is designed to help prioritize critical cases, the algorithm is considered a medical device by the FDA, which means GE Healthcare needs to obtain FDA clearance before it can be used in hospitals. The submission, known as a 510(k), is currently pending at the FDA, and the technology is not available for sale.
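AUC has a concrete interpretation worth making explicit: it is the probability that a randomly chosen positive case (here, a pneumothorax image) receives a higher score from the model than a randomly chosen negative case, so 0.5 is chance and 1.0 is perfect separation. A minimal sketch of that computation, using the Mann-Whitney pairwise-comparison form rather than any particular library:

```python
def auc(labels, scores):
    """ROC AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs where the positive case scores
    higher than the negative one (ties count as half a win)."""
    pos = [s for label, s in zip(labels, scores) if label == 1]
    neg = [s for label, s in zip(labels, scores) if label == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Two positives, two negatives; 3 of the 4 pairs are ranked correctly.
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

Under this reading, an AUC above 0.95 means the model ranks a true pneumothorax above a normal scan in well over 95 percent of such pairings — which is what makes it useful for reordering a radiologist's queue.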

In addition to Critical Care Suite, GE Healthcare brought to Chicago AIRx, an AI-based tool designed to improve consistency and productivity for MRI brain imaging.* Variability can pop up when technicians use slightly different settings. Consistency between scans is critical, particularly for patients undergoing longitudinal studies, like those suffering from Alzheimer’s or multiple sclerosis. AIRx relies on a pretrained neural network model utilizing deep-learning algorithms and findings from a database of over 36,000 clinical images. It kicks in during a prescan and then tailors the MRI imaging to the particular patient while cutting out redundant steps; those settings can then follow the patient through subsequent scans. This application is also 510(k) pending at the FDA and is not available for sale.

Other GE Healthcare apps are designed to improve computed tomography (CT) and ultrasound imaging.

“This is a very exciting time to be in this space,” says Dr. Callcut. “Our compute capabilities have caught up with our conceptual idea of how to make AI come to life.”

*510(k) pending at U.S. Food and Drug Administration. Not available for sale.
