The ultrasound technologist scanned the patient’s leg and collected a tiny amount of fluid. The technologist also snapped a photo of the wound with a smartphone and uploaded it to an app.
This new Photo Assistant App is an integral component of GE Healthcare’s most advanced radiology ultrasound system, the LOGIQ E10. The system, which GE launched in February, is powered by sophisticated algorithms and the same artificial intelligence (AI) technology found in advanced gaming. It can process 10 times more data and generate images faster than previous systems. “What this means is that we can create an ultrasound image that is in focus at every pixel,” says Michael Washburn, chief engineer for ultrasound at GE Healthcare.
Washburn says that in the past, a scan of a body part such as the liver might be blurred away from the center. “Now the liver scan is in focus from top to bottom, close to the skin and further in depth,” he says.
The new technology on the LOGIQ E10 gives doctors a better view, and it also reduces the need for repeat ultrasounds. “Even on the heaviest patients, whose images can get really patchy, we get solid, consistent images,” says Dr. John Cronan, radiologist-in-chief at Lifespan, Rhode Island’s largest health system, which was co-founded by Rhode Island Hospital. He says that with this system, the images are so clear that he has been able to see the needle throughout every ultrasound biopsy he performed, something that hasn’t always been the case. “In the past, the needle could go into stealth mode and you would struggle to see it,” he says. “GE has now completely fixed the problem.”
The new machine can connect to the cloud and smart devices, such as tablets and smartphones. The cloud connectivity allows clinicians to share information with one another instantly and maintain archival electronic records — another feature to ensure smooth communication.
With the Photo Assistant App, the photos are transmitted securely along with the clinical images to the doctors, who are often remote, providing them with important information and context.
In the past, the technologist would have to send the ultrasound scan to the doctor along with written notes attempting to explain the precise locations of the swelling and biopsy. “That’s actually a pretty tough thing to describe,” Cronan says. “You need to say something like, ‘8 centimeters above the knee and a bit to the left.’ It can be confusing. A photo is really worth a thousand words.”
In the case of the injured thigh, the surgeon reviewed the black-and-white ultrasound image and the color shot of the thigh before meeting the patient for the first time in the operating room. “With these [dual complementary images], the surgeon knew exactly where to go — even if the swelling went down by the time of the surgery,” Cronan adds.
Cronan says the app has been most helpful in capturing what patients often vaguely refer to as “lumps and bumps.”
“Often we can’t find the lump or bump,” he says. “So now we ask the patient to point to the spot on his or her body and we take a photo of it. It’s very useful for the doctor or surgeon because otherwise they are just seeing the internal black-and-white image from the ultrasound.” With a photo, he says, they can see the red marks or swelling on the skin.
“We don’t even bother marking the body with ink before saphenous vein surgery anymore,” Cronan says, referring to the practice of marking the location of troublesome varicose veins to guide the surgeon during the operation. “We just send the surgeon a photo.”
Cronan also remarked that patient feedback has been overwhelmingly positive. “We were concerned people would be leery of us taking photos, but they’ve been very welcoming.” The key, he says, is uploading the photo immediately to the ultrasound screen where the patients can see it alongside their scanned images in real time. “They feel like someone is validating their concerns — the photo adds a degree of security and provides us with even more information.”
In addition to the Photo Assistant App, the ultrasound system has another digital tool called Remote Control App, which allows clinicians to manipulate the ultrasound’s settings by using their tablet or smartphone as a remote control. It’s useful if the scan is being performed on the opposite side of the body from where the system controls are located.
Over time, Washburn says, the system will collect more data that in turn will facilitate even more advanced algorithms and increasingly help the technician. “Our long-term effort is to make the ultrasound system an amazing assistant to the operator and make sure we’re not missing anything,” Washburn says.