Human hearts, as most schoolchildren know, sit in the upper left side of the chest. But under the skin, things get murkier. In fact, medical workers occasionally confuse the heart with another organ during an ultrasound scan, even when the probe is in the right location. “Sometimes when people go in for a liver scan, if the sonologist is not trained enough, they mistakenly scan the heart,” says Ratnadeep Paul, lead engineer for augmented and virtual reality at GE Global Research.
While advances in less-expensive pocket-sized ultrasound machines, such as GE Healthcare’s Vscan, make the scanning technology more readily available to far-flung hospitals, there is a deficit in operating skills. A mistake often means the patient has to return for a second scan. “One of the major issues we are facing in developing countries is that there are not enough trained ultrasound sonologists,” Paul says.
That’s why Paul and his team are developing an augmented reality (AR) system to train sonologists and eventually to provide live guidance for ultrasound technicians. They have programmed Microsoft HoloLens glasses to work in conjunction with a scanner. When a trainee wears the glasses, she can see on a dummy where a typical human’s organs are located and what they look like as the sonogram wand moves over the body. The headgear provides directions to specific organs and tells the trainee to move in certain ways in order to properly and completely capture the scan.
“We position virtual organs in the field of view of the operator, overlaid on top of the mannequin,” Paul says. “This allows the technician to position the probe on top of the correct organ. The placement of the virtual organs will be done by live tracking of the patient’s body and using our own proprietary artificial intelligence algorithms.”
Top and above: virtual organs overlaid on the mannequin in the operator’s field of view. GIF credits: GE Reports.
In addition to the HoloLens, Paul says the team is exploring tablet and phone options for viewing the virtual organs. The GE Global Research team is also looking to develop an AR program that works on “pregnant” dummies, so that technicians training in, say, rural Africa can better learn to spot problems in pregnancy ultrasounds.
The AR system, which is not yet on the market, eventually could be used on live patients for training and to guide novice technicians in an emergency. But Paul says that for now they are using dummies exclusively in the research lab for two main reasons: “We can position the dummy in different orientations, and it can sit still for long durations as we test the different equipment.”
The project is still in the research and development phase, but the final goal is to send ultrasound machines and AR headsets to hospitals, medical and nursing schools, and other facilities in both developing and developed countries. “We are currently testing out the feasibility of integrating AR, AI and probe tracking in a single unified system and understanding how or if it can improve the efficiency of the ultrasound technician (especially for less skilled technicians) and reduce the errors in ultrasound imaging,” Paul says.