Fresh off its January spinoff into an independent company, GE HealthCare announced plans earlier this month to acquire Caption Health, a developer of artificial intelligence applications to interpret ultrasound exams. The imaging systems maker holds 42 AI-enabled device authorizations from the Food and Drug Administration.
Karley Yoder, chief digital officer of GE HealthCare’s $3 billion ultrasound business, who will lead the integration of Caption Health, talked to MedTech Dive about the company’s AI strategy, which is focused on health system efficiency and access, health outcomes, and workflow integration.
This interview has been edited and condensed for ease of reading.
MEDTECH DIVE: How is GE HealthCare incorporating artificial intelligence into its imaging systems?
YODER: A framework that I use sometimes when talking about GE HealthCare’s AI strategy is upstream and downstream. Upstream is AI that we embed deeply in devices to help create data faster, with more consistency, less user dependence and higher quality, something only device manufacturers are uniquely positioned to do. And then downstream is: what do you do with data once it’s been generated, and how do you apply AI to that data to get it to the right clinician in the fastest amount of time, to start doing clinical decision support and connecting dots across care pathways that maybe couldn’t be connected previously?
How do you transform the mounds of data into tools that can help patients?
It’s not a secret that only 3% of healthcare data is used to generate actual insights. We have 97% of our healthcare data sitting untouched, unleveraged, unharnessed. When we think about our Edison [Digital Health] Platform, it is: how do we bring a solution to hospitals that allows us to connect disparate data sources, stringing them together, to [provide] insights that really drive toward precision care and align with key patient care pathways? A great example here is Critical Care Suite, one of the very first AI solutions we brought to market in our X-ray business. We’ve built AI that is able to see if you have a collapsed lung – pneumothorax – and then it takes that finding and puts it right at the top of the queue for a radiologist to review, so they get to critical cases first. It completely flips the first-in, first-out paradigm on its head: the most critical case is what you look at first.
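The re-prioritization Yoder describes can be sketched as a priority queue in which AI-flagged critical studies jump ahead of the default first-in, first-out order. This is a minimal illustration of the idea, not GE HealthCare's implementation; the class and study names are hypothetical.

```python
import heapq
import itertools

class ReadingWorklist:
    """Toy radiology worklist: AI-flagged critical studies outrank FIFO order."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves FIFO among equal priorities

    def add(self, study_id, critical=False):
        # Priority 0 = AI flagged critical (e.g. suspected pneumothorax); 1 = routine.
        priority = 0 if critical else 1
        heapq.heappush(self._heap, (priority, next(self._counter), study_id))

    def next_study(self):
        # Pop the highest-priority (lowest-numbered) study for the radiologist.
        return heapq.heappop(self._heap)[2]

wl = ReadingWorklist()
wl.add("chest-xray-001")
wl.add("chest-xray-002")
wl.add("chest-xray-003", critical=True)  # AI detects a possible collapsed lung
print(wl.next_study())  # → chest-xray-003: the flagged study is read first
```

Routine studies still come off the queue in arrival order once no critical cases remain.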
What do your customers want AI to do for them?
Especially in this decision support arena, it is very much an ecosystem game. There are, at last count, over 200 VC-backed startups in this space, building solutions for how you interpret what is created in medical data. A great majority of those are actually in the imaging space because of the relevance of deep learning to image-based data. At GE HealthCare, we’re going to build some of those, but we’re also really focused on how we can be the best partner. We launched something about two years ago called the Edison Orchestrator, which works just like the conductor of an orchestra, guiding multiple different instruments. We have built a layer that can guide some AI that GE has built, but that also sits in front of and on top of what our partners bring to market. It doesn’t matter to customers if the data science is great if it breaks their workflow. In the healthcare space, for AI to really be impactful, it needs to be built for problems that matter, but then delivered in a way that is native and invisible to existing workflows.
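The orchestration layer Yoder describes, one front door that routes each study to whichever AI applications apply, whether GE-built or a partner's, can be sketched as a simple dispatch registry. This is a hypothetical illustration of the pattern, not the Edison Orchestrator's actual API; all names are invented.

```python
class AIOrchestrator:
    """Toy dispatcher: routes a study to every registered AI app that matches
    its modality, regardless of vendor, so results land in one workflow."""
    def __init__(self):
        self._apps = []  # list of (vendor, modality, callable)

    def register(self, vendor, modality, app):
        self._apps.append((vendor, modality, app))

    def run(self, study):
        # Fan the study out to all applicable apps; collect results per vendor.
        results = {}
        for vendor, modality, app in self._apps:
            if modality == study["modality"]:
                results[vendor] = app(study)
        return results

def ge_pneumothorax(study):   # stand-in for a first-party model
    return {"finding": "pneumothorax", "flag": True}

def partner_fracture(study):  # stand-in for a partner's model
    return {"finding": "rib fracture", "flag": False}

orch = AIOrchestrator()
orch.register("GE", "xray", ge_pneumothorax)
orch.register("PartnerCo", "xray", partner_fracture)
orch.register("PartnerCo", "ct", lambda s: {"finding": "none"})

results = orch.run({"modality": "xray", "id": "study-42"})
print(sorted(results))  # → ['GE', 'PartnerCo']: both x-ray apps ran, the CT app did not
```

The point of the pattern is that clinicians interact with one worklist while the layer behind it mixes first- and third-party models.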
How will the acquisition of Caption Health advance AI development in GE HealthCare’s ultrasound business?
Ultrasound is incredibly user-dependent. It’s not like an MRI or a CT where you lie in a single spot and a machine runs over you. It is completely dependent on the skill set of the operator who is positioning a probe. A very fast-growing part of the ultrasound market – and this is a big part of the reason why we did the Caption acquisition – is point of care ultrasound: doctors’ offices, clinics, or even the emergency department in the hospital. You’ve got an emergency doctor, and someone comes into the ER in shock. You do an ultrasound to get a sense of what’s going on, but you don’t do an ultrasound all the time. So you need really intuitive tools for ultrasound to be effective in these point of care spaces. It’s a very different workflow than a sonographer who does 20-week ultrasound appointments day-in and day-out. [For] these new users to be competent with ultrasound, scan guidance – or the ability to help them get to the right image – is pivotal. If you can take the incredible equipment and hardware that GE is making – the handheld ultrasound, Vscan Air, and the point of care ultrasound, our Venue family – and couple it with AI like what Caption has around cardiac scan guidance, you really open up ultrasound as a trusted device for users who maybe couldn’t have accessed it in the past.
How else is AI helping doctors do their jobs better?
Access and efficiency are two of the most important trends that AI is unlocking. In our women’s health ultrasound suite, we have applied AI that really drives efficiency for the user of the device. These tools automate measurements, they automate segmentation, they take out steps. And so when you think about AI, one of the things it can really do is shift clinicians or caregivers from the process back to the patient. It takes precious moments out of doing technology tasks and shifts them back to interacting with the patient, and that’s one of the things I’m most excited about AI unlocking.
Is AI being used to diagnose disease?
A lot of the AI in this space, and certainly what GE HealthCare is focused on building, stops short of diagnosis and is more of a guide to a clinician. True diagnosis with AI is going to take a few different AI solutions working together well. I don’t think AI has really cracked that yet. That’s a future wave. Right now, it’s really more about workflow efficiency and clinical decision support for a radiologist or a cardiologist.
What else might the future hold for AI applications?
The future of AI is going to be most powerful when AI is interacting not just with imaging data, not just with EMR data, not just with the connected care data coming from the home, but all of those pieces woven together, so you get a true 360-degree view of the patient. I think that's the future that we can unlock. And it takes both unlocking data from its current silos and building sophisticated AI that can interact with that data.