As more medical device companies incorporate artificial intelligence, chip designer Nvidia is playing a central role. The company has struck partnerships with top medtech firms including Medtronic, Johnson & Johnson, GE Healthcare and Philips.
The partnerships span a range of technologies. In imaging, Nvidia has partnered with GE Healthcare on autonomous X-ray and ultrasound systems, and is working with Philips to develop foundation models, a type of AI model that can be applied to a wide variety of tasks, for MRI machines. In robotics, J&J’s Monarch platform for bronchoscopy runs on Nvidia’s computing platform. Nvidia also debuted a developer framework for healthcare robotics, called Isaac for Healthcare, in March. The framework uses three computers: one to generate synthetic data that simulates clinical workflows, one to create virtual environments where robots can safely learn skills, and one to deploy applications and process sensor data in real time.
MedTech Dive spoke with David Niewolny, director of business development for healthcare and medical at Nvidia, about the company’s partnerships and the future of AI in medical devices.
This interview has been edited for length and clarity.
MEDTECH DIVE: What’s the current state of AI in medtech?
DAVID NIEWOLNY: In the last 18 months, we’ve seen a really fast progression in healthcare and medtech adopting generative AI. That started with the idea of creation: drafting clinical notes, generating synthetic data for training. Device makers are now looking at how they can begin bringing agentic AI into these applications. You’re seeing digital agents acting as assistants to the healthcare provider or the patient, automating workflows and driving more context-aware support for clinicians.
Then you get to where we see the future. A lot of the AI and innovation is happening in the idea of physical AI, and that brings us into the role of robotics. The easy one to think of is surgical AI, but there’s a huge number of applications. One specific use case I talked about in Taiwan [at GTC Taipei] involves more operational robotics: think of a nurse assistant delivering medication, bringing supplies around a hospital, making sure different areas of the hospital are stocked.
You recently announced a partnership with GE Healthcare around autonomous imaging. How does that work?
That was about completely transforming the way you would do medical imaging in the future. We took two initial use cases, one being the idea of an autonomous X-ray. Think of a future world where you no longer have the X-ray tech. You now have a digital agent that’s essentially checking you in. You walk into a room where there’s another robot, which could be a digital agent, that provides all of the guidance for where to stand, when to hold your breath and how to position yourself. You stand in one spot, and then the machine positions itself.
You can also look at some of the generative AI applications, where the system can hand a doctor a full report on everything it saw in terms of clinical findings.
In that particular case, you’re now expanding access to care because you essentially have fully autonomous systems doing medical imaging.
X-ray is one that we announced, and the other is ultrasound. In each case, GE Healthcare is collaborating with us on a methodology and tools to build these robotic systems.
What opportunities does Nvidia see in AI in medical devices?
Everything is about building an ecosystem. For all of the great work we can do accelerating these applications with AI and now robotics, going from Nvidia directly to a healthcare provider leaves too big of a gap. Providers don’t have the developers in-house to begin building this.
So then you work backwards in that ecosystem, and you realize it’s the medtech companies that are building out all of these devices and solutions. What we’re looking at doing is bringing all of those components of the ecosystem together, building on a common platform. Computers were not mainstream until Windows opened the door for a huge influx of software.
A lot of this learning we took from another industry: automotive. We essentially needed to create a fully simulated environment, train on that simulated data, take those algorithms and move them down to the edge.
We took those learnings and said, “What other areas are ripe for disruption?” Medtech, specifically, has the most to benefit from having a common platform. But at the same time, it has a big hurdle to jump: as we saw it, there wasn’t any single company in that ecosystem that could build out that platform.
I’m hearing more people talk about agentic AI, a type of AI that can perform tasks autonomously. Why are medtech companies interested in this technology?
Agentic AI has a whole range of applications. A partner of ours, Abridge, has leveraged a lot of our technology to do clinical documentation. They’re integrating with major electronic health record (EHR) systems and continuing to add hospital users.
You also have some of these agents working right alongside a surgeon. Now you have an assistant in the room that you can ask questions, that can pull up the patient’s medical records and adjust some of the devices in the room.
One of our partners, Moon Surgical, actually loaded their entire instructions-for-use manual into an agent that a doctor or surgeon can reference for things like setup. Instead of paging through a 1,000-page manual, you can just ask the robot where it should be set up, how to set it up and what the best practices are.
Have you faced concerns from clinicians about AI taking their jobs?
Yes, people do get concerned. The key piece is that in almost every one of these cases, we’re augmenting the team members who are already there. There’s a staffing shortage. This is actually improving care, as opposed to the narrative about taking people’s jobs. We take a lot of those workloads that the staff sees as either busy work or mundane, and we automate them to augment the staff.
There's always going to be a surgeon involved here, but we can do sub-task automation and actually make some of those tasks easier for a surgeon.