Dive Brief:
- Medical device developers should identify and mitigate biases in the designs of their products, an assistant professor at the University of California, Los Angeles, argues in the journal Science.
- Achuta Kadambi contends medical devices are subject to physical, computational and interpretation biases that disadvantage certain demographic groups.
- Kadambi wants medical device developers to address the problem by providing technical explanations of how they tried to mitigate bias and by considering the inclusion of fairness metrics in their studies.
Dive Insight:
The industry and regulators are aware of the risk of bias in several respects. Last year, FDA and Philips warned that social biases in the data used to train artificial intelligence and machine learning systems could negatively impact patient care.
FDA is also reviewing how skin pigmentation affects the results of pulse oximeters, after a New England Journal of Medicine analysis revived long-running concerns in reporting that "Black patients had nearly three times the frequency of occult hypoxemia that was not detected by pulse oximetry as White patients."
Kadambi lays out three types of bias. In some devices, physical principles are biased against particular groups. The researcher cites pulse oximeters and hip implants as examples of devices with physical biases. Oximeters may fail to detect occult hypoxemia in Black patients. Hip implants designed without considering the female bone structure may lead to inferior outcomes in women.
Other devices may suffer from computational bias, which stems from how data are distributed, processed and computed. For example, an algorithm that learns from a chest X-ray database with a 60:40 split between images from men and women could have computational bias. Kadambi also cites image processing systems that diagnose neurological disorders by detecting blink rate as potentially biased; such systems can struggle to detect blinks in Asian individuals.
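Kadambi's example is conceptual, but the imbalance he describes is easy to make concrete. The sketch below is a minimal illustration, not anything from the Science article: it audits a hypothetical 60:40 training set and derives inverse-frequency sample weights, one common way developers try to keep an underrepresented group from being underweighted during training.

```python
# Illustrative sketch only: auditing a training set's demographic composition
# and computing inverse-frequency sample weights. The "M"/"F" labels and the
# 60:40 split are assumptions chosen to mirror Kadambi's hypothetical example.
from collections import Counter

# Hypothetical chest X-ray training labels: 60% male, 40% female images.
training_sex_labels = ["M"] * 600 + ["F"] * 400

counts = Counter(training_sex_labels)  # {"M": 600, "F": 400}
total = sum(counts.values())

# Report the imbalance that could introduce computational bias.
for group, n in counts.items():
    print(f"{group}: {n} images ({n / total:.0%} of training data)")

# One common mitigation: weight each sample inversely to its group's frequency
# so both groups contribute equally to the training loss.
weights = {group: total / (len(counts) * n) for group, n in counts.items()}
print(weights)  # {"M": 0.833..., "F": 1.25}
```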
Finally, Kadambi discusses interpretation bias through the example of spirometers. Black and Asian people are assumed to have lower lung capacity than white people, leading physicians to apply "correction factors" to the data. To Kadambi, the approach is based on a questionable assumption and could lead to the prioritization of a white person in treatment plans.
"In the spirometer example, the device designers conflated a racial group's healthy lung capacity with their average lung capacity. This assumption does not account for socioeconomic distinctions across race: Individuals who live near motorways exhibit reduced lung capacity, and these individuals are often from disadvantaged ethnic groups. The spirometer is just one of several examples of systemic racism in medicine," Kadambi wrote.
Kadambi wants the industry to do more, comparing the pace of change in medtech unfavorably to the actions of the AI community. The assistant professor proposes the inclusion of a "fairness" statement in studies of medical devices. The statements would describe efforts to mitigate bias and propose technical compensations for cases in which those efforts fall short. Kadambi also suggests the use of fairness metrics that show how consistently a device performs across demographics.
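A fairness metric of that kind could be as simple as reporting a device's performance for each demographic group alongside the gap between the best- and worst-served groups. The sketch below is a hypothetical illustration of the idea, not a metric taken from Kadambi's proposal; the groups and counts are invented.

```python
# Minimal sketch of a per-group fairness check (groups and numbers are
# hypothetical): compute a device's sensitivity for each demographic group,
# then report the worst-case gap between groups.
results_by_group = {
    # group: (true positives, false negatives) for some hypothetical device
    "Group A": (92, 8),
    "Group B": (78, 22),
}

sensitivities = {
    group: tp / (tp + fn) for group, (tp, fn) in results_by_group.items()
}
for group, value in sensitivities.items():
    print(f"{group}: sensitivity = {value:.2f}")

# A simple fairness summary: how far apart are the best and worst groups?
gap = max(sensitivities.values()) - min(sensitivities.values())
print(f"Worst-case sensitivity gap across groups: {gap:.2f}")
```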
Minimizing bias may come at the cost of a product's peak performance. Kadambi identifies how to manage that trade-off as one of several open questions facing the industry. Other open questions include how to encode positive biases into the design of medical devices.
Amid last year's protests over racial justice, AdvaMed put out a statement declaring its intention to do more to combat racial inequities, though it focused mostly on access to care rather than bias. Its commitments include efforts to "highlight the need for minority population involvement in clinical research studies and diversity among clinical trial investigators."