Experts call for action on potentially biased medical devices

Experts are calling for action on medical devices prone to unfair biases, including blood oxygen monitors and certain artificial intelligence (AI)-enabled tools, to prevent harm to ethnic minorities and women.

The report details the findings of the Independent Review of Equity in Medical Devices, which looked at the extent and impact of ethnic bias and other unfairness in the performance of equipment commonly used in the NHS.

It focused on optical devices such as pulse oximeters, AI-enabled devices and certain genomic applications, where evidence suggested substantial potential for harm.

The panel found evidence that pulse oximeters (blood oxygen monitors) – widely used during the Covid-19 pandemic – can overestimate the amount of oxygen in the blood for people with darker skin tones.

This can lead to a delay in treatment if dangerously low oxygen levels are missed.

The experts say they did not specifically look at the use of these devices during the pandemic but, because a large number of people had very low oxygen levels, “it is likely that the inaccuracy was high at that time”.

Daniel Martin, professor of perioperative and intensive care medicine at Peninsula Medical School, University of Plymouth, said: “We can only say that the harm and the inaccuracy are related and not causative.

“But I think it’s a pretty strong indication that there’s potential for harm, especially during Covid when oxygen levels are so low.”

The review makes several recommendations regarding the devices, including advising patients to watch out for other symptoms such as shortness of breath, chest pain and a fast heart rate.

It also suggests that researchers and manufacturers should work to produce devices that are not biased by skin tone.

On AI-enabled devices, the review found evidence of potential biases against women, ethnic minorities and disadvantaged socio-economic groups.

It highlights the underdiagnosis of skin cancer in people with darker skin when AI-enabled devices are used.

The report suggests that this is a result of machines being trained primarily on images of lighter skin tones.

There is also a long-standing problem of underdiagnosis of heart conditions in women, which AI algorithms in medical devices could make worse, the panel suggests.

The University of Liverpool’s Professor Dame Margaret Whitehead, chair of the review, said: “Advances in AI in medical devices could have huge benefits, but they could also cause harm through inherent bias against certain groups of the population, especially women, people from ethnic minorities and disadvantaged socio-economic groups.

“Our review shows how biases and injustices that already exist in society can be unwittingly incorporated at every stage of the lifecycle of AI-enabled medical devices, and then amplified in algorithm development and machine learning.

“Our recommendations, therefore, require system-wide action from many stakeholders and must now be implemented as a matter of priority with the full support of the Government.”

Among its recommendations, the report calls for renewed efforts to increase skin tone diversity in the medical imaging data banks used in the development and testing of optical dermatology devices, including in clinical trials, and to improve the tools for measuring skin tone that are incorporated into optical devices.

Enitan Carrol, professor of paediatric infection at the University of Liverpool, said: “The NHS has a responsibility to maintain the highest standards of safety and effectiveness of medical devices in use for patients.

“We found no evidence of actual harm in the NHS, but only the potential for racial and ethnic bias in the operation of some medical devices commonly used in the NHS.”

Panellist Professor Chris Holmes warned that the Government needs to understand how AI, including programs such as ChatGPT, will affect clinical practice and public health.

He said: “We are calling on the Government to appoint an expert panel including clinical, technology and healthcare leaders, patient and public representatives and industry to assess the potential unintended consequences of the AI revolution in healthcare.

“Now is the time to seize the opportunity to incorporate action on equity in medical devices into overarching global AI safety strategies.”

The review was launched in 2022 by Sir Sajid Javid, the then secretary of state for health and social care.

He said: “Health outcomes should not be affected by the colour of a person’s skin or where they are from, but the pandemic has exposed too many of these inequalities.

“I hope this review and its important recommendations will help deliver much-needed change.”

In response to the report, Health Minister Andrew Stephenson said: “I am extremely grateful to Professor Dame Margaret Whitehead for undertaking this important review.

“Ensuring that the health care system works for everyone, regardless of ethnicity, is critical to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”

The Department of Health and Social Care said that significant action is already being taken to overcome potential inequalities in the performance of medical devices.

This includes the Medicines and Healthcare products Regulatory Agency (MHRA) now requiring applications for the approval of new medical devices to describe how they will address bias.

NHS guidance has been updated to highlight the potential limitations of pulse oximeter devices on patients with darker skin tones.

The Government will also work with the MHRA to ensure that medical devices are safe for patients, regardless of their background, while allowing more innovative products to be brought to the UK market.
