An algorithm that predicts deadly infections is often flawed


Sepsis, a complication of infection, is the number one killer in American hospitals. So it’s no surprise that more than 100 health care systems use an early warning system offered by Epic Systems, the leading provider of electronic health records in the United States. The system fires alerts based on a proprietary formula that continuously monitors a patient’s test results for signs of the disease.

But a new study using data from nearly 30,000 patients at University of Michigan hospitals suggests that Epic’s system performs poorly. The authors say it missed two-thirds of sepsis cases, rarely found cases that medical staff had not already noticed, and frequently sounded false alarms.

Karandeep Singh, an assistant professor at the University of Michigan who led the study, says the findings illustrate a broader problem with the proprietary algorithms increasingly used in health care. “They are used very widely, and yet there are very few publications on these models,” says Singh. “To me, it’s shocking.”

The study was published Monday in JAMA Internal Medicine. An Epic spokesperson disputed the study’s findings, saying the company’s system has “helped clinicians save thousands of lives.”

Epic’s isn’t the first widely used health algorithm to raise concerns that technology purported to improve health care is not delivering, or is even actively harmful. In 2019, a system used on millions of patients to prioritize access to special care for people with complex needs was found to lowball the needs of Black patients compared with white patients. That finding prompted some Democratic senators to ask federal regulators to investigate bias in health algorithms. A study published in April found that statistical models used to predict suicide risk in mental health patients performed well for white and Asian patients but poorly for Black patients.

The way sepsis pervades hospital departments has made it a prime target for algorithmic aids for medical staff. Sepsis guidelines from the Centers for Disease Control and Prevention encourage health care providers to use electronic medical records for monitoring and prediction. Epic has several competitors offering commercial alert systems, and some U.S. research hospitals have built their own tools.

Automated sepsis warnings have enormous potential, Singh says, because the main symptoms of the condition, such as low blood pressure, can have other causes, making it difficult for staff to detect early. Starting sepsis treatment, such as antibiotics, an hour sooner can make a big difference to patient survival. Hospital administrators often take a special interest in sepsis response, in part because it contributes to US government hospital ratings.

Singh runs a lab in Michigan to research applications of machine learning to patient care. He became interested in Epic’s sepsis alert system after being invited to chair a university health system committee established to oversee the uses of machine learning.

As Singh learned more about the tools used in Michigan and other health care systems, he became concerned that they mostly came from vendors who disclosed little about how they work or perform. His own health system was licensed to use Epic’s sepsis prediction model, which the company told customers was highly accurate. But there had been no independent validation of its performance.

Singh and his University of Michigan colleagues tested Epic’s prediction model on records for nearly 30,000 patients spanning almost 40,000 hospitalizations in 2018 and 2019. The researchers noted how often Epic’s algorithm flagged people who developed sepsis as defined by the CDC and the Centers for Medicare and Medicaid Services. They then compared the alerts the system would have triggered with the sepsis treatments recorded by staff, who did not see Epic’s sepsis alerts for the patients included in the study.

The researchers say their results suggest Epic’s system wouldn’t make a hospital much better at detecting sepsis and could overwhelm staff with unnecessary alerts. The company’s algorithm failed to identify two-thirds of the roughly 2,500 sepsis cases in the Michigan data. It flagged only 183 patients who developed sepsis and had not already received timely treatment from staff.
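The headline sensitivity figure follows from simple arithmetic on the reported counts. The sketch below is illustrative only, using the approximate numbers quoted in this article rather than the study’s raw data:

```python
# Illustrative sensitivity arithmetic using approximate counts from the
# article (not the study's raw data).

total_cases = 2500                   # roughly 2,500 sepsis cases in the data
missed = round(total_cases * 2 / 3)  # the algorithm missed about two-thirds
caught = total_cases - missed

sensitivity = caught / total_cases   # share of true cases the alerts caught
print(f"cases caught: {caught}, sensitivity: {sensitivity:.0%}")
```

On these numbers, the model catches only about a third of true sepsis cases, which is the figure the study’s authors highlight.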


