I have used previous posts to show that blind trust in medical algorithms can lead to adverse patient outcomes. In the case of the sepsis "bundle," hospitals and the Centers for Medicare and Medicaid Services continue to mandate its use despite a lack of objective evidence that it is effective (see our August 20, 2019 post). It may even be harmful.
I’m somewhat surprised that there hasn’t been more press coverage of an article published October 24th in Science showing that decision-making software used in US hospitals is systematically discriminatory. The results are shocking and emblematic of the erosion of humanistic values in Medicine for the sake of operational efficiency. Or perhaps this is to be expected when patient care decisions are made by those not at the patient’s bedside.
Any article published in the journal Science has withstood the rigors of intensive peer review. For those unfamiliar with the “impact factor” of scientific journals, only Nature (which wrote the accompanying summary article linked below) ranks higher in scientific prestige. Published since 1880, Science is considered one of the world’s top scientific journals. This article would be shocking published anywhere; its appearance in Science amplifies its critical importance.
Unbeknownst to most patients, the U.S. health care system uses commercial algorithms to guide health decisions. When Ziad Obermeyer and his team at UC Berkeley ran routine statistical checks on data they received from a large hospital, they were surprised to find that people who self-identified as African American were generally assigned lower risk scores than equally sick white people. Note the qualifying descriptor of these statistical checks: “routine.” The company that developed the algorithm, Optum of Eden Prairie, Minnesota, repeated these analyses and verified Obermeyer’s results.
This faulty computer program affected the health outcomes and resource allocations of millions of patients. By correcting the flawed software, Obermeyer increased the percentage of African American patients receiving additional help from 17.7% to 46.5%, roughly 2.6 times as many. The authors noted that the bias arose because the algorithm predicts health care costs rather than illness: because less money is spent on Black patients with the same level of need, the algorithm concluded they were healthier than equally sick white patients.
Wait, what? The software predicts cost rather than illness? Why would it do that?
The answer is obvious. Optum is a for-profit healthcare conglomerate. As Healthcare Finance magazine described it in 2018, “Optum is a behemoth in the healthcare industry, reaping profits for parent company UnitedHealth Group by having virtually every payer and over 5,000 hospitals in its portfolio.” Optum reported $101.3 billion in revenue for 2018, up $10.1 billion from the $91.2 billion reported in 2017. Optum’s CEO, Andrew Witty, made $21.2 million in 2018. By comparison, Obermeyer makes $102,827 annually as an acting associate professor (his salary is publicly available because he is a California state employee). It shouldn’t be a surprise that healthcare software written by a company generating massive profits would consider cost first and patient care second.
Obermeyer and his team then collaborated with the company to find variables other than health care costs that could be used to calculate a person’s medical needs. He did this, by the way, as an unpaid consultant, although Optum can clearly afford to pay him for his expertise. They then repeated the analysis and were able to reduce the racial bias by 84%.
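To make the mechanism concrete, here is a small toy simulation I put together. It is my own sketch, not the study’s data or Optum’s actual model, and every number in it (the share of Black patients, the dollar amounts, the size of the spending gap, the 3% cutoff) is an assumption chosen only to show the effect. Two groups of patients are equally sick, but less is spent on one of them; ranking patients by predicted cost then flags the under-spent group far less often, while ranking by a measure of illness does not.

```python
# Toy simulation (my own illustration, not the study's data or Optum's model).
# If the "risk score" is predicted cost, and less money is spent on Black
# patients at the same level of illness, equally sick Black patients get lower
# scores and are flagged for extra help less often. Scoring on a health
# measure instead removes that gap in this toy data.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
black = rng.random(n) < 0.12            # group label (12% share is an assumption)
illness = rng.poisson(2.0, n)           # chronic conditions; same distribution in both groups

# Assumed access gap: at equal illness, spending on Black patients is ~30% lower.
expected_cost = illness * np.where(black, 3_500.0, 5_000.0)
cost_score = expected_cost + rng.normal(0, 800, n)    # score = predicted future cost
health_score = illness + rng.normal(0, 0.3, n)        # score = predicted illness burden

def flag_rates(score, top=0.03):
    """Flag the top-scoring patients for the extra-help program (the 3% cutoff
    is my assumption) and return the flag rate for Black vs. white patients."""
    flagged = score >= np.quantile(score, 1 - top)
    return round(flagged[black].mean(), 4), round(flagged[~black].mean(), 4)

print("label = cost:   Black, white flag rates:", flag_rates(cost_score))    # large gap
print("label = health: Black, white flag rates:", flag_rates(health_score))  # roughly equal
```

In rough terms, that is the fix the researchers proposed: keep the data, but change what the algorithm is asked to predict.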
Even then, Optum could not be gracious and admit its error. “We appreciate the researchers’ work,” Optum said in a statement. But the company added that it considered the researchers’ conclusion “misleading.” “The cost model is just one of many data elements intended to be used to select patients for clinical engagement programs, including, most importantly, the doctor’s expertise.”
Sorry, what? The doctor’s expertise? Where does that get plugged into the algorithm? It doesn’t. As with the sepsis bundle, we blindly hurtle forward in the wrong direction, with increasingly ill patients denied appropriate resources because clinical decisions are being made by poorly conceived programming code.
And who suffers? Patients.