Racial Bias and Healthcare Decision-Making Algorithms

A recent study found that a common healthcare risk-prediction algorithm used in hospitals across the country demonstrates racial bias, both reflecting real-world attitudes and negatively affecting the level of care provided to black patients. The algorithm predicts which patients will benefit from extra medical care, and researchers found that it significantly underestimates the needs of black patients.

Many hospitals use an algorithm developed by the health services company Optum to predict healthcare costs and identify which patients would benefit from extra, customized care to help them stay on their medications and avoid additional hospital care. However, a recent study entitled “Dissecting racial bias in an algorithm used to manage the health of populations” shows that the algorithm is biased in favor of white patients over black patients.

Although we tend to assume that computer programs lack human biases, they still reflect the real world, which means they can reproduce existing racial bias. In this case, the study revealed that the algorithm did indeed show a racial bias because it relies on a faulty metric to determine medical need.

How the algorithm works

Because actual humans design computer programs, and these programs pull data compiled by humans, it should not come as a surprise that racial biases are built right into the programming. This is because of implicit bias in the healthcare system.

Brian Powers, an internal medicine physician at Brigham and Women’s Hospital and co-author of the study, talked to The Verge about what their research uncovered. He explained the following.

The algorithm the study analyzed is designed to flag patients with complex healthcare needs, and it uses the cost of care as a proxy for how sick those patients are. The algorithm itself was unbiased at what it was trained to do, but using predicted cost to flag patients is biased, because white patients are given costlier treatments than black patients. Looking at the algorithm alone would not have identified the issue. “What we uncovered wasn’t in the algorithm and its ability to do what it was trained to do, but in the implementation, and the choice to use cost as a proxy,” he said.
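To make the proxy problem concrete, here is a minimal, hypothetical Python sketch. It is not Optum’s actual model; the patient names, condition counts, dollar amounts, and the top_fraction cutoff are all invented for illustration. It simply flags the patients with the highest predicted cost, so two patients with identical medical need but different historical spending are treated differently.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    chronic_conditions: int       # rough stand-in for true medical need
    predicted_annual_cost: float  # what the model is actually trained to predict

def flag_for_extra_care(patients, top_fraction):
    """Flag the patients with the highest predicted cost, the proxy the
    study says was used for how sick a patient is."""
    ranked = sorted(patients, key=lambda p: p.predicted_annual_cost, reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    return ranked[:cutoff]

# Two hypothetical patients with identical medical need but different
# historical spending.
patients = [
    Patient("patient_a", chronic_conditions=4, predicted_annual_cost=9000.0),
    Patient("patient_b", chronic_conditions=4, predicted_annual_cost=7200.0),
]

flagged = flag_for_extra_care(patients, top_fraction=0.5)
print([p.name for p in flagged])  # ['patient_a']: equal need, unequal flagging
```

In this sketch, nothing in the ranking step itself treats patients differently by race; the unequal outcome comes entirely from the decision to rank on spending, which is the distinction Powers draws between the algorithm and its implementation.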

Why it favors white patients over black patients

A simple algorithm can’t address the root causes of systemic inequities. Cost of care is not an ideal quantitative tool because of inequities that start well before a patient even enters the hospital. Using cost as a metric is not “colorblind,” as black patients tend to access healthcare less often than white, wealthier patients. In fact, black patients spend an average of $1,800 less per year on healthcare than white patients. The algorithm interprets this to mean that black patients must be healthier because they spend less on healthcare, leading it to recommend less additional medical care, or none at all.
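As a rough, hypothetical illustration of that interpretation: the dollar figures and the care cutoff below are invented, and only the roughly $1,800 annual spending gap comes from the study.

```python
# Hypothetical numbers; only the ~$1,800 annual spending gap is from the study.
annual_spending_gap = 1_800

white_patient_cost = 7_500                                     # hypothetical annual spending
black_patient_cost = white_patient_cost - annual_spending_gap  # same health, lower spending

care_threshold = 7_000                                         # hypothetical cost cutoff for extra care

for label, cost in [("white patient", white_patient_cost),
                    ("black patient", black_patient_cost)]:
    gets_extra_care = cost >= care_threshold
    print(f"{label}: predicted cost ${cost:,} -> extra care recommended: {gets_extra_care}")

# The white patient clears the cutoff; the black patient does not,
# even though their actual medical needs are identical in this example.
```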

And even once black patients are able to access healthcare, they face a variety of roadblocks. As Ashish Jha, the K.T. Li Professor of Global Health at the Harvard T.H. Chan School of Public Health, notes: “We already know that the health care system disproportionately mismanages and mistreats black patients and other people of color. If you build those biases into the algorithms and don’t deal with it, you’re going to make those biases more pervasive and more systematic and people won’t even know where they are coming from.”

At McGowan, Hood, Felder & Phillips, LLC, we work to protect patients injured due to disparities in the healthcare system. We will stand up for you and help ensure justice is served. To see how one of our South Carolina medical malpractice attorneys can help, schedule your free consultation today by calling 803-327-7800, or reach out to us through our contact page.