Ars Technica
Hospitals are using algorithms to decide how much healthcare a patient needs. The algorithm assigns each patient a risk score, and the higher the score, the more medical attention the patient receives. The trouble with these algorithms is that they are too complex and opaque for anyone to easily understand how they arrive at a score.
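In rough terms the setup looks something like this sketch (illustrative Python; the cutoff value and the risk_model function are assumptions for the example, not details from the article):

    def refer_for_extra_care(patients, risk_model, cutoff=0.5):
        # Higher risk score -> more medical attention; patients above
        # the cutoff get referred into a high-risk care program.
        # The 0.5 cutoff is invented for the example.
        return [p for p in patients if risk_model(p) >= cutoff]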
Researchers discovered that the algorithm at one hospital was giving lower risk scores to Black patients: white patients with the same symptoms would get higher scores and more care, even though race is not an input to the algorithm. Because Black patients miss out on the extra care, they face a greater risk of something going wrong, perhaps requiring ER visits or hospital stays.
The problem is that the algorithm does not actually identify health risks - it identifies costs. Poorer patients with worse insurance tend to run up lower costs, because they can't afford to pay and their insurance won't cover it. As a result they get lower scores and worse treatment, and are more likely to face extra expenses they may not be able to afford.
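A toy example of how that proxy goes wrong (illustrative Python; the numbers are made up to show the mechanism, not taken from the study):

    # Two patients with the same underlying health need.
    # One has had less money spent on their care (worse insurance, less access),
    # so a model trained to predict cost scores them as lower "risk".
    past_costs = {"patient_a": 12000, "patient_b": 4000}   # dollars, invented
    true_need  = {"patient_a": "high", "patient_b": "high"}

    def cost_based_risk_score(patient, max_cost=20000):
        # The prediction target is past spending, not sickness.
        return past_costs[patient] / max_cost

    for p in ("patient_a", "patient_b"):
        print(p, round(cost_based_risk_score(p), 2), "true need:", true_need[p])
    # patient_a 0.6 true need: high
    # patient_b 0.2 true need: high  <- misses the cutoff for extra care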
While neither the hospital nor the software is identified, the article suggests that the algorithm is one written by an insurer.