All of us have had to deal with faulty or troublesome algorithms: weirdly programmed retargeting ads, flight changes driven by algorithms rather than our preferences, odd pre-selected recommendations on Netflix or Amazon, the résumé that went nowhere because an algorithm decided instead of a human.
This is annoying, maybe even maddening. But it’s nothing compared to what Tammy Dobbs experienced. She is an Arkansas resident with cerebral palsy whose Medicaid-provided home care was cut from 56 to 32 hours a week. The reason? An algorithm.
“The algorithm that upended Dobbs’ life fits comfortably, when printed, on about 20 pages. Although it’s difficult to decipher without expert help, the algorithm computes about 60 descriptions, symptoms, and ailments — fever, weight loss, ventilator use — into categories, each one corresponding to a number of hours of home care.
Like many industries, health care has turned to automation for efficiency. The algorithm used in Arkansas is one of a family of tools, called “instruments,” that attempt to provide a snapshot of a person’s health in order to inform decisions about care everywhere from nursing homes to hospitals and prisons.”
In the end, Legal Aid successfully sued the State of Arkansas, and the algorithmic allocation system was ruled unconstitutional.
This is just the beginning, and we need to fight back now. With advances in Machine Learning and AI, and with algorithms of exponentially growing complexity, humans will increasingly point to the algorithm and say: “Don’t blame me. I just work here.”
The Facebook defense doesn’t cut it. Algorithms are not God-given, nor are they the only possible solution to a problem. Algorithms are choices, based on preferences that humans set.
Our world is being transformed by algorithms. The way we see the world, the news we read, the culture we experience: what reaches us through our screens is decided by algorithms that people programmed.
So far, bad algorithms are nothing more than a nuisance for most of us, and Tammy Dobbs’ case is a terrible exception. If we don’t act now, that exception might become the rule.