Algorithms Don't Kill People. People Kill People.
In 2018, Adrian McGonigal lost his Medicaid coverage. He was working full-time, but failed to report his hours through a new, mandatory online portal he didn’t know he had to use.
The system cut him off without a single human phone call. He only found out when his pharmacist told him his life-sustaining medication was no longer covered.
Without his meds, the 40-year-old poultry worker with severe COPD landed in the hospital and lost his job. The hospital bills piled up. His health spiraled, and the damage was permanent.
Six years later, in November 2024, Adrian McGonigal died of a heart attack. He was 46.
He died because of a spreadsheet in Arkansas: one designed by a consultant, approved by a politician, and deployed under a policy instituted by the first Trump Administration.
A cruel, immoral system that decided saving money to fund tax cuts for the rich mattered more than his lungs.
The computer program didn’t make a mistake; it did exactly what it was designed to do.
We talk about AI as if it’s the main character of our time.
We say it’s coming for our jobs, that it will destroy art, that it will rule the world. But we’re telling the wrong story.
AI doesn’t vote, pass laws, or hoard wealth. People do.
The real story behind the AI hype is about the systems we’ve already built, and how they are being supercharged with automated tools of control.
The question isn’t “What will AI do to us?”
The question is, “What will this system do with AI, and what will we allow it to do?”
The Digital Poorhouse
What happened to Adrian McGonigal was a design choice, not an accident.
In her book “Automating Inequality,” Virginia Eubanks calls this the “digital poorhouse”: a vast, invisible architecture of automated systems designed to manage, discipline, and punish the poor.
Far from neutral, these systems are built with the values of their creators baked in.
To their builders, the poor are suspect, their needs are a drain on resources, and they must be managed with maximum efficiency and minimum human contact.
No empathy.
In Indiana, a contract with IBM and ACS to automate welfare eligibility led to one million denials.
In Arkansas, an algorithm cut home-care hours for the disabled by nearly 40% overnight.
In Texas, a system built by Deloitte repeatedly rejected a single father’s benefits application due to its own internal errors, leaving him and his toddler homeless.
In each case, the story is the same: a human need is met with an automated denial. A life is thrown into crisis by a line of code.
When questions are asked, the state points to the algorithm, and the vendor points to a trade secret.
The machine becomes the perfect shield for human decisions.
The Algorithmic Landlord
For years, landlords have been using software called YieldStar, made by a company called RealPage, to set rents.
YieldStar collects private, real-time leasing data from supposedly competing landlords and then “recommends” the highest possible price the market will bear.
Landlords who use it are told to follow the recommendations, to avoid “leaving money on the table.”
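To make the mechanism concrete, here is a deliberately toy sketch in Python. None of this is RealPage’s actual code; the function, data fields, and numbers are all hypothetical. It only illustrates the pattern described in the lawsuits: pool rivals’ confidential lease data, anchor the recommendation to the top of it, and never suggest less than the landlord already charges.

```python
# A toy illustration of pooled-data rent "recommendation".
# Hypothetical names and numbers throughout; this is NOT RealPage's code.

def recommend_rent(my_unit, competitor_leases):
    """Suggest the highest rent the pooled data says the market will bear."""
    # Step 1: pool confidential, real-time lease prices from
    # nominally competing landlords.
    comparable = [
        lease["rent"]
        for lease in competitor_leases
        if lease["beds"] == my_unit["beds"]
    ]
    if not comparable:
        return my_unit["current_rent"]

    # Step 2: anchor to the top of what competitors actually charge,
    # rather than a price discovered by independent competition.
    ceiling = max(comparable)

    # Step 3: never "leave money on the table" -- only ratchet upward.
    return max(my_unit["current_rent"], ceiling)

competitors = [
    {"beds": 2, "rent": 1850},
    {"beds": 2, "rent": 1990},
    {"beds": 2, "rent": 2100},
    {"beds": 1, "rent": 1400},
]
print(recommend_rent({"beds": 2, "current_rent": 1800}, competitors))  # 2100
```

The design choice is step 3: the loop only ratchets one way. When most large landlords in a city feed the same pool and follow the same recommendations, prices stop being discovered and start being coordinated.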
They have replaced the free market with algorithmic collusion. A modern-day rental cartel in plain sight. It’s been happening for years, driving up rents in cities across the country.
Landlords called it “innovation.” Tenants, “extortion.” Both were right!
For years, RealPage and the landlords who used its software denied any wrongdoing. They were just using a tool to make better decisions.
But the reality of soaring rents became too stark to ignore.
On November 24, 2025, RealPage settled a massive lawsuit with the Department of Justice, agreeing to stop using confidential data to set prices.
Greystar, America’s largest landlord, had already agreed to stop using the software.
The settlement is a crack in the wall. An admission that these systems are not just “efficient”; they are powerful. And that power is being used against ordinary people.
The Power You Have
These systems aren’t abstract. They touch everyone’s life, including yours. They decide whether you can afford to live in your city, whether your family gets the care they need, whether you have a job tomorrow.
Behind these systems are owners who profit from them, investors who fund them, managers who run them, developers who build them, and politicians who protect them.
The question is: are you going to let them keep getting away with it?
The rise of these systems is meeting resistance.
Lawyers like Kevin De Liban at Legal Aid of Arkansas sued the state and forced it to stop using the algorithm that was cutting care for the disabled.
Journalists at ProPublica and The Markup exposed the RealPage cartel and the failures of predictive policing.
Organizations like the Algorithmic Justice League are auditing these systems for bias and demanding accountability.
People like you and me are starting to ask the right questions.
Not “Is this AI good or bad?” but “Who benefits from this system?”
Not “Is this technology neutral?” but “What values are embedded in its code?”
The machine is not in charge. We are.
We must refuse to accept any story that says otherwise.
The algorithm is just code. The real villains are the people who build, fund, run, and protect it for profit at the expense of the rest of us.
It’s time we stop tolerating it.
Image: AI Generated
Sources:
With new work requirement, thousands lose Medicaid coverage in Arkansas | PBS NewsHour
As Republicans push work requirements in Medicaid, Arkansas offers a cautionary tale
Obituary | Adrian Thomas McGonigal of Bentonville | Funeralmation
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor | Virginia Eubanks
What Happens When an Algorithm Cuts Your Health Care | Slate
Suit Filed Over Computer Program Making Medicaid Cuts | Arkansas Justice
When Algorithms Think You’re a “Risk,” It’s Almost Impossible to Prove Them Wrong | Jacobin
Rent Going Up? One Company’s Algorithm Could Be Why | ProPublica
RealPage Agrees to Settle Federal Rent-Collusion Case | The New York Times
DOJ: Greystar to Stop Using “Anticompetitive” Rent-Setting Software | ProPublica
Predictive Policing Software Terrible at Predicting Crimes | The Markup
Automated Dismissal Decisions, Data Protection and The Law of Unfair Dismissal | Philippa Collins
We Saw Medicaid Work Requirements Up Close. You Don’t Want This Chaos.
Code Without Compassion, Part 1: Fired by Bot — The Amazon Flex Case
Justice Department to Settle Lawsuit Over Apartment Rental Pricing