It is the right moment for everyone who builds systems of control to reflect. It’s the right moment to pierce those layers of abstraction that allow you to get through each day, and question why it’s so financially lucrative for the system you’re building to exist.
Because there is no abstraction as leaky as a man waiting outside your hotel at 6:45 in the morning with a gun and murderous intent.
Amazingly written article; the last line gave me chills.
This one is also a bit chilling:
A data scientist can tell most of someone’s life story given their zip code, and we try not to think too hard about why that’s always the most predictive feature in the model.
I didn’t recognise the ‘10 seconds of human consideration’ claim. I found this ProPublica report on it: How Cigna Saves Millions by Having Its Doctors Reject Claims Without Reading Them
In it, a former Cigna doctor says, “It takes all of 10 seconds to do 50 at a time.” ProPublica also claim to have seen documents for a two-month period that put the average at 1.2 seconds per review. That’s using a specific review system that processed 300,000 claims over that period.
They don’t mention whether other claims were processed by different methods, but even so, the OP article seems generous with that claim.
I don’t know what the “90%-error-rate AI” claim is about though. It’d be nice if the sources were actually cited.
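A quick sanity check on those figures (all numbers are the ones quoted above from the ProPublica report; the comparison of the two rates is my own back-of-envelope arithmetic, not from the article):

```python
# Figures quoted from the ProPublica report on Cigna's review system.
claims = 300_000          # claims processed over the two-month period
avg_seconds_per_claim = 1.2

# Total reviewer time implied by the documents:
total_hours = claims * avg_seconds_per_claim / 3600
print(total_hours)        # 100.0 hours of review for 300,000 claims

# The doctor's "10 seconds to do 50 at a time" implies an even faster rate:
batch_seconds_per_claim = 10 / 50
print(batch_seconds_per_claim)  # 0.2 seconds per claim
```

Either way, both figures are orders of magnitude short of any meaningful human consideration per claim, which is the point the OP article is making.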
It’s the right moment to pierce those layers of abstraction that allow you to get through each day, and question why it’s so financially lucrative for the system you’re building to exist.
I’m glad someone said it, because this thought popped into my head yesterday. I’ve been thinking about the consequences of my system: whether it really benefits its users, and also who it affects indirectly.
So far, I’m ok with it. There is part of it that adds some safety for the business, the users, and people affected indirectly. But it still has a profit motive and that’s the uncomfortable part.
Edit: I should clarify that I’m talking about my software system, not the U.S. healthcare system like the author is. It’s nowhere near as lucrative as making money off of people literally suffering from life. But the author mentioned how the CEOs see numbers, not people. If the numbers my system collects end up hurting people, that’s what I was reflecting on.
First time I’ve heard of the “Leaky Abstraction” concept; it makes a lot of sense. Good metaphor too.
The Hacker News crowd uses this phrase every other sentence so it was almost humorous to see it used here. I thought this was a shitpost