Understanding Machine Bias

-
when people try to make tech more equitable, they usually start with “fairness”; while that is a step in the right direction, it is not a large enough one
-
consider two kids splitting a cookie; the mathematically fair answer is to split it in half, but that plan falls apart in the real world, as one half will nearly always be larger than the other
-
this situation can be a springboard to further negotiation; maybe they agree that taking the smaller piece in exchange for choosing what to watch on TV is a fair trade
-
there is a key distinction here between social fairness and mathematical fairness, and computers can only calculate the latter
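a toy Python sketch of that distinction (all weights and the tolerance parameter are invented for illustration): a program can compute an equal split and verify a numeric condition, but the negotiated, social part of fairness never appears in the code

```python
# Toy illustration: mathematical fairness is computable, social fairness is not.
# All weights and the tolerance value here are made up for this example.

def split_evenly(cookie_weight_g: float) -> tuple[float, float]:
    """The 'mathematically fair' answer: two exactly equal shares."""
    half = cookie_weight_g / 2
    return half, half

def shares_are_equal(share_a_g: float, share_b_g: float, tolerance_g: float = 0.0) -> bool:
    """All a computer can check is a numeric condition; whether the kids
    accept the outcome (e.g., smaller piece traded for TV selection) is a
    social negotiation that lives entirely outside this function."""
    return abs(share_a_g - share_b_g) <= tolerance_g

print(split_evenly(10.0))          # (5.0, 5.0) -- the idealized split
print(shares_are_equal(5.3, 4.7))  # False -- a real break is never exact
```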
-
people continue to insist on using technology to solve more problems because of technochauvinism, a kind of bias that considers computational solutions superior to all other solutions
-
this bias contains an a priori assumption that computers are better than humans; this is really a claim that the people who make and program computers are better than other humans
-
technochauvinism is usually accompanied by equally bogus notions like “computers make neutral decisions because their decisions are based on math”
-
this is patently untrue, as computers constantly fail at making social decisions
-
“The next time you run into a person who insists unnecessarily on using technology to solve a complex social problem, please tell them about cookie division and the difference between social and mathematical fairness.”
-
equality is not the same as equity or justice
-
we cannot adequately address the shortcomings of our algorithmic systems until we acknowledge that racism, sexism, and ableism are not glitches; glitches are temporary and inconsequential, but these biases are baked into the very core of these systems
-
look into: Safiya Noble and Ruha Benjamin
-
sometimes we can make the tech less discriminatory; sometimes we can't and shouldn't use it at all; sometimes the solution is somewhere in between
-
consider the case of the racist soap dispenser, which first reached public prominence in a 2017 viral video: a dark-skinned man and a light-skinned man both tried to use an automatic soap dispenser, and it refused to work for the dark-skinned man until he covered his hand with a white paper towel, demonstrating that the dispenser only responds to light colors
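a hypothetical model of that mechanism; real dispenser firmware is not public, so the threshold and reflectance values below are invented, but the point stands: a fixed reflectance threshold tuned only on light skin excludes dark skin without any explicitly “racist” line of code

```python
# Hypothetical model of an optical soap dispenser: an emitter shines infrared
# light at the hand and a photodetector measures how much bounces back; soap
# is dispensed only if the reflected fraction clears a fixed threshold.
# Threshold and reflectance values are invented; real firmware will differ.

DISPENSE_THRESHOLD = 0.5  # minimum fraction of emitted light reflected back

def should_dispense(reflectance: float) -> bool:
    # Lighter surfaces reflect more light, so a threshold calibrated only
    # on light-skinned testers silently fails for darker skin.
    return reflectance >= DISPENSE_THRESHOLD

print(should_dispense(0.65))  # lighter skin      -> True
print(should_dispense(0.30))  # darker skin       -> False
print(should_dispense(0.85))  # white paper towel -> True
```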
-
every kind of sensor technology, from facial recognition to automatic faucets, is similarly discriminatory
-
this problem goes back to film tech; until the 1970s, Kodak tuned its film-development machines using Shirley cards, which contained an image of a light-skinned woman surrounded by bright primary colors
-
darker skin tones were only added to Shirley cards after furniture manufacturers complained that their walnut and mahogany furniture looked muddy in color photographs; in other words, Kodak improved rendering of darker tones only as a side effect of an unrelated decision, because it stood to lose money from corporate clients
-
we need to start by recognizing the role that unconscious bias plays in the technological world; it seems most likely that the soap dispenser (for example) was designed by a small, homogeneous group of light-skinned people who tested it on themselves and assumed that it would work similarly for everyone else
-
“They probably thought, like many engineers, that because they were using sensors and math and electricity, they were making something 'neutral'. They were wrong.”
-
quoting Nikole Hannah-Jones: “Black Americans are amongst the most astute political and social observers of American power because our survival has and still depends on it.”
-
look into: Artificial Unintelligence (Broussard)
-
while many computer scientists have come around to the idea of making tech “more ethical” or “fairer”, this is not enough; we need to audit all of our tech to find out how it is racist, sexist, or ableist
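a minimal sketch of one concrete audit step, using fabricated records purely to show the computation: disaggregate a system's accuracy by group instead of reporting a single average, since an aggregate number can hide exactly the disparities described above

```python
# Minimal audit sketch: report accuracy per group, not just in aggregate.
# The (group, correct?) records below are fabricated to show the computation;
# in a real audit they would come from logged decisions joined with ground truth.
from collections import defaultdict

results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", True),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

tally: dict[str, list[int]] = defaultdict(lambda: [0, 0])  # group -> [correct, total]
for group, correct in results:
    tally[group][0] += int(correct)
    tally[group][1] += 1

overall = sum(c for c, _ in tally.values()) / sum(t for _, t in tally.values())
print(f"overall: {overall:.0%}")              # 75% -- looks fine in aggregate
for group, (correct, total) in tally.items():
    print(f"{group}: {correct / total:.0%}")  # 100% vs 50% -- the gap an audit exposes
```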
-
here's something I hadn't thought about: if a city moves its public alert system to social media, it cuts off access for those who are Blind or who otherwise lack access to the technology
-
“We should not cede control of essential civic functions to these tech systems, nor should we claim they are 'better' or 'more innovative' until and unless those technical systems work for every person regardless of skin color, class, age, gender, and ability.”
-
we can start by improving diversity among engineering teams; Google's annual diversity report showed that only 3% of its employees were Black, that 2% of its new hires that year were Black women, and that Black, Latinx, and Native American employees left the company at the highest rates; this is typical of (indeed the best among) major tech companies
-
instead of technochauvinism, we need to use the right tool for the task, whether or not it's a computer
-
look into: Algorithms of Oppression (Noble)