Review: Race After Technology: Abolitionist Tools for the New Jim Code
Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin
My rating: 5 of 5 stars
At the center of Benjamin's book is a very simple premise: technology is not neutral, and when we pretend otherwise, it amplifies the biases and inequities of the society that produces it. Benjamin then shows, across a variety of spaces and contexts, how this proves true time and again. In particular, she examines how racism creeps into technological structures both as a result of unquestioned bias in creators and programmers (e.g. the failure of facial recognition to recognize darker-skinned faces) and as a direct result of historical racism that becomes culturally encoded in the physical world and then unquestioningly carried over into the digital one (the disproportionate use of facial-recognition programs on Brown and Black faces). One approach she uses regularly throughout the book is to show readers how the discourse on technology is often framed as "objective"--that is, if a machine says something, then it's a clear, rational decision that cannot really be questioned. This rhetorical tactic hides the fact that the algorithms guiding the technology are often concealed from public view, leaving individuals unable to challenge or even examine the underlying assumptions that went into the technology in the first place. In this way, she draws strong connections between supposedly objective technology and the color-blind ideology that so often reinforces racist norms. Benjamin's critique is powerful to read as she moves from examples throughout the US to the world at large, exploring a range of technologies (A.I., DNA tests, financial systems, and more), and the book is a must-read for anyone looking to better understand the challenges embedded in a technological society that blindly trusts tools over people.
View all my reviews
Did you enjoy this read? Let me know your thoughts down below, or feel free to browse around and check out some of my other posts! You might also want to keep up to date with my blog by signing up for email updates.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.