
In "Race After Technology," Ruha Benjamin exposes how algorithms encode racism, creating a "New Jim Code" beneath tech's neutral facade. Required reading for understanding digital inequality, this groundbreaking work has become central to Black Lives Matter discussions on surveillance and systemic discrimination.
Ruha Benjamin, author of Race After Technology: Abolitionist Tools for the New Jim Code, is an award-winning scholar and professor of African American Studies at Princeton University, renowned for her groundbreaking work on systemic racism embedded in technology and science.
Her book, a critical exploration of how algorithms and digital tools perpetuate racial inequities, merges sociology, ethics, and tech criticism, reflecting her decades of research on innovation’s societal impacts.
Benjamin, who holds a PhD from UC Berkeley and completed postdoctoral fellowships at UCLA and Harvard, founded Princeton’s Ida B. Wells Just Data Lab to reimagine data-driven justice. Her other influential works include Viral Justice: How We Grow the World We Want and Imagination: A Manifesto, which further dissect structural inequality and advocate for liberatory futures.
A recipient of the MacArthur “Genius” Fellowship and the Stowe Prize, Benjamin’s insights have been featured in The Guardian, Nature, and TED Talks. Race After Technology, translated into over 15 languages, is widely taught in universities and cited in tech ethics reforms globally.
Race After Technology examines how emerging technologies like algorithms, facial recognition, and predictive policing reinforce systemic racism through what Benjamin calls the "New Jim Code" – systems that appear neutral but perpetuate discrimination. The book analyzes cases like biased healthcare algorithms and carceral technologies, while offering abolitionist frameworks to create equitable tech.
This book is essential for social justice advocates, tech developers, policymakers, and educators seeking to understand how racism embeds itself in digital systems. It’s particularly relevant for those interested in algorithmic bias, criminal justice reform, or ethical AI development.
Key concepts include the New Jim Code (tech-driven racial hierarchy), discriminatory design (tools that amplify inequity), and abolitionist tools (community-centered solutions). Benjamin argues that "neutral" technologies often automate historical prejudices, such as resume screeners filtering out Black-sounding names or risk-assessment tools targeting marginalized neighborhoods.
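To make that mechanism concrete, here is a minimal, hypothetical Python sketch of how a screener "trained" on biased historical decisions reproduces that bias through a proxy feature such as zip code, even when race is never recorded. The data, zip codes, and scoring rule are all invented for illustration; no real system works exactly this way, and this is not an example from the book itself.

```python
# Hypothetical illustration only: invented data, invented scoring rule.
from collections import defaultdict

# Synthetic hiring history as (zip_code, was_hired) pairs. Zip code can
# act as a proxy for race under residential segregation, even though
# race itself is never recorded as a feature.
history = [
    ("10001", True), ("10001", True), ("10001", False),
    ("60644", False), ("60644", False), ("60644", True),
]

# "Train" by tallying the historical hire rate per zip code.
counts = defaultdict(lambda: [0, 0])  # zip -> [hired, total]
for zip_code, hired in history:
    counts[zip_code][0] += int(hired)
    counts[zip_code][1] += 1

def score(zip_code):
    """Predicted hire probability: just the biased historical rate."""
    hired, total = counts[zip_code]
    return hired / total if total else 0.0

# The "neutral" model faithfully reproduces past discrimination:
print(round(score("10001"), 2))  # 0.67 -- historically favored zip
print(round(score("60644"), 2))  # 0.33 -- historically disfavored zip
```

Nothing in the code mentions race, which is exactly Benjamin's point: the discrimination arrives through the training data and the proxy feature, not through any explicit rule.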
The New Jim Code describes how coded technologies replicate and modernize racial segregation, mirroring the Jim Crow era’s exclusionary practices. Examples include biased loan-approval algorithms and policing tools that disproportionately surveil communities of color.
Benjamin advocates for abolitionist tools – solutions rooted in collective care over carceral control. This includes participatory design processes, transparency in AI training data, and prioritizing marginalized communities’ needs over profit-driven tech development.
“Innovation is more resource than revelation” underscores that tech progress must serve public good, not private gain. “The default setting of technology is justice” challenges developers to actively combat bias rather than assume neutrality.
The book critiques “ethical AI” as insufficient if it doesn’t address structural racism. Benjamin argues ethics committees often prioritize corporate interests, urging instead grassroots accountability models for machine learning systems.
Yes – Ruha Benjamin won a 2024 MacArthur “Genius” Fellowship recognizing this and related scholarship, and the book has become a seminal text in critical technology studies, taught in over 200 universities globally.
Some scholars argue Benjamin’s abolitionist approach lacks concrete implementation roadmaps. However, the book’s 2023 afterword addresses this by highlighting real-world initiatives like the Ida B. Wells Just Data Lab’s community-led AI audits.
With AI now dominating healthcare, education, and hiring, Benjamin’s warnings about encoded bias remain urgent. Recent controversies over ChatGPT’s racial stereotyping and drone surveillance in marginalized neighborhoods validate her critiques.
While Viral Justice focuses on grassroots collective action, Race After Technology provides a structural analysis of tech’s role in oppression. Both emphasize imagination as key to societal transformation but target different intervention points.
Yes – the book influenced companies like Microsoft and Google to adopt equity-focused design principles. Benjamin’s “bias stress tests” are now used to audit hiring algorithms and housing ad targeting systems.
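As a hedged illustration of what such an audit can check, the sketch below computes a generic fairness statistic, the four-fifths (disparate impact) rule. This is a widely used audit heuristic, not Benjamin's actual stress-test protocol, and the group names and decisions are invented:

```python
# Hypothetical audit sketch: a generic fairness check, not Benjamin's method.

def selection_rate(outcomes):
    """Fraction of applicants in a group who were selected."""
    return sum(outcomes) / len(outcomes)

def disparate_impact(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    Under the conventional four-fifths rule, values below 0.8
    are flagged for review."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high

# Invented screener decisions (True = selected) for two groups.
group_a = [True, True, True, False]    # 75% selected
group_b = [True, False, False, False]  # 25% selected

ratio = disparate_impact(group_a, group_b)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.33 -> fails the 0.8 bar
```

A single ratio like this cannot prove fairness, but it is the kind of cheap, repeatable check that makes discriminatory outcomes visible before a system ships.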
Experience the book through the author's voice
Turn knowledge into engaging, example-rich insights
Capture key ideas instantly for faster learning
Enjoy the book in a fun, engaging format
Machines learn to reproduce our existing prejudices.
Algorithms are becoming ever more prominent in the world.
Can robots be racist? They certainly can.
Move Fast and Break Things: what about the people broken in the process?
Algorithms encode human judgments and societal biases.
Break the key ideas of Race After Technology into clear takeaways to understand how seemingly neutral technologies encode and amplify racial bias.
Pull quick, memorable prompts from Race After Technology that highlight its core concepts: the New Jim Code, discriminatory design, and abolitionist tools.

Dive into Race After Technology through vivid stories that turn its lessons on technology and inequality into memorable, actionable moments.
Ask any question, choose a voice, and co-create ideas that truly resonate with you.

Created by Columbia University alumni in San Francisco
"Instead of endless scrolling, I just hit play on BeFreed. It saves me so much time."
"I never knew where to start with nonfiction—BeFreed’s book lists turned into podcasts gave me a clear path."
"Perfect balance between learning and entertainment. Finished ‘Thinking, Fast and Slow’ on my commute this week."
"Crazy how much I learned while walking the dog. BeFreed = small habits → big gains."
"Reading used to feel like a chore. Now it’s just part of my lifestyle."
"Feels effortless compared to reading. I’ve finished 6 books this month already."
"BeFreed turned my guilty doomscrolling into something that feels productive and inspiring."
"BeFreed turned my commute into learning time. 20-min podcasts are perfect for finishing books I never had time for."
"BeFreed replaced my podcast queue. Imagine Spotify for books — that’s it. 🙌"
"It is great for me to learn something from the book without reading it."
"The themed book list podcasts help me connect ideas across authors—like a guided audio journey."
"Makes me feel smarter every time before going to work"

Get the Race After Technology book summary in PDF or EPUB format for free. Print it or read it offline anytime.
Imagine a beauty contest judged not by humans but by artificial intelligence: an algorithm selecting winners based on "objective" standards of beauty. When Beauty AI ran exactly this contest in 2016, the results were shocking: nearly all winners were white. This wasn't a glitch but a revelation of how machines learn to reproduce our existing prejudices. Welcome to the world of the "New Jim Code": technologies that appear neutral or even beneficial while encoding and reproducing racial hierarchies. In an era where algorithms increasingly determine who gets jobs, loans, healthcare, and freedom, these digital systems aren't just reflecting our biases; they're amplifying them at unprecedented scale and speed.

What makes this particularly troubling is how these technologies operate under a veneer of objectivity. When Facebook's algorithms reproduce racist patterns from users' behavior, or when Google's search results reinforce stereotypes, they're not neutral tools but active participants in perpetuating discrimination. The defining characteristic isn't just that they discriminate, but that they do so while claiming objectivity or even benevolence, often under the guise of efficiency and innovation.

Why does this matter? Because Silicon Valley's "Move Fast and Break Things" ethos raises a critical question: what about the people broken in the process? When a woman with a tumor is denied a bank loan because an algorithm flags her as high risk, or when facial recognition systems consistently fail to identify people of color, we see how technology can encode human judgments and societal biases in ways that fundamentally reshape access and opportunity.