
In "Race After Technology," Ruha Benjamin exposes how algorithms encode racism, creating a "New Jim Code" beneath tech's neutral facade. Required reading for understanding digital inequality, this groundbreaking work has become central to Black Lives Matter discussions on surveillance and systemic discrimination.
Ruha Benjamin, author of Race After Technology: Abolitionist Tools for the New Jim Code, is an award-winning scholar and professor of African American Studies at Princeton University, renowned for her pioneering work on systemic racism embedded in technology and science.
Her book, a critical exploration of how algorithms and digital tools perpetuate racial inequities, merges sociology, ethics, and tech criticism, reflecting her decades of research on innovation’s societal impacts.
Benjamin, who holds a PhD from UC Berkeley and completed postdoctoral fellowships at UCLA and Harvard, founded Princeton’s Ida B. Wells Just Data Lab to reimagine data-driven justice. Her other influential works include Viral Justice: How We Grow the World We Want and Imagination: A Manifesto, which further dissect structural inequality and advocate for liberatory futures.
A recipient of the MacArthur “Genius” Fellowship and the Stowe Prize, Benjamin has been featured in The Guardian, Nature, and TED Talks. Race After Technology, translated into more than 15 languages, is widely taught in universities and cited in tech ethics reforms globally.
Race After Technology examines how emerging technologies like algorithms, facial recognition, and predictive policing reinforce systemic racism through what Benjamin calls the "New Jim Code" – systems that appear neutral but perpetuate discrimination. The book analyzes cases like biased healthcare algorithms and carceral technologies, while offering abolitionist frameworks to create equitable tech.
This book is essential for social justice advocates, tech developers, policymakers, and educators seeking to understand how racism embeds itself in digital systems. It’s particularly relevant for those interested in algorithmic bias, criminal justice reform, or ethical AI development.
Key concepts include the New Jim Code (tech-driven racial hierarchy), discriminatory design (tools that amplify inequity), and abolitionist tools (community-centered solutions). Benjamin argues that "neutral" technologies often automate historical prejudices, such as resume screeners filtering out Black-sounding names or risk-assessment tools targeting marginalized neighborhoods.
The New Jim Code describes how coded technologies replicate and modernize racial segregation, echoing the exclusionary practices of the Jim Crow era. Examples include biased loan-approval algorithms and policing tools that disproportionately surveil communities of color.
Benjamin advocates for abolitionist tools – solutions rooted in collective care over carceral control. This includes participatory design processes, transparency in AI training data, and prioritizing marginalized communities’ needs over profit-driven tech development.
Benjamin argues that innovation should be treated as a shared resource serving the public good rather than private gain. She also challenges the assumption that technology is neutral by default, urging developers to actively design for justice rather than passively encode bias.
The book critiques “ethical AI” as insufficient if it doesn’t address structural racism. Benjamin argues ethics committees often prioritize corporate interests, urging instead grassroots accountability models for machine learning systems.
Yes – Ruha Benjamin received a 2024 MacArthur “Genius” Fellowship, and the book has become a seminal text in critical technology studies, reportedly taught in over 200 universities worldwide.
Some scholars argue Benjamin’s abolitionist approach lacks concrete implementation roadmaps. However, the book’s 2023 afterword addresses this by highlighting real-world initiatives like the Ida B. Wells Just Data Lab’s community-led AI audits.
With AI now dominating healthcare, education, and hiring, Benjamin’s warnings about encoded bias remain urgent. Recent controversies over ChatGPT’s racial stereotyping and drone surveillance in marginalized neighborhoods validate her critiques.
While Viral Justice focuses on grassroots collective action, Race After Technology provides a structural analysis of tech’s role in oppression. Both emphasize imagination as key to societal transformation but target different intervention points.
Yes – the book influenced companies like Microsoft and Google to adopt equity-focused design principles. Benjamin’s “bias stress tests” are now used to audit hiring algorithms and housing ad targeting systems.
Feel the book through the author's voice
Transform knowledge into engaging, example-rich insights
Capture key ideas in an instant for quick learning
Enjoy the book in a fun and engaging way
Machines learn to reproduce our existing prejudices.
Algorithms increasingly govern high-stakes decisions in everyday life.
Can robots be racist? They certainly can.
Move Fast and Break Things: what about the people broken in the process?
Algorithms encode human judgments and societal biases.
Break down the key ideas of Race After Technology into easy-to-understand points that explain how seemingly neutral technologies encode racial bias.
Distill Race After Technology into quick memory aids that highlight its core concepts: the New Jim Code, discriminatory design, and abolitionist tools.

Experience Race After Technology through vivid storytelling that turns its lessons into moments you'll remember and apply.
Ask anything, choose the voice, and co-create insights that truly resonate with you.

Created by Columbia University alumni in San Francisco
"Instead of endless scrolling, I just hit play on BeFreed. It saves me so much time."
"I never knew where to start with nonfiction—BeFreed’s book lists turned into podcasts gave me a clear path."
"Perfect balance between learning and entertainment. Finished ‘Thinking, Fast and Slow’ on my commute this week."
"Crazy how much I learned while walking the dog. BeFreed = small habits → big gains."
"Reading used to feel like a chore. Now it’s just part of my lifestyle."
"Feels effortless compared to reading. I’ve finished 6 books this month already."
"BeFreed turned my guilty doomscrolling into something that feels productive and inspiring."
"BeFreed turned my commute into learning time. 20-min podcasts are perfect for finishing books I never had time for."
"BeFreed replaced my podcast queue. Imagine Spotify for books — that’s it. 🙌"
"It is great for me to learn something from the book without reading it."
"The themed book list podcasts help me connect ideas across authors—like a guided audio journey."
"Makes me feel smarter every time before going to work"

Get the summary of Race After Technology as a free PDF or EPUB. Print it or read offline anytime.
Imagine a beauty contest judged not by humans but by artificial intelligence: an algorithm selecting winners based on "objective" standards of beauty. When Beauty AI ran exactly this contest in 2016, the results were shocking: nearly all the winners were white. This wasn't a glitch but a revelation of how machines learn to reproduce our existing prejudices.

Welcome to the world of the "New Jim Code": technologies that appear neutral or even beneficial while encoding and reproducing racial hierarchies. In an era where algorithms increasingly determine who gets jobs, loans, healthcare, and freedom, these digital systems aren't just reflecting our biases; they're amplifying them at unprecedented scale and speed.

What makes this particularly troubling is how these technologies operate under a veneer of objectivity. When Facebook's algorithms reproduce racist patterns from users' behavior, or when Google's search results reinforce stereotypes, they're not neutral tools but active participants in perpetuating discrimination. The defining characteristic isn't just that they discriminate, but that they do so while claiming objectivity or even benevolence, often under the guise of efficiency and innovation.

Why does this matter? Because Silicon Valley's "Move Fast and Break Things" ethos raises a critical question: what about the people broken in the process? When a woman with a tumor is denied a bank loan because an algorithm flags her as high risk, or when facial recognition systems consistently fail to identify people of color, we see how technology can encode human judgments and societal biases in ways that fundamentally reshape access and opportunity.