
In "Race After Technology," Ruha Benjamin exposes how algorithms encode racism, creating a "New Jim Code" beneath tech's neutral facade. Required reading for understanding digital inequality, this groundbreaking work has become central to Black Lives Matter discussions on surveillance and systemic discrimination.
Feel the book through the author's voice
Turn knowledge into engaging, example-rich insights
Capture key ideas in a flash for fast learning
Enjoy the book in a fun and engaging way
Break down key ideas from Race After Technology into bite-sized takeaways to understand how seemingly neutral technologies encode and amplify racial bias.
Distill Race After Technology into rapid-fire memory cues that highlight Benjamin's core ideas on the New Jim Code, algorithmic bias, and digital surveillance.

Experience Race After Technology through vivid storytelling that turns Benjamin's lessons on coded discrimination into moments you'll remember and apply.
Ask anything, pick the voice, and co-create insights that truly resonate with you.

Built in San Francisco by Columbia University alumni

Get the Race After Technology summary as a free PDF or EPUB. Print it or read offline anytime.
Imagine a beauty contest judged not by humans but by artificial intelligence: an algorithm selecting winners based on "objective" standards of beauty. When Beauty AI ran exactly this contest in 2016, the results were shocking. Nearly all the winners were white. This wasn't a glitch but a revelation of how machines learn to reproduce our existing prejudices. Welcome to the world of the "New Jim Code": technologies that appear neutral or even beneficial while encoding and reproducing racial hierarchies.

In an era where algorithms increasingly determine who gets jobs, loans, healthcare, and freedom, these digital systems aren't just reflecting our biases; they're amplifying them at unprecedented scale and speed. What makes this particularly troubling is how these technologies operate under a veneer of objectivity. When Facebook's algorithms reproduce racist patterns from users' behavior, or when Google's search results reinforce stereotypes, they're not neutral tools but active participants in perpetuating discrimination. The defining characteristic isn't just that they discriminate, but that they do so while claiming objectivity or even benevolence, often under the guise of efficiency and innovation.

Why does this matter? Because Silicon Valley's "Move Fast and Break Things" ethos raises a critical question: what about the people broken in the process? When a woman with a tumor is denied a bank loan because an algorithm flags her as high risk, or when facial recognition systems consistently fail to identify people of color, we see how technology can encode human judgments and societal biases in ways that fundamentally reshape access and opportunity.