
In "Race After Technology," Ruha Benjamin exposes how algorithms encode racism, creating a "New Jim Code" beneath tech's neutral facade. Required reading for understanding digital inequality, this groundbreaking work has become central to Black Lives Matter discussions on surveillance and systemic discrimination.
Experience the book in the author's voice
Turn knowledge into engaging, example-rich insights
Capture key ideas quickly for fast learning
Enjoy the book in a fun and engaging way
Imagine a beauty contest judged not by humans but by artificial intelligence, an algorithm selecting winners based on "objective" standards of beauty. When Beauty AI ran exactly this contest in 2016, the results were shocking: nearly all the winners were white. This wasn't a glitch but a revelation of how machines learn to reproduce our existing prejudices.

Welcome to the world of the "New Jim Code": technologies that appear neutral or even beneficial while encoding and reproducing racial hierarchies. In an era where algorithms increasingly determine who gets jobs, loans, healthcare, and freedom, these digital systems aren't just reflecting our biases; they're amplifying them at unprecedented scale and speed.

What makes this particularly troubling is how these technologies operate under a veneer of objectivity. When Facebook's algorithms reproduce racist patterns from users' behavior, or when Google's search results reinforce stereotypes, they're not neutral tools but active participants in perpetuating discrimination. The defining characteristic isn't just that they discriminate, but that they do so while claiming objectivity or even benevolence, often under the guise of efficiency and innovation.

Why does this matter? Because Silicon Valley's "Move Fast and Break Things" ethos raises a critical question: what about the people broken in the process? When a woman with a tumor is denied a bank loan because an algorithm flags her as high risk, or when facial recognition systems consistently fail to identify people of color, we see how technology can encode human judgments and societal biases in ways that fundamentally reshape access and opportunity.
Break down the core ideas of Race After Technology into easy-to-understand points that show how seemingly neutral technologies encode and reproduce racial bias.
Condense Race After Technology into quick memory cues that highlight its core ideas on coded bias, surveillance, and digital discrimination.

Experience Race After Technology through vivid storytelling that turns its lessons into memorable, applicable moments.
Ask anything, choose a voice, and co-create insights that truly resonate.

Built in San Francisco by Columbia University alumni
"Instead of endless scrolling, I just hit play on BeFreed. It saves me so much time."
"I never knew where to start with nonfiction—BeFreed’s book lists turned into podcasts gave me a clear path."
"Perfect balance between learning and entertainment. Finished ‘Thinking, Fast and Slow’ on my commute this week."
"Crazy how much I learned while walking the dog. BeFreed = small habits → big gains."
"Reading used to feel like a chore. Now it’s just part of my lifestyle."
"Feels effortless compared to reading. I’ve finished 6 books this month already."
"BeFreed turned my guilty doomscrolling into something that feels productive and inspiring."
"BeFreed turned my commute into learning time. 20-min podcasts are perfect for finishing books I never had time for."
"BeFreed replaced my podcast queue. Imagine Spotify for books — that’s it. 🙌"
"It is great for me to learn something from the book without reading it."
"The themed book list podcasts help me connect ideas across authors—like a guided audio journey."
"Makes me feel smarter every time before going to work"

Get the Race After Technology summary as a free PDF or EPUB. Print it or read it offline anytime.