Embedded biases are very difficult to undo. Programmers make thousands of choices under the hood of smart tech that the rest of us can't see. Automation is increasingly used to make vital, life-changing decisions about people, which makes the choices that programmers (overwhelmingly white men) make, based on their own experiences and backgrounds, all the more consequential.
For instance, smart tech is increasingly used to screen mortgage applications. It is illegal to ask questions about, say, race, in these applications, so programmers create “proxies,” or substitute questions, to build a profile of an applicant. A zip code, for instance, could serve as a proxy for “safe neighborhood.” Safe generally means white, particularly for white programmers drawing on their own life experiences. In addition, enormous amounts of data are needed to train smart tech systems. An automated mortgage screening process will draw on the huge data sets of past mortgage application decisions. Black people were historically denied mortgages at astonishing rates and are therefore woefully underrepresented in these data sets. In this way, seemingly benign programming decisions, mundane proxies, and historic data create embedded biases against people of color that are difficult to see from the outside.
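The dynamic above can be sketched in a few lines of code. Everything in this sketch is hypothetical: the zip codes, the numbers, and the naive scoring rule are invented for illustration. The point is only that a system trained on nothing but past decisions will faithfully reproduce whatever pattern, including discrimination, those decisions contain.

```python
# A toy "mortgage screener" that learns only from historical decisions.
# All zip codes and figures are invented for illustration.
from collections import defaultdict

# Historical decisions: (zip_code, approved). Applicants from the
# historically redlined zip code were overwhelmingly denied.
history = (
    [("60614", True)] * 90 + [("60614", False)] * 10
    + [("60620", True)] * 10 + [("60620", False)] * 90
)

def train(records):
    """Compute the historical approval rate per zip code."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for zip_code, ok in records:
        total[zip_code] += 1
        approved[zip_code] += ok
    return {z: approved[z] / total[z] for z in total}

model = train(history)

# Two otherwise-identical applicants, differing only in zip code,
# get very different scores: the proxy carries the old bias forward.
print(model["60614"])  # 0.9 -- likely approved
print(model["60620"])  # 0.1 -- likely denied
```

Nothing in this code mentions race, yet the zip-code proxy alone is enough to carry the historical pattern into every new decision.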
Once bias is baked into smart tech, it stays there forever and becomes self-reinforcing over time. Nancy Smyth, former dean of the School of Social Work at the University of Buffalo, State University of New York, says, “Code is racist because society is racist.” 13
In her book Race After Technology, Ruha Benjamin describes a “New Jim Code,” a play on the Jim Crow laws that powered decades of institutional racism in the post-Reconstruction South. She writes, “The animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled—magnified and buried under layers of digital denial.” 14 She later writes, “… the outsourcing of human decisions is, at once, the insourcing of coded inequity.” We will explore the ethical use of smart tech throughout this book.
The book is divided into three parts. Part I, “Understanding and Using Artificial Intelligence,” focuses on the leadership needed to use smart tech well, the history of smart tech, and key issues for using it that organizations need to be prepared for: the need to stay deeply human-centered in planning and use of smart tech, the enormous amounts of clean data necessary to power the systems, and the ethical concerns and considerations necessary to ensure bias is mitigated.
Part II, “The Smart Nonprofit: Use-Case Examples and Management,” focuses on the applications of smart tech within organizations. It begins with a chapter on how to carefully and thoughtfully select a specific application of smart tech. Chapters on the use of smart tech for program delivery, fundraising, back-office operations, and philanthropy follow.
Part III, “Where We Go from Here,” concludes with a look at the probable future of nonprofits and social change in an automated world.
We wrote this book to help organizations prepare to benefit from automation and avoid mistakes. Smart tech can help nonprofits become more efficient and use that dividend of time to build better relationships with stakeholders inside and outside of the organization. Smart tech can also help nonprofits leverage data to better understand program impact. We want nonprofits to harness this technology for good, which requires organizational leaders to understand the limitations of smart tech and apply it strategically, ethically, and with responsibility.
Ultimately, the purpose of using smart tech shouldn't be to make organizations go faster but to make your organization better at solving problems and at caring for people inside and outside the organization in more humane ways. This will only happen when everyone in the organization has the information, tools, and opportunity to shape their own lives and futures. That's the true mark of success for a smart nonprofit and what we will share in the rest of the book.
1. Sydney Brownstone, “This data tool helps homeless people get housing. If you're white, your chances are even better,” The Seattle Times (October 29, 2019), https://www.seattletimes.com/seattle-news/homeless/this-data-tool-helps-homeless-people-get-housing-if-youre-white-your-chances-are-even-better/.
2. Leah Post, author email interview on July 9, 2021.
3. Heather L. McCauley, ScD, and Taylor Reid, BA, Michigan State University, “Assessing Vulnerability, Prioritizing Risk: The Limitations of the VI-SPDAT for Survivors of Domestic & Sexual Violence” (July 20, 2020), https://dcadv.org/file_download/inline/b1bb3b28-8039-4590-aa1d-daaef5fb6546.
4. Iain De Jong, author interview on June 25, 2021.
5. Jake Maguire, author interview on June 30, 2021.
6. Steve MacLaughlin, “The Impact of AI on Philanthropy,” Engage Podcast Series (October 20, 2020), https://nofilternonprofit.blackbaud.com/raise-engage-podcast-series/episode-167-the-impact-of-ai-on-philanthropy.
7. Chris Arsenault, “Using AI, Canadian city predicts who might become homeless,” Reuters (October 15, 2020), https://www.reuters.com/article/us-canada-homelessness-tech-idCAKBN27013Y.
8. Heejae Lim, author interview on August 4, 2021.
9. Google AI Impact Challenge, https://ai.google/social-good/impact-challenge/.
10. TalkingPoints website, https://talkingpts.org/talkingpoints-increases-parent-engagement-for-student-success/415/.
11. Nicholas Carr, The Glass Cage (New York: W. W. Norton & Company; September 8, 2015).
12. Cathy O'Neil, Naked Capitalism blog, August 26, 2017, https://www.nakedcapitalism.com/2017/08/data-scientist-cathy-oneil-algorithms-opinions-embedded-code.html.
13. Nancy Smyth, author interview on May 25, 2021.
14. Ruha Benjamin, Race After Technology (New York: Wiley; June 17, 2019).