In this All Tech is Human July 15, 2020 webinar, Mutale Nkonde (CEO of AI for the People and Stanford Digital Society Lab fellow) and Charlton McIlwain (author of Black Software: The Internet & Racial Justice, From the AfroNet to Black Lives Matter, and Vice Provost for Faculty Engagement and Development at NYU) discussed the challenges and opportunities in implementing anti-racist technology.
Key takeaways outlined by All Tech is Human
It is important for technology developers and designers to have a deeper understanding of racial history and dynamics. Mutale and Charlton explain that technologists cannot develop anti-racist technology if they do not understand race or refuse to talk about the connection between race and technology. They recommend that, as a starting point toward meaningful progress, technologists be required to learn and become fluent in Black history and critical race theory so that they understand the impact of what they are creating.
In order to drive large-scale change, we should look at how we incentivize companies. Charlton discusses the difficulty of thinking about anti-racist technology in an environment where there is an existing incentive to use technology to criminalize people of color. He points to the IBM and NYPD partnership to show how law enforcement systems are already investing in these (often faulty) technologies. Mutale explains that the first step in changing this paradigm is to incentivize companies to (1) test their technology and algorithms to ensure that they are not racist or propagating racial biases and (2) develop anti-racist technology. She calls for the creation of third-party agencies or legislation to ensure that technology companies adhere to "algorithmic accountability."
When discussing the use of facial recognition, Mutale asks us to "gauge our radical imagination" and think critically about what this technology is actually used for. Facial recognition is predominantly deployed in the security and law enforcement industries, and it is the newest tool in a long lineage of technology used to oppress people of color. Mutale traces the historic use of technology in Black surveillance from the 18th-century lantern laws (which mandated that people of color walk with a lantern at night to identify themselves) to current technology-enabled policing. She challenges us to imagine a world where we do not need facial recognition of humans at all.
Charlton and Mutale urge individuals and organizations to have the uncomfortable but necessary conversations around race and social structures and do the work to understand the dynamics between race and technology. Charlton has developed a Critical Race & Digital Studies Syllabus of work from scholars of color, which can be used as a launchpad for learning and discussions around the relationship between race and the digital world.
“We have been at this junction before…where we could think about the intersection of race and technology and what the path forward looked like. We made the wrong decision and we’re seeing the repercussions every day. I think our pace of technological advancement is continuing, ramping up in many ways, and so we have a moment to decide again ‘what will our future look like?’” – Charlton McIlwain