Algorithmic Bias: Better Policy and Practice for Civil Society


Listen to Lucy Bernholz, Virginia Eubanks, Rashida Richardson, and Di Luong discuss the growing use of algorithms in our communities and how civil society organizations can work toward better policy for predictive technologies.


The outputs of pretrial risk assessment algorithms are often correlated with race and can perpetuate decades of racial, ethnic, and economic bias. With funding from a 2016 Digital Impact Grant, Media Mobilizing Project is leading a project to understand pretrial risk assessment algorithms and to illuminate the bias and transparency issues inherent in many of these tools.

To help ensure that these tools benefit those they are meant to serve, the AI Now Institute at New York University is working with civil society organizations and research institutions to challenge government use of algorithmic decision systems. In her latest book, Automating Inequality, Virginia Eubanks gives examples of how data mining, policy algorithms, and predictive risk models affect poor and working-class people in America.

Civil society groups have a critical role in conducting and communicating research on these technologies. This discussion offers a glimpse of a roadmap for building coalitions around digital policymaking.

Featured Speakers

The panel, moderated by Lucy Bernholz, Director of the Digital Civil Society Lab, included Virginia Eubanks, Rashida Richardson, and Di Luong.

Watch the discussion using the media player above, read the transcript, or listen using the audio player below or on the Digital Impact podcast on iTunes.


Here are a few bills that deal with the use of AI, IoT, and private data:


Looking for more information? See these speaker-recommended resources:

• From Virginia Eubanks:
  • Our Data Bodies — a human rights and data justice project that works with local communities to investigate how digital information is collected, stored, and shared by governments and corporations
  • Seattle Surveillance Ordinance — designed to provide greater transparency to City Council and the public when the City acquires technology that meets the City’s definition of surveillance

Have a question or case study to share on the use of predictive technologies in civil society? Comment below and share on social media with #DataDiscrimination. Have an idea for a virtual roundtable? Tell us at hello@digitalimpact.org.

Get the latest from Digital Impact: subscribe to our newsletter, follow us on Twitter @dgtlimpact, or better yet, become a contributor.
