Amnesty toolkit will help researchers uncover algorithmic discrimination in the public sector

Amnesty International has released a new Algorithmic Accountability Toolkit to help rights defenders, community organizations, and journalists shed light on harms and discrimination caused by algorithmic decision-making “in the public sector realms of welfare, policing, healthcare and education.”

Algorithms are increasingly driving automated decision-making within government bodies, which raises the risk of exclusion from essential services and limits individuals' ability to challenge decisions made about them, Amnesty said.


Amnesty outlines multiple avenues for researchers and organizations to investigate the use of algorithms in government, ranging from Freedom of Information requests to Subject Access Requests. The latter allow individuals to access the personal data a particular organization holds about them.

Procurement information, such as Requests for Proposals, Memoranda of Understanding and loan agreements, can also uncover the technology supply chain that contributes to the development of an algorithm, Amnesty said. 

Amnesty also provides guidance for researchers to audit algorithms themselves using machine learning and statistical analysis, while recognizing that not all community organizations have access to this technical expertise.

The toolkit emphasizes the importance of participatory research methods and community-led data collection, particularly those that involve the people most affected by algorithmic decision-making.


Author

Sharlene has been reporting on responsible business, environmental sustainability and technology in the UK and Canada since 2018. She has worked with various organizations during this time, including the Stanford Social Innovation Review, the Pentland Centre for Sustainability in Business at Lancaster University, AIGA Eye on Design, Social Enterprise UK and Nature is a Human Right. Sharlene moved to Toronto in early 2023 to join the Future of Good team, where she has been reporting at the intersections of technology, data and social purpose work. Her reporting has spanned several subject areas, including AI policy, cybersecurity, ethical data collection, and technology partnerships between the private, public and third sectors.
