What Social Impact Practitioners Are Missing About Artificial Intelligence

The Unintended Consequences of AI Blind Spots Can Be Ominous

Why It Matters

AI systems have massive potential, but issues of bias loom. Massive data sets, when put through various filters and systems, can lead to great harm. Impact-focused leaders must understand what this means and how to spot these harms.

We made it clear in Part 1 that artificial intelligence in Canada is expanding quickly and will have an influence on the HR mix in the world of social impact. For all its benefits, we can't ignore the unintended consequences that arise, or what mathematician and author Cathy O'Neil calls "Weapons of Math Destruction," which is also the name of her book, subtitled How Big Data Increases Inequality and Threatens Democracy. O'Neil has spent time in academia, at a hedge fund, and in a tech startup. She has theoretical and applied experience worthy of our attention, and she is raising the alarm about potentially harmful applications of AI and machine learning.
