Misguided Evidence

Conclusions can only be as reliable, and as neutral, as the data they are based on; biased data therefore produces biased conclusions.

An example is the COMPAS algorithm, used in the US Criminal Justice System to estimate the likelihood of recidivism. An investigation by ProPublica revealed that, because the data it is trained on is biased against Black people, the algorithm generally outputs higher risk scores for Black defendants than for White defendants.

Tools for Mitigating Bias

Debiaswe – Removing Gender Stereotypes

This methodology comes from the work of Bolukbasi et al. (2016), Man is to Computer Programmer as Woman is to Homemaker? It modifies word embeddings to remove gender stereotypes while preserving their useful semantic structure.
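The core "neutralize" step of this approach can be sketched as follows. The toy vectors and the single he/she pair below are illustrative assumptions: the paper derives the gender direction via PCA over ten definitional word pairs, and additionally applies an "equalize" step to pairs like grandmother/grandfather.

```python
import numpy as np

# Toy word vectors (illustrative only; real embeddings come from
# word2vec/GloVe together with the debiaswe repo's word lists).
emb = {
    "he":         np.array([ 1.0, 0.2, 0.3]),
    "she":        np.array([-1.0, 0.2, 0.3]),
    "programmer": np.array([ 0.4, 0.9, 0.1]),
}

# 1. Estimate the gender direction. Here it is simply the normalized
#    difference he - she; the paper uses PCA over several such pairs.
g = emb["he"] - emb["she"]
g = g / np.linalg.norm(g)

# 2. Neutralize: remove the component of a word vector along g,
#    so gender-neutral words carry no gender information.
def neutralize(w, g):
    return w - np.dot(w, g) * g

debiased = neutralize(emb["programmer"], g)
# The debiased vector is orthogonal to the gender direction.
print(np.dot(debiased, g))
```

After neutralizing, "programmer" is equidistant from "he" and "she" along the gender axis, which is exactly the property the paper's analogy test ("man : programmer :: woman : ?") exploits.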

Get involved!

AI for People is open to collaborations, funding, and volunteers to help these projects reach a more mature stage. Help us make them a reality!
