How is fairness in AI calculated?

AI Fairness
  1. A good fairness foundation helps build public trust, which is necessary for the adoption and use of AI in society.
  2. Building fair and responsible principles into the work behind a system prepares it for compliance with regulations and laws.
AI fairness metrics
KOSA AI fairness metrics
  • balanced data;
  • the 5 responsible AI principles: Accountability, Fairness, Transparency, Safety, and Robustness.
Responsible AI
Read our blog about Responsible AI principles.
  • Individual Fairness: also known as fairness through awareness. According to this metric, classifiers are “fair” if they predict similar outcomes for similar individuals, regardless of sensitive attributes. Individual fairness relies heavily on a heuristic distance metric, which measures how far apart two individuals are; its applicability is therefore limited to domains where a reliable and non-discriminatory distance metric is available.
  • Fairness through Unawareness: in this context, fairness is present if a predictor does not make explicit use of sensitive attributes in the predictive process. This condition is met by any predictor that is not group-conditional. It is a particularly useful approach when it is not possible to specify sensitive attributes and no other background knowledge is available. Overall, it corresponds to countering discrimination by being “blind” to sensitive attributes.
  • Equal Opportunity: this metric posits that all demographic groups should have equivalent true positive rates; in other words, among individuals who truly qualify for a positive outcome, the model should predict that outcome at a similar rate in each group. This idea has an affinity with disparate mistreatment, a criterion that asks for equivalent misclassification rates across groups.
  • Equalized Odds: extending the previous metric, fairness here is achieved if all groups present similar true positive rates and false positive rates. The notion supporting this metric is that individuals are evaluated meritocratically, regardless of the status of their sensitive attributes (e.g., gender or race).
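The group-based metrics above can be computed directly from a confusion matrix per group. The sketch below is a minimal illustration (the helper name and the toy data are our own, not from any particular library): it tallies per-group true positive and false positive rates, then compares them. A small TPR gap indicates equal opportunity; small TPR and FPR gaps together indicate equalized odds.

```python
from collections import defaultdict

def group_rates(y_true, y_pred, groups):
    """Per-group TPR and FPR from binary labels and predictions.

    Hypothetical helper for illustration: y_true and y_pred are 0/1
    lists, groups holds each individual's group label.
    """
    counts = defaultdict(lambda: {"tp": 0, "fn": 0, "fp": 0, "tn": 0})
    for t, p, g in zip(y_true, y_pred, groups):
        if t == 1:
            counts[g]["tp" if p == 1 else "fn"] += 1
        else:
            counts[g]["fp" if p == 1 else "tn"] += 1
    rates = {}
    for g, c in counts.items():
        pos = c["tp"] + c["fn"]  # truly positive individuals in group g
        neg = c["fp"] + c["tn"]  # truly negative individuals in group g
        rates[g] = {
            "tpr": c["tp"] / pos if pos else 0.0,
            "fpr": c["fp"] / neg if neg else 0.0,
        }
    return rates

# Toy data: two groups, A and B (invented for this example)
y_true = [1, 1, 0, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 0, 1, 1, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

rates = group_rates(y_true, y_pred, groups)
# Equal opportunity looks at the TPR gap alone;
# equalized odds requires both gaps to be small.
tpr_gap = abs(rates["A"]["tpr"] - rates["B"]["tpr"])  # 0.5: group B is favored
fpr_gap = abs(rates["A"]["fpr"] - rates["B"]["fpr"])  # 0.5
```

In practice one would set a tolerance (e.g., a gap below 0.05) rather than demand exact equality, since finite samples rarely yield identical rates.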
AI fairness is a team effort





Making technology more inclusive of all ages, genders, and races.