Offering New Techniques to Measure Social Bias in Software

A software fairness tester.


Today’s banks increasingly use software to decide who will get a loan, and courts and hospitals are doing the same. These uses of software make it critical that the software does not discriminate against groups or individuals.

Data-driven software increasingly shapes human behavior: it affects the products we view and purchase, the news articles we read, the social interactions we engage in, and, ultimately, our opinions.

To help detect such bias, computer scientists at the University of Massachusetts Amherst developed Themis, a technique for measuring discrimination in software.

Professor Alexandra Meliou, together with professor Yuriy Brun, said, “Unwanted biases in software run the risk of perpetuating biases in society. For example, prior work has demonstrated that racial bias exists in online advertising delivery systems, where online searches for traditionally-minority names were more likely to yield ads related to arrest records.”

Such behavior reinforces racial stereotypes and can have other grave societal consequences. Themis focuses on measuring causality in discrimination. Because it is built on software testing, it can perform hypothesis testing, asking questions such as: does changing only a person’s race affect whether the software recommends giving that person a loan?
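As an illustration only, and not the authors’ implementation, the Python sketch below captures the core causal question: hold every other input fixed, change a single protected attribute, and check whether the decision changes. The `loan_model` function, the `race` field, and the applicant record format are hypothetical assumptions.

```python
# Minimal sketch of a causal fairness check, assuming a hypothetical
# black-box classifier loan_model(applicant) -> "approve" or "deny".

def decision_flips_on_race(loan_model, applicant, all_races):
    """Return True if altering only `race` changes the model's decision."""
    baseline = loan_model(applicant)
    for race in all_races:
        if race == applicant["race"]:
            continue
        altered = dict(applicant, race=race)  # identical input except for race
        if loan_model(altered) != baseline:
            return True  # the decision depends causally on race
    return False
```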

Brun said, “Our approach measures discrimination more accurately than prior work that focused on identifying differences in software output distributions, correlations or mutual information between inputs and outputs. Themis can identify bias in software whether that bias is intentional or unintentional, and can be applied to software that relies on machine learning, which can inject biases from data without the developers’ knowledge.”

When the researchers tested Themis on public software systems, they found that discrimination can sneak in even when the software is explicitly designed to be fair. Themis discovered that a decision-tree-based machine learning approach specifically designed not to discriminate against gender was in fact discriminating more than 11 percent of the time: for almost 11 percent of individuals, changing only their gender changed the software’s output.
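A figure like that 11 percent can be estimated by sampling inputs and counting how many have their output changed by a gender flip alone. The sketch below is a simplified illustration under that assumption, not Themis itself; `model`, `sample_applicants`, and the attribute values are hypothetical names.

```python
def causal_discrimination_rate(model, inputs, attribute, values):
    """Fraction of sampled inputs whose output changes when only `attribute` is altered."""
    affected = 0
    for record in inputs:
        baseline = model(record)
        if any(model(dict(record, **{attribute: v})) != baseline
               for v in values if v != record[attribute]):
            affected += 1
    return affected / len(inputs)

# Hypothetical usage: a result near 0.11 would correspond to the
# "more than 11 percent" finding reported above.
# rate = causal_discrimination_rate(tree_model, sample_applicants,
#                                   "gender", ["female", "male"])
```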

The researchers also found that designing software to avoid discrimination against one attribute may increase discrimination against others. Such systems learn discrimination from biased data, and without careful control for potential bias, the software can magnify that bias even further.

Themis Project Page: Themis
