
Is Big Data a Bigot? How Big Data Can Lead to Unintentional Discrimination

Discrimination based on one’s age, sex, race, sexual orientation, handicap, or other personal characteristics is wrong. Not only is it wrong, but in many cases it is illegal. For example, it is illegal in most developed nations to deny employment based on race or to deny access to housing based on one’s sex. In most cases, it’s easy to tell when a person or agency has engaged in unfair discrimination: for example, when a company hires an unreasonably low percentage of minority workers, or when women in an organization receive far fewer promotions than their male counterparts.

Usually, discrimination is the result of a bad attitude or bias against a group of people, such as people of color, women, or those with handicaps. But in the age of Big Data, an entirely new and completely unintentional form of discrimination is emerging, and if you’re using data analytics to drive your operations or marketing efforts, you need to be aware of it so that you can help prevent it.

When Data Causes Discrimination

Data cannot understand concepts like race, religion, or age, but analytics built on that data can still inadvertently discriminate against these groups.

How can data discriminate? It has no intention or feeling, and therefore cannot hold a bias. Yet in many cases, data does discriminate. For instance, The Princeton Review, a company that offers test-preparation services, uses a geographic pricing model, based on data analytics, to price its services. Under this model, areas with large Asian populations tend to be quoted higher prices for online SAT tutoring. Though there is no indication that The Princeton Review deliberately targeted Asians with higher prices, its algorithms unintentionally led to charging Asian customers more for those services.
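
To make the mechanism concrete, here is a minimal sketch of the proxy effect, with entirely invented ZIP codes, prices, and customer records (an illustration, not The Princeton Review’s actual model): a pricing rule that never sees ethnicity can still produce ethnically skewed quotes, simply because ZIP code correlates with ethnicity.

```python
import pandas as pd

# Toy customer records; the pricing rule below never looks at ethnicity.
customers = pd.DataFrame({
    "zip_code":  ["94002", "94002", "10301", "10301", "60614", "60614"],
    "ethnicity": ["asian", "asian", "asian", "white", "white", "white"],
})

# Prices keyed only off ZIP code (e.g., derived from local demand data).
zip_price = {"94002": 7200, "10301": 6600, "60614": 4800}
customers["quoted_price"] = customers["zip_code"].map(zip_price)

# Because ZIP code correlates with ethnicity, average quotes still diverge:
print(customers.groupby("ethnicity")["quoted_price"].mean())
# asian    7000.0
# white    5400.0  -- skewed outcomes with no explicit intent anywhere.
```

No one in this sketch ever wrote “charge Asians more,” yet the outcome is exactly that. This is why removing the protected attribute from a model is not, by itself, a defense.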

Can companies using big data analysis be held legally accountable for the unintentional discrimination of their algorithms? Yes. For many years, lawyers have turned to the concept of “disparate impact,” which holds that a policy can be considered discriminatory, even if there was no intent to discriminate, when it in fact leads to discriminatory outcomes. In other words, discrimination is against the law even if it wasn’t done on purpose.
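
The disparate-impact doctrine has a common quantitative companion in U.S. employment law: the EEOC’s “four-fifths rule” of thumb, under which a group selected at less than 80% of the rate of the most-favored group is often treated as evidence of possible disparate impact. Below is a minimal sketch with hypothetical counts; the rule is a screening heuristic, not a bright-line legal test.

```python
def disparate_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's selection rate."""
    return (selected_a / total_a) / (selected_b / total_b)

# Hypothetical outcomes: offers extended to two demographic groups.
ratio = disparate_impact_ratio(selected_a=30, total_a=100,   # group A: 30%
                               selected_b=60, total_b=100)   # group B: 60%
print(f"impact ratio: {ratio:.2f}")  # -> 0.50

# Under the four-fifths heuristic, a ratio below 0.8 warrants investigation.
if ratio < 0.8:
    print("below 0.8 -- potential disparate impact; investigate")
```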

Types of Discrimination That May Be Caused by Big Data

Tragically, big data has already led to Flickr images of black men being auto-tagged as non-human. Additionally, some algorithms have associated traditionally “black-sounding” names with higher rates of criminal activity. Data can unintentionally discriminate against people based on their race, income level, level of education, sex, or almost any other attribute.

Say, for instance, that marketers use data analytics to avoid sending certain offers to zip codes associated with an ethnic group because that group is deemed unable to afford the products or unworthy of credit. Or perhaps the algorithms fail to extend the same discounts to those without college degrees, or extend them only to white males, or only to people under the age of 55. Even if the business did not mean to discriminate, it can still be held liable, both legally and ethically, for discriminatory practices.

How to Ensure Your Data Analysis Isn’t Causing Discrimination

It’s easy to see how this issue can be a legal and public relations minefield. What can you do to steer clear of any actual or unintentional discrimination with your big data efforts?

• Be mindful of the issue when building algorithms. When you are separating people based on income, education levels, zip codes, or other criteria, be sure your algorithms aren’t inadvertently singling out any particular race, sex, age group, etc.
• Check the results against known factors. After the analysis is complete, check the data for known or obvious discrimination. For instance, did it exclude or single out zip codes with a high population of immigrants?
• Self-audit your algorithms, analysis, and results. Find and fix any unintentional discrimination. If you do find some unintentional bias, be forthcoming, transparent, and honest about the results and what you’ve done to fix the problem. A minimal example of such an audit follows this list.
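
Concretely, a self-audit can start by comparing positive-outcome rates across groups for every protected attribute (or likely proxy, such as zip code) you can join to your decision log. Below is a minimal sketch assuming a pandas DataFrame of decisions; the column names, data, and 0.8 threshold are illustrative assumptions, not a legal standard.

```python
import pandas as pd

def audit_outcomes(df, outcome_col, protected_cols, threshold=0.8):
    """Flag groups whose positive-outcome rate falls below `threshold`
    times the best-off group's rate for the same attribute."""
    flags = []
    for col in protected_cols:
        rates = df.groupby(col)[outcome_col].mean()  # outcome rate per group
        top = rates.max()
        for group, rate in rates.items():
            if top > 0 and rate / top < threshold:
                flags.append((col, group, round(rate / top, 2)))
    return flags

# Hypothetical decision log: 1 = offer sent, 0 = offer withheld.
decisions = pd.DataFrame({
    "offer_sent": [1, 1, 0, 0, 1, 0, 1, 1],
    "age_band":   ["<55", "<55", "55+", "55+", "<55", "55+", "<55", "<55"],
    "zip_code":   ["94002", "60614", "94002", "94002",
                   "60614", "94002", "60614", "60614"],
})

# Both the "55+" age band and ZIP "94002" get flagged for review here.
print(audit_outcomes(decisions, "offer_sent", ["age_band", "zip_code"]))
```

A flagged group is a prompt for human review, not proof of wrongdoing; the point is to surface skews in your own results before regulators or customers do.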

As the true power and potential of big data become clear, there will likely be many issues like this to discover, discuss, and address. In the end, big data analytics could help end discrimination once and for all.
