Does Facial Recognition Software Used by Law Enforcement have Bias?

Several concerns exist over the widespread use of facial recognition software by police departments. This new, unregulated technology carries inherent bias: the data sets were not inclusive when they were collected, and the algorithms that serve as the instructions to machine learning platforms reflect the unconscious biases of their creators. At this point, there is no system of third-party assessment or monitoring of this technology for accuracy. The practical result is that police departments disproportionately target African-Americans when they use this unregulated technology.

The Center on Privacy and Technology at Georgetown Law released a study in October 2016 with several key findings related to probable bias in the data sets. African-Americans and young females were less likely to be included in the facial recognition data sets, reducing the systems' ability to recognize members of these groups accurately.

While no one is suggesting that coders intentionally encoded bias into the algorithms that instruct these machine learning systems, the presence of unconscious bias is well documented across socioeconomic strata. In other industries, checks and balances, testing, and third-party assessments have allowed these kinds of unintentional errors to be caught and corrected before systems entered widespread use.

The excitement over the ability of AI and machine learning platforms to perform near-miraculous feats of big data analytics may have led us to rush this technology into use before external agencies could examine the data sets and algorithms for unconscious errors. With so much financial and criminal justice decision-making now left to predictive analytics, financial exclusion and targeted racial profiling are no longer mere possibilities, but regularly documented occurrences in America.

For more information, or if you believe you have been a victim of discriminatory profiling, please contact us.

DISCLAIMER: The information contained in this article is for general purposes only and does not create an attorney-client relationship. Please contact attorney Kirk Anderson for an initial consultation.