Facial recognition remains largely ungoverned - and dangerous - in Minnesota

Facial recognition technology is everywhere. We use it to unlock our phones, prove our identity before boarding a plane, and send money from our virtual wallets to buy groceries and baseball tickets. Because this technology is nearly unavoidable in today’s world, it is logical to assume that it is rigorously monitored and regulated. It is not. In fact, the technology is deeply flawed and mostly unregulated.  

Technology does not exist outside of the biases and racism that are prevalent in our society. Studies show that facial recognition is least reliable for people of color, women, and nonbinary individuals. And that can be life-threatening when the technology is in the hands of law enforcement.  

Facial recognition automates discrimination.  

The ACLU-MN is fighting to end law enforcement's use of facial recognition technology in Minnesota. More than a dozen large cities, including Minneapolis, Boston, and San Francisco, have banned the technology. If Minnesota bans it, it will be the first state to do so.

Here’s why the ACLU-MN will fight this legislative session to ban facial recognition tech:  

  1. It gives authorities blanket, indiscriminate surveillance power to track you.  
  2. It is inaccurate and intensifies the racial and gender biases that already exist in law enforcement, leading to disparate treatment.  
  3. It can be used to target and identify vulnerable groups, such as immigrants and refugees.  
  4. It can be used to track your personal movements, including visits to abortion clinics or drug treatment centers. 
  5. It violates our constitutional rights.  

Indiscriminate Surveillance  

Minnesota state law controls how data is collected, created, stored, used, and released by the government. However, data collected through facial recognition technology is not covered by the Government Data Practices Act.  

People generally do not opt in to being tracked by facial recognition technology; there is no way to give or withhold consent. And yet, law enforcement can use this software without appropriate checks and balances. "It's like walking around with your driver's license stuck to your forehead," said ACLU-MN Policy Associate Munira Mohamed.  

Racial and Gender Biases  

Studies show that facial recognition technology is biased. According to "Gender Shades," a 2018 study by Joy Buolamwini and Timnit Gebru published by the MIT Media Lab, the error rate for light-skinned men was 0.8%, compared to 34.7% for darker-skinned women. A 2019 test by the federal government concluded that the technology works best on middle-aged white men; accuracy rates were far lower for people of color, women, children, and elderly individuals.  

Law enforcement and the criminal justice system already disproportionately target and incarcerate people of color. Using technology that has documented problems with correctly identifying people of color is dangerous.  

The ACLU-MN has an appalling firsthand example here in Minnesota: We sued on behalf of Kylese Perryman, an innocent young man who was wrongfully arrested and jailed based solely on an incorrect facial identification. 

The injustice does not stop there. A 2020 ACLU study found that because Black people are more likely than white people to be arrested for minor crimes, their faces and personal data are more likely to end up in mugshot databases. That makes them more likely to be misidentified as suspects and forced to sit in lineups, despite the documented inaccuracy of these methods of identification. The ACLU also found that police surveillance cameras are disproportionately installed in Black and Brown neighborhoods, which further exacerbates systemic racism.  

It is a common assumption that technology is unbiased and thus infallible. This simply isn't true. All technology is created by people, and people are biased.  


Targeting Vulnerable Groups  

It’s no secret that the Department of Homeland Security and its sub-agencies ICE and Customs and Border Protection have already committed horrific abuses – the ACLU sees the tragic consequences. With facial recognition, these agencies could potentially pinpoint the location of immigrants across the country, marking them for detention and deportation on an unprecedented scale. 

In 2017 alone, ICE, DHS, and other government agencies used this technology to locate and arrest 400 family members and caregivers of unaccompanied migrant children, separating families and leaving children in detention. 

Constitutional Rights Violations  

The First Amendment ensures the right to protest and make your voice heard. Facial recognition technology could have a chilling effect on democracy: People may decide not to protest out of fear of being identified and documented. 

The Fourth Amendment protects against unreasonable searches and seizures, and it’s getting an unnecessary workout. Faulty facial recognition already has led to searches and arrests of innocent people.  

“The criminal justice system can really trap people into false arrest, false identification,” said Mohamed. “You have to prove that you're innocent because a computer said that you're guilty." 

Facial recognition technology isn’t all bad. It has, for example, helped in missing persons cases and in identifying victims of natural disasters and crashes. 

But technology is developing much faster than the laws we need to protect our rights. Anyone with a smartphone or social media account knows how rapidly technology changes, while legislating can take a long, long time. It is essential to proactively put policies in place that will protect our rights as new and more advanced technology is released.  

Especially when that technology – and the law enforcement officers using it – have been shown over and over to be biased against marginalized groups.  

Help Combat Facial Recognition Abuse