Civil liberties group warns about face recognition system considered by Florida law enforcement departments

Surveillance camera installed before the Republican National Convention in downtown Tampa. By Seán Kinane / WMNF News (2012).

Civil liberties advocates are warning about powerful face recognition software that could be coming to a police department near you.

The Tampa Bay Times is reporting that about a dozen law enforcement agencies in Florida have tested or purchased a facial recognition system from a company called Clearview AI.

WMNF interviewed Nate Freed Wessler, a staff attorney at the American Civil Liberties Union. He’s with the ACLU Speech, Privacy, and Technology Project.

“Face recognition technology is a real threat, a categorically new and disturbing kind of threat to our privacy. This is technology that police purport to be able to use to identify people quickly when there would be no other way to figure out who they were.

“But the technology raises a couple of really important concerns. The first is that it has serious error rates. Numerous independent tests, including by respected academics and by federal government agencies, have shown that facial recognition technology fails to pick the right match a significant portion of the time. And those failure rates are even worse for darker-skinned people — people of color, in other words — for women, and for younger people. That’s partly because of bad training data that was used for some of these algorithms and partly because of inherent limitations in the camera technology.

“As a result, we are in a situation where police are using these powerful and invasive tools, potentially, to start investigating the wrong people. To arrest them, to prosecute them. And that raises really serious questions.

“But even in a hypothetical world where this technology worked accurately 100% of the time — which I think is impossible, but let’s imagine that — this would still be dangerous. This is technology that, if left unchecked, would give police the power to identify anyone and everyone at a political protest, everyone at a house of worship, at an Alcoholics Anonymous meeting, and anywhere else. It’s technology that’s extraordinarily chilling because it threatens to end our ability to maintain any semblance of privacy as we go about our lives.”

As the Tampa Bay Times reported, about a dozen Florida law enforcement agencies are trying out, or have bought access to, the facial recognition system from Clearview AI. Where does Clearview get its images?

“Clearview has said that it has amassed more than three billion, with a B, face scans. In other words, it has taken three billion photos of people, scanned those faces, and dumped them into a giant database that it then searches with its face recognition technology.

“Those photos come off the internet from scraping of our social media profiles, scraping of other websites that have our names and our photos linked to them. Nobody who is using Facebook or Instagram or Twitter or any other corner of the internet did that with an expectation that some private company is going to grab all of their photos, create an artificial-intelligence-based face scan, and then use it as a giant engine for police to search and look for us anywhere we might go.”

Are there other things that this company could be using these images for besides sharing with police?

“We know from this company’s marketing materials that it is trying very aggressively to sell its system to police departments. It’s doing that with no safeguards. In most places, there are no legal limitations that have yet been passed by legislators or by city councilors about face recognition technology because it’s so new. And so there really are nowhere close to sufficient guardrails on the police side.

“This company is also selling it to other private companies, security companies, and others. Which means that there are myriad dangers in this technology. Not just that the wrong people are going to be investigated, arrested, and prosecuted, but that they will be subject to other kinds of privacy violations by private companies.

“There’s a particular danger of a private company like Clearview running this kind of system and having access to all of the images in its database and to the images that police departments upload when they’re investigating.

“The New York Times reporter who published the first blockbuster story about Clearview’s system last month writes about how she asked a few local police officers who had access to the system to run a test with her photo so she could see how it worked. When one of those officers did that, he got a call very soon after from employees of Clearview asking why he was running the face of a New York Times reporter through their system. That means that there are employees of this private company looking over the shoulder of law enforcement investigators during what are supposed to be confidential and secure investigations. That is ridiculous. And for those reasons and many more, this technology is terrifying, this technology is dangerous, and this technology has no place in our society.”

What do you know about Florida and its use of facial recognition software by police?

“We know from recent reporting that a number of police departments in Florida have at least been testing out the Clearview system. But this is not the first time police in Florida have used face recognition technology… far from it.

“More than a decade ago, the Pinellas County [Sheriff’s] Department got a federal grant to develop its own face recognition system. And over the years it has made that system available to virtually every local, county, and state law enforcement agency in the state. That gives police a tremendous ability to violate people’s privacy. And perhaps most disturbingly, it’s been mostly happening in secret.

“There is an epidemic across the state of police and prosecutors using that face recognition system to try to identify suspects and then prosecute them, while hiding that information from people who are accused of crimes, from their defense attorneys, from judges, and from other people in our criminal justice system.

“Without transparency there can’t be accountability, and there can’t be a chance for people to prove that, in fact, there was a false match or some other error in the system. And we know these systems are riddled with errors. That’s dangerous and it’s disturbing. And the prospect of police supercharging those efforts with Clearview’s technology raises extraordinary concerns.”

