Why Some Cities Are Banning Facial Recognition Technology

A handful of US cities have banned government use of facial recognition technology over concerns about its accuracy and its implications for privacy. WIRED's Tom Simonite talks with computer vision scientist and lawyer Gretchen Greene about the controversy surrounding the use of this technology.

Released on 08/06/2019

Transcript

Have you noticed that facial recognition

is everywhere at the moment?

It's how we unlock our phones, tag people on Facebook,

and at some US airports, it's being used to speed up

the process of boarding an aircraft.

But there are also concerns about this technology

from lawmakers and civil rights groups.

I'm here with Gretchen Greene, a computer vision expert

and lawyer, who's going to help us understand

this technology. Thank you for joining us, Gretchen.

Hi Tom, great to be here.

The uses that are really driving

the public debate right now, are around

the government uses, and so we've seen

that the city of San Francisco where we are now

recently banned use of facial recognition

by its agencies, Oakland across the bay,

did so too, and so did Somerville, Massachusetts.

It's very unusual for a government at any level

to completely ban a technology. What's driving this?

Right, so one of the uses, or maybe the primary use

that I think drives that is law enforcement's use,

and so besides at the borders, it's also local government.

You've seen it on TV episodes,

where on crime shows, they just sort of search,

and it's like, oh, that's who that is.

One of the issues is how that intersects

with overall surveillance possibilities.

The thing which is unprecedented right now is the number

of surveillance cameras, both private and public.

If you were to connect those into

a network where it was easy to get that data

in a continuous way, and then you combine it

with tools like facial recognition

that could allow the automated processing of the data,

you could track everyone, potentially, all of the time,

backwards to as far as you had surveillance data.

Now we're not there, but if you've got a camera

there's still the question of can the police

get a search warrant, or can they just ask you for it.

Tell us a little bit about this technology,

so how does it work and how are these systems being made?

[Gretchen] Ultimately, what it does

is it's finding patterns.

And there's multiple layers that are finding

different kinds of patterns, but you can imagine

with your eyes that if you really simplified it,

so you zoomed way out and you had this blurry image,

there's kind of a dark band right here.

So that's in one of the layers, one of the patterns

that the model is picking up, and then you have to give it

examples of pictures that you've labeled,

either it can be a yes or no, there's a face in the picture,

there's not, or it can be where the face is,

or it can be who the face is, and it might be millions

of examples that you need to give it

for it to figure out what are the patterns

that it should be looking for.
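
To make that concrete, here is a minimal, hypothetical sketch in Python (PyTorch) of the kind of layered, pattern-finding model described above, trained on labeled yes/no examples of "there's a face in the picture." The network, the random placeholder images, and the labels are illustrative assumptions, not anything from an actual facial recognition system.

```python
# A toy sketch, not any real vendor's system: a layered model that learns
# patterns from labeled images. Random tensors stand in for a real dataset
# of "face" / "no face" examples.

import torch
import torch.nn as nn

class TinyFaceDetector(nn.Module):
    """A toy two-layer convolutional network for the yes/no 'is there a face?' task."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # early layer: edges, dark bands
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1),  # later layer: larger facial patterns
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(16, 2)  # two outputs: "face" vs. "no face"

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Placeholder "labeled training data": 64 grayscale 32x32 images with 0/1 labels.
images = torch.randn(64, 1, 32, 32)
labels = torch.randint(0, 2, (64,))

model = TinyFaceDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training loop: show the model labeled examples until it picks up useful patterns.
# A production system would do this with millions of real, labeled photographs.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```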

Where does that data come from?

Well it depends actually on who's using it.

Where does government get images,

so I would say government databases,

departments of motor vehicles, State Department records,

those kinds of records are starting to be assembled

into databases that the FBI for instance

has access to, not all states but some.

Where does big tech get images from as a primary source?

Well people post them all the time.

I said what you would need is a picture of someone

and a label that said who is this.

So if you put up a picture or you see a picture

and you say oh, well this is my friend Jack,

and this is my friend Sue, now you've provided them

that labeled training data that they can use

for their facial recognition algorithms.

It's also gotten much more accessible,

so it's not only the big tech companies

with deep pockets and a lot of scientists

that can do this sort of thing, so it's

been proliferating, absolutely.

They're showing up in a lot of different places,

commercial, all kinds of uses.

How are they being used?

Private security, so it's being used

at large events. There was a report in China

about someone who had an outstanding warrant

or the equivalent going to a big concert

and being identified by facial recognition.

It's being used for advertisements,

a closely related technology is emotion recognition,

which can be done in a lot of ways,

but one of them is through looking at the face.

So if you can see how someone reacts

to an electronic billboard outside a store,

you can change the electronic billboard

to try to make it more enticing to them.

It's also being used in education

with robot-human interactions, having a robot

that has a nice personality, but trying to interact,

and so it has to understand when someone is looking at it.

Yeah, so a really broad span, and lots of them

seem fairly innocuous, some of them

I guess may seem a bit unusual,

we don't expect billboards to scan our faces

or look at us. So why does the government

want this technology, and in law enforcement, for example,

what benefits do they see from having the ability

to search faces in a database?

Law enforcement is interested in tools

to help them do their job better.

If you've got a database full of pictures

and their identities, and then you have a picture

of somebody from the 7-Eleven that got robbed.

Right now you would take that picture and pass it around

to the officers who do work in that area,

you'd maybe go around the neighborhood,

say does anybody know who this is.

Well one, that'll take longer,

but the other thing is, maybe nobody knows.

And maybe that picture is somewhere in this database,

but your database has 10,000 pictures in it.

It'll take a very long time to go one by one.

It'll matter probably less in the case

of a 7-Eleven that you find them quickly,

but the longer that it takes to solve a case,

the less likely that it is to be solved,

and there are other kinds of crimes, kidnappings,

where it's very important to solve it quickly.
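
The database search described above can be pictured as comparing a numerical "faceprint" (embedding) of the probe image against every stored faceprint and returning the closest candidates. Below is a minimal, hypothetical NumPy sketch of that idea; the 10,000-entry gallery, the 128-dimensional embeddings, and the names are placeholder assumptions, since real systems compute such embeddings with a trained network.

```python
# A minimal sketch of the database-search idea: rank every stored faceprint
# by its similarity to the probe image's faceprint. Random vectors stand in
# for embeddings that a real system would compute from photographs.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gallery: 10,000 identities, each with a 128-dimensional embedding.
gallery = rng.normal(size=(10_000, 128))
names = [f"person_{i}" for i in range(10_000)]

# Probe: the embedding computed from the surveillance still.
probe = rng.normal(size=128)

def top_matches(probe, gallery, names, k=5):
    """Rank gallery entries by cosine similarity to the probe embedding."""
    gallery_norm = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    probe_norm = probe / np.linalg.norm(probe)
    scores = gallery_norm @ probe_norm
    best = np.argsort(scores)[::-1][:k]
    return [(names[i], float(scores[i])) for i in best]

# Instead of passing a photo around one officer at a time, the whole search
# runs in a fraction of a second, which is why both the usefulness and the
# error rates of these systems matter so much.
for name, score in top_matches(probe, gallery, names):
    print(name, round(score, 3))
```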

That raises a question that a lot of civil liberties

groups ask, which is about, what if the facial recognition

software is wrong, what if it misidentifies someone,

and I've heard concerns that there'll be

different error rates for different

demographics and different communities.

Right, so there's two kinds of concerns to have.

What if it's right, should we be using something

in a certain way at all, and then what if it's wrong,

even if it's only wrong sometimes,

what are the effects of it being wrong.

It changes how the police officer

will react, which could be good or bad.

We want the officer to be safe,

but they will be more likely to take more extreme action,

and to think that they're in danger,

when maybe they're actually not.

So for instance if it's a misidentification,

there is research out of MIT, where dark skin

is not as likely to be correctly identified as light skin,

and that is particularly true for women with dark skin.

There are also general concerns about

the effects on anyone from any community

of just knowing that the government

may be tracking your face, may be watching you.

Even if I trust the government,

I do care, I would rather live in a world

where I feel like I have some privacy,

even in public spaces, that not all is known,

because if people know where you are,

you might not go there, you might not do those things,

even though they're things that are the bedrock of what we think

people in this country should be able to do.

For instance, coming out as gay

is less problematic professionally now

than it was in the US, but still potentially

problematic, and so if an individual

wants to make the choice when to publicly disclose that,

then they don't want facial recognition technology

identifying that they are walking down the street

to the LGBTQ center, so there are kinds of membership

issues around belonging to certain groups in society,

where we're not, as a government

or as a society, really trying to stop

certain kinds of actions, we're not trying to stop

people from going to church, we're not trying

to stop them from going to community centers.

But we will if they are afraid of

what the implications will be in an environment

that is hostile to, for instance,

a certain ethnicity, or a certain religion.

So really it's very difficult to opt out

of either government facial recognition

or commercial, it sounds like.

It is very difficult, and that's one reason

that it is more controversial than some other things

like fingerprints would be, because it can be done

at a distance when you don't know it's being done,

and in a way that's very difficult to opt out of.

And some people might think that, well,

there would be some kind of privacy laws,

or something that might restrict ideas like that,

I mean, is there any regulation at the federal

or state level that specifically

regulates facial recognition?

We're seeing more happening on the local levels,

state or city, right now, than at the federal level.

The federal government has not said,

we are regulating this and therefore

local governments cannot, and then because of that

we're seeing a patchwork of states and cities

thinking about doing something.

I wonder what do you think the future looks like?

Are we in a period that we will look back on

and say, wow, this technology was really unfettered

back then, but now we have some protections?

Is that where things are going,

or is it too late and this is just how it's gonna be?

There aren't that many companies

where facial recognition is their core business,

and it's not deeply embedded in what the government is doing

and how we're functioning. Even if it were,

I don't think it would be necessarily too late

to say as a society, what are the implications

and the effect that government use of technologies

that can be used for surveillance,

like facial recognition, can have on other rights.

So freedom of speech, expression, religion,

do we want this or not, this is a choice that we're making.

And there are a number of ways to say that

we don't wanna make that choice,

or that we do. We should decide as a society.

Thank you for joining us Gretchen,

it'll be interesting to see how all this plays out.

Good to be here Tom.

[electronic music]
