
A fight over facial recognition is dividing Detroit — with high stakes for police and privacy

Detroit’s police chief says facial recognition is making the city safer. Some residents fear it could create a surveillance state.
Image: Gas station owner Nasser Beydoun
Nasser Beydoun, a gas station owner in Detroit, has seen the benefits of linking his business with a police surveillance network but worries about police overreach with facial recognition technology. Anthony Lanzilote / for NBC News

DETROIT — When the Detroit police department proposed a network of high-definition surveillance cameras that would stream live video from gas stations, liquor stores and other all-night businesses to a central police command center, Nasser Beydoun was one of the first to sign up.

In early 2016, Beydoun, owner of a Marathon gas station on the city’s west side, enthusiastically paid $6,000 to install cameras around his station and connect them with the crime-fighting effort, called Project Green Light. Since then, he said, crime at his gas station has fallen and revenue has climbed.

“We don’t have the trouble that we used to have,” he said. “There’s an element that used to come to the station to cause problems that no longer shows up.”

But now, as Project Green Light has expanded to 578 locations across the city, Beydoun is among the Detroiters grappling with the news that police can zero in on anyone who is filmed — including customers who are simply pumping gas — and collect personal information about them.

Image: A green light is illuminated on a McDonald's in Detroit
A green light signifies that a McDonald's is connected with Detroit's Project Green Light surveillance network. Anthony Lanzilote / for NBC News

Detroit police officials say they’re only using facial recognition technology to identify suspects in violent crimes — not to spy on ordinary citizens. But, in a city that is about 80 percent black, with a sizable population of Middle Eastern, Asian and Latin American immigrants, critics have blasted the police for using technology that has been shown to be more likely to misidentify people with dark skin, without fully explaining it to the public.

“What happens when this software misidentifies one single person that doesn’t have the resources for a good legal defense?” asked Willie Burton, an elected member of Detroit’s Board of Police Commissioners, a civilian body that oversees the police department. “Detroit is the poorest, blackest city in America. It should be the last city where we start implementing facial recognition.”

Police commissioner Willie Burton arrested during meeting over facial recognition technology
Willie Burton, a member of Detroit's Board of Police Commissioners, is arrested during a heated meeting where facial recognition technology was discussed. WDIV

The debate over facial recognition in Detroit has grown so heated that Burton was dragged out of a recent police commissioners board meeting in handcuffs and charged with disorderly conduct after the board's chairwoman complained that he was interrupting the meeting. (The charge was later dropped.) The oversight board, Michigan's state Legislature and the Detroit City Council are all weighing proposals to restrict or ban the police use of facial recognition. The oversight board's vote could come in the next few weeks.

This debate has put Detroit at the center of an escalating national conversation about high-tech crime-fighting tools that are promoted as beneficial to the public but may lead to more intrusive forms of government surveillance. Facial recognition is a policing tool that was uncontroversial two years ago but is now so contentious that several cities, including San Francisco, have preemptively banned its use by police, and Sen. Bernie Sanders, a Democratic presidential candidate, recently called for a national ban.

In Detroit, though, police have been using the technology for more than two years. That, combined with the city’s high crime rate, its demographics and the expansive network of Project Green Light cameras, has made the discussion here remarkably combative — and closely watched across the country.

Detroit police Chief James Craig is alarmed by what he calls “hysteria” over tools he believes are making one of the nation’s most dangerous cities safer. He says trained police analysts verify the identity of suspects before they’re arrested, so computer misidentification isn’t an issue. And he warns that if lawmakers stop police from using the technology, Detroiters will suffer — specifically the victims of violent crimes.

James Craig
Detroit Police Chief James Craig. Carlos Osorio / AP file

“It would be tragic,” he said. “This is about safety. This is about the victims. This is about identifying violent suspects. This is not about Big Brother and taking that man who just left that store and we got facial recognition running and say, ‘Oh, that’s Mr. Jones who was arrested six years ago.’ We don’t do that. That’s absolutely wrong.”

Beydoun hasn’t attended the raucous community meetings this summer, but as a Lebanese immigrant and a Muslim, he says he has reason to worry the government could target him or his neighbors. He chairs Detroit’s Arab-American Civil Rights League, which has joined 11 other local civil rights groups to call for a ban on facial recognition technology.

He’s also a business owner who relies on the police to keep his customers and employees safe. That puts him in a difficult place, struggling along with his neighbors and fellow business owners to figure out if police surveillance has gone too far.

Image: Project Green Light cameras monitor a Mobil gas station in Detroit
Project Green Light cameras monitor a Mobil gas station in Detroit. Anthony Lanzilote / for NBC News

“If somebody has committed a crime, you want to be able to assist,” Beydoun said as he sat in his gas station’s office beside a monitor streaming 16 live video feeds from inside and outside the station, creating a record that police could access.

At the same time, he said, “I’m against anything that creates a surveillance state.”

‘It was so dangerous’

Detroit last year reported its lowest number of homicides in more than 50 years (261 in 2018), but its murder rate is still twice as high as Chicago’s and more than 10 times higher than New York City’s.

As Craig sees it, the police need every tool available to make the city safer. He says the technology he’s put into place in recent years, including Project Green Light, is a key element in his strategy.

Image: Project Green Light evolved from a 2014 initiative that put flashing green lights in several city gas stations
Project Green Light evolved from a 2014 initiative that put flashing green lights in several city gas stations. Anthony Lanzilote / for NBC News

The project’s thousands of cameras — widely recognized by flashing green lights — transmit round-the-clock video from stores, schools, restaurants and community centers to the police department’s downtown headquarters.

Project Green Light evolved from a 2014 initiative that put flashing green lights in several city gas stations. Called Project Lighthouse, it was meant to turn gas stations into neighborhood safe havens where people could use the bathroom, change a tire or get other kinds of help at night. Several of those Project Lighthouse locations, including Beydoun’s gas station, then became the first businesses to install the cameras in early 2016.

At that point, Detroit was a little more than a year removed from the largest municipal bankruptcy in American history. The city had endured years when a stretched-thin police department struggled to respond to calls promptly, and broken street lights left large swaths of the city in deep darkness at night.

Moussa Bazzi, whose three gas stations on Detroit’s east side were among the first to join Project Green Light, said his customers were scared of criminals who targeted gas stations, smashing car windows and stealing valuables.

Image: Moussa Bazzi
Moussa Bazzi, who owns three gas stations on Detroit's east side, said he trusts police to use facial recognition technology. Anthony Lanzilote / for NBC News

“It was so dangerous,” he said. “There were people that would not come to any gas station in the city of Detroit.”

Now, he said, customers see the green light and feel safer.

Police credit the cameras with a 60 percent reduction in carjackings since 2015.

“People know that if they see a green light, that they’re on camera,” Beydoun said. “If they do something stupid, the police are able to see it.”

Critics of high-tech policing, however, note that there are other ways of reducing crime.

The 12 civil rights organizations that wrote a joint letter opposing facial recognition argue that the city should instead focus on reducing blight, improving schools and creating economic opportunities.

The city should make “real changes in our neighborhoods to improve public safety rather than in finding new ways to use technology to police our neighborhoods from afar,” they wrote.

‘It was not a national conversation’

The early adopters of Project Green Light had no reason to think the images gathered on their cameras would be analyzed with facial recognition software. It wasn’t until 2017 that the police department spent just over $1 million to purchase the powerful software, called FACE Watch Plus, made by DataWorks Plus, which gives police the ability to scan as many as 100 surveillance cameras in real time.

The police department received City Council approval for the purchase, but it attracted little attention at the time. That changed several months ago, when a report from Georgetown Law’s Center on Privacy & Technology zeroed in on a Project Green Light camera installed outside an abortion clinic. The report warned that with facial recognition systems like the one Detroit police had purchased, “all people caught on camera — passersby, patrons, or patients — are scanned, their faces compared against the face recognition database on file.”

To patients going into and out of the clinic, the camera “sounds less like a guarantee of safety and more like an invasion into a deeply personal moment in their lives,” the Georgetown researchers wrote.

Craig blasted that report’s assertions about how facial recognition could be used in Detroit, calling it “absolutely a lie.” He says the police have used the software just 500 times in the past two years, always to match screen grabs or photos of suspects with existing databases of mug shots.

Opponents are skeptical. They question why the department purchased such powerful tools if it wasn’t planning to use them to their full capacity. And they note that Craig did not seek guidance from the police oversight board until this year.

“We can’t just be expected to trust they are going to do what they say they are going to do given the fact that they had this for over a year and haven’t approached the Board of Police Commissioners about any policy,” Rodd Monts, campaign outreach coordinator for the American Civil Liberties Union of Michigan, said. “I don’t think they should expect folks to trust them based on their track record.”

Craig said his initial plan was to activate the software's real-time facial recognition feature, which would scan cameras across the city in search of a particular person, only rarely, such as in the event of a credible terrorist threat. He has since backed down from that possibility in the face of public opposition. His latest policy proposal to the oversight board would limit the technology to still photographs connected to violent crime investigations, and only after several layers of approvals within the department. He's also pledged not to share images with U.S. Immigration and Customs Enforcement for immigration investigations or to use facial recognition to identify people at political events like protests.

But Amy Doukoure, a staff attorney from the Michigan chapter of the Council on American-Islamic Relations, an organization that advocates for American Muslims, says it’s hard to believe him.

“We can't rely on them to do the right thing because they didn't do the right thing in the first place,” she said.

The chief says he didn’t initially go before the police oversight board because he didn’t realize the issue would be controversial, and he didn’t see it as a significant change to department practice.

Back in 2017, “it was not a national conversation,” he said.

‘You’ve got to have a balance’

When Detroit’s police have used facial recognition technology, they’ve gotten a match about a third of the time, and some of those cases have led to arrests, Craig said. That includes charges filed in June in a triple homicide in which a shooter wearing a mask opened fire at a house party, killing two gay men and a transgender woman. The accused shooter pleaded not guilty at his arraignment in June, and the case is ongoing.

Without facial recognition, Craig said, that case "would have been a whodunit," because no one at that party could give a description of the shooter's face. The arrest was only possible, he said, because one of the surviving victims reported meeting the accused shooter earlier in the night at a Project Green Light gas station. The victim didn't know the man's name, but police got a clear image of his face.

Lakeita Carter, a sister of Paris Cameron, one of the victims killed at the party, has spoken in support of the technology.

“If y’all wouldn’t have had that, at that gas station when that guy walked in, they would still be trying to figure out who the killer was,” she told the Board of Police Commissioners at a meeting this month.

But Carter was in the minority among those who have addressed the board in recent weeks. Most railed against facial recognition’s privacy implications and “techno racism.”

Craig, who attends the meetings, repeatedly implores his critics to take a tour of the real-time crime center where the technology is in use.

There, visitors stand beneath scores of live video feeds and watch a police analyst demonstrate how he used the technology to identify a man who is suspected of opening fire in a convenience store last fall.

Crime Analyst Breanna Lingo monitors video from Project Green Light Detroit locations
Crime analyst Breanna Lingo monitors video from Project Green Light locations. Project Green Light

The analyst runs a surveillance image of the man’s face through a database of mug shots and pulls up 170 possible matches.

The first match — the one the computer identified as the most likely choice — looks nothing like the suspect. The same is true of the next dozen images, until the analyst comes to someone who looks like the shooter. Then he does additional research, pulling up more mug shots, the man’s driver’s license photo and his social media pages. The analyst points to images found on the man’s Facebook page, which show him wearing a sweatshirt like the one the shooter was wearing, and sitting on a car like the one used on the night of the crime. (Detroit police have issued an arrest warrant for the man but are still looking for him.)

The critics of facial recognition aren’t wrong when they say the technology is unreliable, Craig said, but police aren’t only relying on the technology. They’re getting a result and then analyzing it. The computer match is “only where it begins,” he said.

Some Detroiters, like Bazzi, believe him.

“I don’t think the Detroit police will take it too far,” Bazzi said. “If they have to solve a crime, who wouldn’t support them?”

But Beydoun is worried about where the technology could lead. In 2014, he sued the federal government, alleging that his rights were violated when he was wrongly placed on the government's "selectee list," which he said led to extra screening at airports. The suit was unsuccessful, with courts ruling that missing flights did not violate his constitutional rights, but the experience has made him wary of government surveillance.

Image: Nasser Beydoun was one of the first to sign up for Project Green Light
Beydoun is worried about where facial recognition technology could lead. Anthony Lanzilote / for NBC News

“We need to be vigilant to make sure these policies protect innocent people,” Beydoun said.

Still, he was proud that his cameras had picked up a suspect in a neighborhood shooting several years ago.

“They were able to get a clear shot, which put him at the scene of the crime,” he said.

But what if the police didn’t know the name of that suspect? What if they needed facial recognition to potentially get a violent criminal off the street?

Beydoun paused, considering the question.

“I think I would say no,” he said. “I would say use your other police tools.”

Any decisions about surveillance technology should not be made by police alone, he said, but rather through open discussion, weighing safety against freedom.

“You’ve got to have a balance,” he said.