June 3, 2016 – Updated with Sheriff’s response
Surveillance cameras have become a ubiquitous part of urban life: lamp posts, traffic signals, building corners, dash cams; they’re impossible to avoid. But in the Hennepin County Sheriff’s Office’s hopes for the future, these networks of cameras—possibly including privately owned cameras—could soon use real-time automatic facial recognition to create a database of everywhere you go.
Since 2013, inmates booked into the Hennepin County Jail in Minneapolis have had three-dimensional maps of their facial features involuntarily enrolled into a facial recognition database using software from Cognitec, a German R&D firm. Facial recognition technology uses algorithms to measure and analyze relative sizes, shapes, and positions of an individual’s facial features for later identification and comparison.
Hennepin County Sheriff Rich Stanek did not respond to a request seeking comment about his department’s use of the technology, but after nine months of delays and litigation over a public records request, the Sheriff’s Office was ordered by a court to provide access to emails, which shine some light on the Sheriff’s current capabilities and future plans.
When conducting an investigation involving an unidentified suspect, Hennepin County Sheriff’s Office (HCSO) investigators can copy photographs, social media images, and still frames from surveillance video into Cognitec’s Examiner software to search its database of faces. Investigative leads developed with facial recognition analysis have already led to arrests and convictions in Hennepin County.
“[The] system is so good I’ve found possible matches that turned out to be close relatives,” wrote an HCSO Criminal Information Analyst in a 2013 email. “It costs a shit-ton … but I love it.” The price tag: about $200,000, including annual maintenance fees and training costs.
The Sheriff wouldn’t say how many individuals have been enrolled into its facial recognition database, but emails and vendor invoices document that the technology was originally purchased to store up to 1.2 million faces, with both adult and juvenile arrestees enrolled into the system.
“This is not an application that I want advertised to anyone other than Sheriff's Office employees.” — HCSO Principal IT Specialist in internal 2014 email
HCSO developed a policy on facial recognition alongside its acquisition of the technology in 2013. While the policy creates no accountability measures such as audits, reporting, secure storage requirements, or information sharing procedures, it does mandate that only images related to a criminal investigation can be processed.
However, the Sheriff’s Office has fielded image face-search requests from outside agencies that do not have their own facial recognition policies, apparently with enough frequency that HCSO created a canned response to include when returning results to outside agencies:
“Attached is a possible investigative lead based on comparison to HenRAP booking photos. If you have any questions or need additional information please contact the Hennepin County [Criminal Information Sharing and Analysis] unit…”
Last year, the Minneapolis Police Department (MPD) denied using, or even discussing the use of, facial recognition technology. Asked again last week, MPD Public Information Officer John Elder confirmed, “We do not have that technology at the MPD.” However, emails show MPD has sent images to the Hennepin County Sheriff’s Office for facial recognition searches. Minneapolis has no policy on use of the technology.
Real-time facial recognition has been described as a pivotal shift toward a mass surveillance society, and it is a proposal that has been discussed within the Hennepin County Sheriff’s Office.
When law enforcement began using automatic license plate readers, privacy advocates expressed concern that police were creating a log of when and where people not wanted for any crime were driving, but law enforcement pointed to situations where having a database of historical vehicle location data helped solve crimes. Ultimately, a bill requiring destruction of data after 60 days was signed into law.
The debate over real-time facial recognition is much the same, except that instead of tracking license plates on cars, law enforcement can track human faces: a network of surveillance cameras that can log every person a camera sees, all without suspicion or a warrant. There are no state laws on facial recognition, let alone real-time facial recognition.
So, does the Hennepin County Sheriff’s Office want to use this technology? Their internal emails suggest as much.
When conducting an investigation, law enforcement often asks for footage from surveillance cameras at nearby businesses. Police must first track down the owner of the camera, which takes time and money. A Minnesota company named Securonet sought to solve that problem with a product called SafeLink, which allows businesses to register the location of their privately-owned security cameras. By subscribing to this service, law enforcement knows where those private cameras are, and who to contact if video footage is needed.
Last year, Securonet took SafeLink a step further with the introduction of VideoLink, giving law enforcement direct access to real-time video streams, touted as “improving situational awareness for law enforcement.”
Both the Minneapolis Police Department and Hennepin County Sheriff’s Office have contracted with Securonet. It has been reported that over 400 local cameras were in the Securonet system, half of which were privately owned, but emails suggest as many as 3,000 additional cameras are on their way into the network.
With a network of hundreds of live video streams in one hand, and facial recognition technology in the other, thoughts of connecting the two weren’t far behind.
In a 2015 email, a Hennepin County Sheriff’s Office employee wrote, “…we need to hook up Securonet with Cognitec and their VideoScan product.” According to marketing documents, VideoScan “detects and identifies persons of interest in real time while computing demographic and behavioral data.”
County emails appear to show Securonet attempting to provide technical details to support such an integration, but the results are unclear. Attempts to obtain clarification from Securonet were not successful.
Improvements driven by economic investment into facial recognition technology have made it more efficient and accurate, capable even of real-time identification of faces in low-resolution, shaky, and blurry videos of crowded streets and stadiums. Facial recognition capabilities can now be embedded into devices such as mobile phones and police body cameras, connecting to databases filled with millions of comparison images.
HCSO staff discussed these and other future possibilities in emails, including behavioral analysis with artificial intelligence and crowd iris scanning.
“A little scary but probably the next step to surveillance.” — HCSO Principal IT Specialist in a 2014 internal email
This future may not be far off: emails discussing surveillance technology repeatedly reference the 2018 Super Bowl at the new U.S. Bank Stadium in Downtown Minneapolis, which could be classified by the U.S. Department of Homeland Security as a National Special Security Event. As locals know from St. Paul’s hosting of the 2008 Republican National Convention, guests may go home, but an influx of federal funding for new law enforcement equipment does not, changing policing for years to come.
State officials balked at the Hennepin County Sheriff’s attempts to “further increase the database size” of faces, which would provide better investigative leads. The Bureau of Criminal Apprehension was resistant to the idea of giving HCSO a copy of the statewide Minnesota Repository of Arrest Photos (MRAP) and Driver’s License and state ID photos. Instead, HCSO took matters into their own hands, circumventing the state and directly approaching neighboring counties, asking for copies of their mugshot databases.
Sheriff Stanek would not say which counties were participating in such an arrangement, but emails with the Washington, Dakota, Sherburne, and Anoka County Sheriff’s Offices indicated interest, and HCSO has since obtained a new vendor quote to upgrade the capacity of its facial recognition database to 2.5 million faces.
Asked about their agency’s cooperation with the program, Anoka County Sheriff Commander Paul Sommer said that while his department does not currently share its mugshots with Hennepin County, “we have been requested to do so and it is a proposal we are looking into.” Anoka County emails show the County’s vendor “…is scheduled to create the new interface to send the data to Hennepin County’s server” alongside its existing connection to state servers.
Dakota County Sheriff Captain Patrick Enderlein said that his department “does not use any form of facial recognition technology” and does “not automatically share any of [its] booking photos with Hennepin County.” However, a Dakota County email suggested 130,000 booking photos of all inmates since 2000 “would be available to add to facial [recognition]” on Hennepin County’s servers.
Sherburne County Sheriff Joel Brott did not respond to a request seeking comment. Emails suggest that HCSO would “ask Sherburne to pay their share” of the costs of facial recognition, and other emails discuss Sherburne County’s “commitment to pay” and state that “Sherburne [is] still working on figures.” Responding to a public records request, a Sherburne County Sheriff’s Office representative said the agency had no contracts, agreements, policies, or invoices on the matter, but has not yet produced their own internal emails.
Assistant Washington County Attorney Rick Hodsdon said the Washington County Sheriff’s Office is currently “not participating” in the Hennepin County facial recognition mugshot sharing program, but confirmed that “there’s contemplation” and stated, “we haven’t made any decisions either way.” Hodsdon said there were no written policies or agreements in place, and that he didn’t see how the program could start without something in writing, citing the complexities of state law on government data practices.
The legal and public policy implications of these arrangements are “something policymakers at the state legislature would have a strong interest in,” said Rich Neumeister, an open-government and privacy advocate.
Neumeister pointed to a provision of state law allowing the exchange of information between law enforcement agencies for investigation purposes, but said the provision was written nearly 40 years ago, long before law enforcement contemplated the bulk surveillance and databases in use today.
An HCSO staffer’s email noted the “political backlash” over use of facial recognition technology that occurred in San Diego, which might be one of the reasons the Sheriff has been so tight-lipped.
In early 2013, a San Diego news station reported on a local agency’s testing of facial recognition technology similar to that used in Hennepin County. The story quoted an ACLU spokesperson who called the lack of public input disturbing, drawing comparisons to a George Orwell novel. The reporter closed the piece by suggesting 1984 was “a piece of fiction no longer inconceivable.”
But in San Diego, law enforcement representatives actually spoke to news media about—and even demonstrated—facial recognition technology.
In May 2015, Minneapolis CBS affiliate WCCO reported that “no local law enforcement in Minnesota regularly use this technology yet.” But blocks away, the Sheriff had been using the technology for over two years, conducting training sessions and demonstrations for law enforcement across the metro area.
Instead of correcting WCCO’s story, a Sheriff’s Office staffer laughed about the story to their facial recognition vendor: “That’s funny … It is in our best interest to stay out of that type of limelight with this technology.”
HCSO is the only local agency known to own facial recognition technology for routine law enforcement purposes, but the Minnesota Department of Public Safety has also used it to perform anti-fraud “scrubs” of Driver’s License and state ID photo databases. In 2010, 1,200 IDs were canceled, and in 2012, there were 23,705 cases of possible fraud where an individual’s face appeared on more than one ID.
Public reaction to use of facial recognition technology seems to depend on how it’s used, and there is apparent sensitivity in the Sheriff’s Office, with internal efforts to re-brand it to sound less sinister. “Is there anything else that this could be named—Identification Image Technology?” asked a high-ranking HCSO Major. “I used the fanciest words I could think of, we’re not supposed to say ‘facial recognition’ now and I have no idea why,” wrote an HCSO Criminal Information Analyst. The Sheriff’s Office settled on “Investigative Imaging Technology.”
No matter the name, to say there are public policy and privacy considerations involved with facial recognition technology—especially automated, real-time facial recognition—would be an understatement.
“History repeats itself,” said Neumeister, who could readily point to numerous incidents over decades of Minnesota law enforcement history, where databases were created and technologies were adopted, only to be later shut down or regulated once the public and legislature found out.
Should law enforcement be permitted to perform real-time automated facial recognition on large networks of interconnected street surveillance networks? Can law enforcement create a database of everywhere a camera caught a glimpse of your face? Who gets access to such a database? Can drones and police body cameras use facial recognition, too? And what about commercial use?
An internal Hennepin County Sheriff’s Office email concedes major problems.
Indeed, jail mugshots are traditionally public information, so can corporations get three-dimensional maps of arrestees’ faces and enroll them into shopping mall security systems? Facial recognition is used today in casinos to keep out cheaters, in shopping malls to keep out shoplifters, and even in schools and churches to track attendance.
These sound like questions for the state legislature to tackle, a grueling prospect when law enforcement isn’t talking.
“Is Sheriff Rich Stanek here today? Sheriff Rich Stanek?” beckoned Civil Law Committee chair Rep. John Lesch at a hearing about government use of surveillance technology in 2014, hoping to hear about HCSO’s use of KingFish, a cell phone tracking and exploitation device.
The Sheriff didn’t show.
“The rapid pace [of] data collection and advancing technology is creating new datasets faster than we can classify them,” wrote Rep. Lesch in a committee presentation later that year, “Vast amounts of government data is being collected that can enhance or restrict the lives of Minnesotans.”
In what has become routine in recent years, Minnesota law enforcement has pushed the envelope on adoption of new mass surveillance and data-gathering technology, but lawmakers finding out about the new technology was “rarely due to disclosure by government agencies themselves,” Rep. Lesch wrote.
Hennepin County emails show that law enforcement complains of state law being “antiquated” and “built backwards” while simultaneously withholding vital information from lawmakers. As Rep. Lesch wrote, “The legislature is responsible for oversight … [but] law enforcement agencies have no incentive to volunteer information about their operations to the legislature.”
Data is worthless if kept in a silo. Corporations have for years combined datasets for better decision-making through the use of machine learning algorithms. And now, government is catching up.
Last month, the Chicago Tribune’s editorial board commended the use of “big data” to create “smart cities” for improved government efficiency and quality of life, but the board expressed some reservations on privacy: “Not only do we need assurances that this technology is not about surveillance, we need the transparent means to double-check those assurances.”
The City of Minneapolis recently adopted IBM’s Intelligent Operations software, tracking over 1,250 metrics, including crime, housing complaints, road repair, parking restrictions, and utility data. The City would be remiss to not use its data to best serve taxpayers, but when these metrics include individually-identifiable data collected through mass surveillance programs, City Hall may begin to feel oppressively omnipresent to some.
In 2004, the Minneapolis City Council authorized installation of red-light traffic cameras throughout the city, a move intended to improve public safety that instead resulted in years of litigation reaching the Minnesota Supreme Court. Opponents of the program, dubbed the “PhotoCop ordinance,” successfully argued that the City was attempting to define a camera as a “police officer,” eliminating due process protections guaranteed by state law. While other cities across the country actively ticket drivers through red-light camera programs, Minneapolis agreed to pay out $2.6 million in citation refunds to 15,000 drivers.
Red light cameras of a decade ago are child’s play as compared to software like IBM’s Intelligent Video Analytics. Authorities can connect surveillance camera systems and train the software to detect events like illegal U-turns, trespass, or loitering. Combined with facial recognition and license plate readers, it’s not a stretch to imagine criminal citations and complaints being automatically filed with the courts within seconds of a person allegedly committing a crime.
Predictably, the Transportation Security Administration (TSA) is investing heavily in the development of new biometric technologies: nearly $1 billion has gone toward a behavior and facial expression detection program intended to eventually “develop a unique and operationally relevant dataset to research and test automated video analytic solutions for hostile intent detection and tracking.” But while developing the criteria for the program, TSA officers estimated 80% of those singled-out have been minorities.
And therein lies another problem with this technology: can law enforcement trigger or influence predictive policing based on dark skin color, facial features that indicate Middle Eastern descent, or the presence of religious clothing?
Data doesn’t lie, as they say, but algorithms are only as equitable as the human programmers who create them. Such was the concern of the authors of a White House report released last month on the intersection of civil rights and ‘big data’:
“The algorithmic systems that turn data into information are not infallible—they rely on the imperfect inputs, logic, probability, and people who design them. Predictors of success can become barriers to entry; careful marketing can be rooted in stereotype. Without deliberate care, these innovations can easily hardwire discrimination, reinforce bias, and mask opportunity.”
“It is all really scary.” — HCSO staffer, referring to a Washington Post article about facial recognition technology
As Minnesota and other states adopt biometric technologies, the federal government eyes getting its own copy of the data. The FBI’s Next Generation Identification (NGI) program replaces federal fingerprint databases to become a repository of even more biometric characteristics, including iris scans, palm prints, tattoo recognition, and faceprints. A 2015 Minnesota Department of Public Safety email indicates Minnesota would not be immediately participating in the Interstate Photo System, but appears to accept it as an inevitability.
A coalition of 45 civil liberties and human rights organizations authored a letter last week expressing concern over the FBI’s program as the government seeks to exempt the database from transparency laws:
“It runs on a database holding records on millions of Americans, including millions who have never been accused of a crime. While the database is partially built using mugshots and arrest records submitted by state and local law enforcement agencies, it also includes the fingerprints and photos of people getting background checks – and people applying to become permanent residents or naturalized citizens.
As a result, the NGI system may not affect everyone equally. Instead, it likely includes a disproportionate number of African Americans, Latinos, and immigrants. This is a problem from a technical perspective, as a body of research – including research authored by FBI personnel – suggests that some of the biometrics at the core of NGI, like facial recognition, may misidentify African Americans, young people, and women at higher rates than whites, older people, and men, respectively.
The likely disparate impact of NGI is not limited to facial recognition. Arrest records in NGI often fail to indicate whether a person was convicted, acquitted, or if charges against them were dropped.”
Which brings us back to Hennepin County: as metaphorical firehoses of arrest and face data flow across the region into Sheriff Stanek’s elected office, what precautions are being taken to ensure accuracy, equity, and civil rights protections in these programs and databases as the Hennepin County Sheriff’s Office overtakes the state as the go-to agency for this technology?
Prior to publication of this story, the Sheriff’s Office declined to provide insight into any successful uses of facial recognition, its plans for the future, its policies, or its sharing of face-scan data.
Instead, after publication, the Sheriff’s Office responded with a Facebook post titled “Facial Recognition: The Real Story,” providing an example where the technology was used on a robbery suspect, the first known public disclosure by the Sheriff’s Office of its three-year run using this technology.
The Sheriff’s post disputes no facts in this story, but rather says the agency has no intention of using real-time technologies to “monitor the actions of residents at street corners.”
But the emails the Sheriff’s Office produced—be they explorations or direct intentions, be they for a one-time Super Bowl event or everyday use in Downtown Minneapolis—do in fact discuss and take steps toward potential implementation of the technology, and make clear that use of the technology is ripe for policymaking discussion.
As to other questions posed to the Sheriff regarding data sharing, accountability, and costs, no comment was provided. These, among many other questions, remain unanswered as the Sheriff’s Office defends its program as complying with the law—in a state with no laws on facial recognition technology.
These technologies aren’t fantasy. They’re here to stay, as is law enforcement’s evasion of transparency and oversight, a problem that will persist as long as public concern battles police secrecy.
Revelations of the Hennepin County Sheriff’s facial recognition program, described publicly for the first time here, did not come easily or quickly. Hennepin County’s compliance with the Minnesota Government Data Practices Act—the state’s freedom of information law—was secured only through complex, time-consuming, and expensive litigation.
If Sheriff Stanek was able to keep his agency’s facial recognition program out of the news for three years, does this stonewalling foreshadow the future?
The Hennepin County Attorney’s Office withheld some data as “security information” and other law enforcement agencies have withheld data as “trade secret information.” As policing transitions from one-on-one human interactions to lenses, sensors, datacenters, vendors, and algorithms, freedom of information litigation may become the public’s only way to see sunlight.
It will take an unabashedly adversarial news media, with great attorneys, to ensure government remains transparent—and the public remains informed—as we forge into the future.
I’d like to thank Scott Flaherty of Briggs and Morgan, P.A. for representing me pro bono in Webster v. Hennepin County and the Hennepin County Sheriff’s Office, which is now at the Minnesota Court of Appeals. The full court docket, updates, and larger sets of data produced by the County are being posted here.