San Francisco supervisors vote to ban facial recognition software
The San Francisco Board of Supervisors voted 8-1 today to approve the Stop Secret Surveillance ordinance, which bans city departments from using facial recognition software or retaining information obtained through facial recognition systems. A second reading and vote will take place at a May 21 Board of Supervisors meeting to officially approve or reject the ordinance, according to the city clerk’s office.
Supervisor Catherine Stefani, the sole vote against the ordinance, said amendments failed to address her questions and concerns related to public safety. If the ordinance passes, San Francisco will become the first city in the United States to outlaw the use of facial recognition software by city departments, including the San Francisco Police Department.
“The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring,” the ordinance reads.
Ordinance author Supervisor Aaron Peskin called facial recognition a “uniquely dangerous technology,” citing the use of facial recognition software to track the Uighur population in western China and an ACLU test of Amazon’s Rekognition that falsely matched 28 members of Congress with mugshots.
Peskin called the ordinance an attempt to balance security with guarding against a surveillance state.
“This is really about saying we can have security without being a security state. We can have good policing without being a police state,” he said.
The legislation, which amends city administrative code, will require city departments to create surveillance technology use policies. Departments must also submit annual surveillance reports explaining how they use devices like license plate readers, drones, or sensor-equipped streetlights.
Acquisition of new surveillance technology will require approval by the Board of Supervisors, and if new tech is approved, city departments will be required to adopt “data reporting measures” to “empower the Board of Supervisors and the public to verify that mandated civil rights and civil liberties safeguards have been strictly adhered to.”
Several human rights, privacy, and racial justice organizations supported the ordinance, citing deadly interactions with police that have occurred in the San Francisco Bay Area in recent years.
A group that includes the ACLU of Northern California, Asian Law Alliance, the Council on American-Islamic Relations, Data for Black Lives, Freedom of the Press Foundation, and the Transgender Law Center sent a joint letter in support of the ordinance last month.
During a Rules Committee meeting last month, many group members cited audits of facial recognition software that found it less accurate at recognizing women and people of color. That same criticism has been frequently lobbed at companies like Amazon and Microsoft, which in the past year have tested or sold their facial recognition AI to law enforcement and government agencies.
Others who spoke in support of the ordinance expressed fear that such tech could be misused not just by local police, but by the Department of Homeland Security’s ICE, which detains people in the United States who lack a visa, citizenship, or a green card. San Francisco is a sanctuary city.
The ordinance provides no specific definition of the public input city departments should seek about the use of surveillance tech; it states only that public hearings must take place.
The group behind the joint letter opposed an exemption for the San Francisco County Sheriff’s Department and District Attorney, which applies if either can provide the City Controller with written justification for why new surveillance tech is necessary to carry out prosecution. Exemptions can also be made in life-threatening exigent circumstances.
Some people oppose the ordinance as written out of concern that video or information obtained from private cameras that deploy facial recognition software could not be shared with police without approval.
More than a dozen letters were sent to the Board of Supervisors by members of the group Stop Crime SF requesting an amendment to portions related to sharing video with police.
“Many in our residential and commercial neighborhoods have private security cameras whose video footage is readily available to the SFPD to support their efforts to catch criminals, especially auto burglars and package thieves. Supporting the SFPD is the primary, if not the only, reason why we have these private video cameras,” local resident Peter Fortune said in a letter.
The Stop Secret Surveillance ordinance was first proposed in January by Supervisor Aaron Peskin. Cosponsors include Supervisor Shamann Walton, who represents the historically African American Bayview-Hunters Point neighborhood, and Supervisor Hillary Ronen, who represents the historically Latinx Mission District.
The passage of the ordinance comes at a time when a number of government bodies are forming their own policies for the acquisition or use of AI systems.
A bipartisan group of U.S. Senators last week resubmitted the AI in Government Act, aimed at creating federal standards. The U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) is also exploring the formation of federal standards, as directed by President Trump’s American AI Initiative executive order.
Outside the United States, the European Commission recently enacted an AI ethics pilot program, and the World Economic Forum will convene its first Global AI Council later this month.
Only 33 of 193 United Nations member nations have enacted national AI policies, according to FutureGrasp, an organization working with the U.N.