San Francisco has become the first US city to ban the use of government facial recognition ID systems.
The move is the result of the city's Board of Supervisors voting 8-1 in favor of a bill called the Stop Secret Surveillance Ordinance. This prevents the city government - including the police - from using facial recognition technology on the public.
Set to take effect in a month's time, the bill also requires San Francisco city agencies to get board approval before adopting other surveillance technology, such as license plate scanning systems, and establishes audits of surveillance systems already in use.
Supervisor Aaron Peskin, who led the bill, said: "This is not really the Stop Secret Surveillance ordinance, it's really an ordinance that is about having accountability around surveillance technology. And with the narrow exception of the facial recognition technology, this is actually not designed to stop the use of any technologies that we currently employ or may use in the future."
Instead of issuing an outright block on surveillance technology, Peskin said the bill is in place to "ensure the safe and responsible use of surveillance technology" by government departments.
In a statement praising the passing of the new bill, the American Civil Liberties Union of Northern California said: "We applaud the San Francisco Board of Supervisors for bringing democratic oversight to surveillance technology, and for recognizing that face surveillance is incompatible with a healthy democracy".
The ACLU added: "The law sets a strong standard for public safety in the digital age. We encourage other communities to say no to face surveillance, and to put rules in place to make sure technology works for the people, not against them."
Those in favor of the technology say that - once it works reliably and accurately - it will aid policing efforts and make the general public safer.
Joel Engardio, vice president of Stop Crime SF, said in a statement: "We agree there are problems with facial recognition ID technology and it should not be used today. But the technology will improve and it could be a useful tool for public safety when used responsibly. We should keep the door open for that possibility. Especially when facial recognition technology can help locate missing children, people with dementia and fight sex trafficking. We are disappointed there was not an exception for large public events."
The ruling in San Francisco comes as cities around the world are considering how facial recognition could be used to aid policing - and as pressure groups rally against the technology, amid concerns of its failings, inaccuracies, and breaches of privacy and civil freedom.
While smartphone features like Face ID on the iPhone work with a high degree of accuracy and reliability, facial recognition systems used to pick out faces in a crowd are notoriously unreliable. A 2018 test of such technology on New York City's Robert F Kennedy Bridge couldn't detect a single face "within acceptable parameters," according to a Metropolitan Transportation Authority email obtained by the Wall Street Journal.
It was recently reported in the UK that facial recognition technology used by the Metropolitan Police in London misidentified members of the public as potential criminals 96 percent of the time. Across eight trials carried out in London between 2016 and 2018, 96 percent of the results were false positives: innocent people wrongly flagged as criminals after the system compared them with photos in a police database. Two tests outside a major shopping mall in west London in 2018 had a 100 percent failure rate.
After the technology was found to produce racial bias, privacy rights group Big Brother Watch said it "must be dropped immediately".