Booker Introduces Bill Banning Facial Recognition Technology in Public Housing
Bill follows Booker’s effort to survey law enforcement agencies using facial recognition software
WASHINGTON, D.C. – U.S. Senator Cory Booker (D-NJ) introduced legislation barring the use of surveillance technology, such as facial recognition, in federally funded public and assisted housing. The No Biometric Barriers to Housing Act prohibits the use of facial recognition and other biometric identification technology in public housing units, an alarming trend that poses significant privacy risks for vulnerable communities. It would also require the Department of Housing and Urban Development (HUD) to send Congress a detailed report on how such technology is being used.
Congresswomen Yvette Clarke (D-NY), Ayanna Pressley (D-MA), and Rashida Tlaib (D-MI) introduced companion legislation in the U.S. House of Representatives in July.
“Using facial recognition technology in public housing without fully understanding its flaws and privacy implications seriously harms our most vulnerable communities,” Booker said. “Facial recognition technology has been repeatedly shown to be incomplete and inaccurate, regularly targeting and misidentifying women and people of color. We need better safeguards and more research before we test this emerging technology on those who live in public housing and risk their privacy, safety, and peace of mind.”
The use of facial recognition technologies in public housing could deeply harm access to fair and affordable housing. This emerging technology could serve as the basis for denying building access or even for unjust trespassing arrests, particularly for members of vulnerable and marginalized communities. There is also significant concern that data collected through biometric identification technology could be shared with outside organizations without tenants' knowledge.
This bill follows Booker’s efforts to better understand how law enforcement agencies across the country are using facial recognition software. Last July, along with Senator Ron Wyden (D-OR), Booker surveyed 39 federal law enforcement agencies about their use of facial recognition technology and what policies, if any, they have put in place to prevent abuse, misuse, and discrimination. That same month, he also joined Senators in requesting that the Government Accountability Office study commercial and government uses of facial recognition technology and examine what safeguards companies and law enforcement have implemented to prevent misuse and discrimination based on race, gender, and age. And in October 2018, Booker secured language in the FAA Reauthorization Act requiring the TSA to report to Congress on methods to eliminate bias based on race, gender, and age as it begins deploying facial recognition technology for domestic flights.
Background on Booker’s record fighting algorithmic bias:
Booker was one of the first lawmakers to call for increased antitrust scrutiny of major tech companies, and he has consistently fought to ensure that existing discrimination and bias in our society do not become augmented and automated in the future.
Last month, Booker submitted public comments opposing a proposed HUD rule that would dismantle vital federal protections for victims of housing discrimination. The proposed rule would allow housing providers to avoid liability for algorithmic discrimination if they use algorithmic tools developed by a third party. With this proposal, HUD, the nation’s main enforcer against housing discrimination, effectively provides a road map for landlords and lenders who wish to discriminate by algorithm.
Earlier this year, along with Senator Wyden and Congresswoman Clarke, Booker introduced the Algorithmic Accountability Act, which would require companies to fix discriminatory algorithms and would outline methods the federal government can use to mitigate the impacts of such algorithms.
Last year, he secured a first-of-its-kind commitment from Facebook CEO Mark Zuckerberg to conduct a civil rights audit at Facebook. Among other things, the audit is meant to address algorithmic biases on Facebook’s platforms.
###