Law enforcement use of facial recognition is particularly dangerous because it exacerbates protest policing and political repression, the over-policing of minority communities, and the risk of wrongful identification and wrongful arrest.
Face Recognition and Racial Discrimination by Law Enforcement
Another key source of racial discrimination in face recognition lies in how the technology is deployed.
In 18th-century New York, “lantern laws” required enslaved people to carry lanterns after dark so that they remained publicly visible.
Advocates fear that even if face recognition algorithms are made equitable, the technology could be applied in that same spirit, disproportionately surveilling the same communities.
How much is the government 'facial biometrics' market worth?
The government “facial biometrics” market — which includes federal, state and local law enforcement — is expected to soar from $136.9 million in 2018 to $375 million by 2025, according to an estimate by market research firm Grand View Research.
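For context, those two figures imply a compound annual growth rate of roughly 15 percent. A quick back-of-the-envelope check, assuming simple year-over-year compounding across the 2018-2025 horizon quoted above:

```python
# Implied compound annual growth rate (CAGR) of the cited market forecast.
# Dollar figures are the Grand View Research estimates quoted in the text.
start_value = 136.9   # market size in 2018, $ millions
end_value = 375.0     # projected market size in 2025, $ millions
years = 2025 - 2018   # 7-year horizon

# CAGR = (end / start)^(1/years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 15.5% per year
```

In other words, the forecast assumes the market nearly triples in seven years — growth sustained almost entirely by government and law enforcement demand.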
Inequity in Face Recognition Algorithms
Face recognition algorithms boast high classification accuracy (over 90%), but these outcomes are not universal.
A growing body of research exposes divergent error rates across demographic groups, with the poorest accuracy consistently found for subjects who are female, Black, and 18 to 30 years old.
In the landmark 2018 “Gender Shades” project, an intersectional analysis of three commercial gender-classification systems found error rates of up to 34.7 percent for darker-skinned women, compared with less than 1 percent for lighter-skinned men.