Detroit Police Implements New Limits on Facial Recognition

By Matteo Rossi · 2 min read

The Detroit Police Department (DPD) has adopted new rules restricting its use of facial recognition technology (FRT), following a settlement with Robert Williams, a Black man wrongfully arrested in 2020 after the technology misidentified him. The settlement does not ban FRT outright, but it bars the DPD from making arrests based solely on facial recognition results and from conducting photo lineups built only on FRT leads.

Williams was arrested in front of his family after software falsely matched his expired driver's license photo to an image of a shoplifter. The ACLU, which helped secure the settlement, has highlighted at least two other wrongful arrests of Black individuals facilitated by FRT. Under the new rules, arrest warrants require independent evidence beyond an FRT match, officers must be trained on the technology's racial biases, and all cases since 2017 in which FRT was used to obtain an arrest warrant will be audited.

Key Takeaways

  • Detroit Police Department adopts new rules limiting facial recognition use after a wrongful arrest settlement.
  • Police cannot arrest or conduct lineups solely based on facial recognition results.
  • New policies require independent evidence linking suspects to crimes for arrest warrants.
  • Detroit PD to undergo training on facial recognition's racial bias and audit past cases.
  • Settlement ensures facial recognition cannot replace basic investigative work.

Analysis

The DPD's constraints on facial recognition technology (FRT) after the Williams settlement draw attention to the technology's racial bias issues. The immediate impacts include heightened scrutiny on FRT deployments and the potential for legal challenges against other police departments. In the long run, the broader adoption of FRT may slow down, compelling tech firms to address bias in algorithms. Financial implications could affect companies like Amazon, which has faced demands to halt police use of its Rekognition software. The settlement underscores the necessity for comprehensive FRT regulation and ethical AI development.

Did You Know?

  • Facial Recognition Technology (FRT):
    • Facial recognition technology is biometric software that identifies or verifies a person from their facial features. It works by measuring distinctive characteristics such as the distance between the eyes, the depth of the eye sockets, the shape of the cheekbones, and the width of the nose, then comparing those measurements against enrolled images. FRT is widely used in security systems and appears in smartphones, surveillance cameras, and access control systems.
  • Racial Bias in Facial Recognition Technology:
    • Racial bias in facial recognition technology refers to the tendency of these systems to perform less accurately on individuals of certain racial backgrounds, particularly those who are Black or of African descent. This bias can stem from the datasets used to train the algorithms, which may not adequately represent diverse populations. Consequently, these systems can produce false matches or fail to identify individuals correctly, leading to issues such as wrongful arrests or security breaches.
  • Settlement in Legal Context:
    • In a legal context, a settlement is an agreement reached between two or more parties before, during, or after a court case to resolve a dispute without further litigation. Settlements can involve financial compensation, policy changes, or other actions to rectify the situation. In the case of the Detroit Police Department and Robert Williams, the settlement resulted in new rules limiting the use of facial recognition technology and requiring additional training and auditing to address racial biases in its application.
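The matching step described above can be sketched as an embedding comparison: a face is reduced to a numeric vector, and a "match" is any enrolled vector whose similarity to the probe clears a threshold. The sketch below is illustrative only, with hypothetical names and tiny 3-dimensional vectors standing in for the high-dimensional embeddings a trained neural network would produce. It also shows why a lone above-threshold hit is an investigative lead rather than an identification: similar-looking faces can clear the same threshold.

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_candidates(probe, gallery, threshold=0.9):
    # Return every enrolled identity whose similarity clears the threshold.
    # Note: this can return several names, or the wrong one.
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]

# Hypothetical enrolled gallery (names and vectors are invented for illustration).
gallery = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
    "person_c": [0.88, 0.15, 0.32],  # embedding close to person_a's
}
probe = [0.89, 0.12, 0.31]

print(match_candidates(probe, gallery))  # two identities clear the threshold
```

Here the probe matches both `person_a` and `person_c`, which is exactly the failure mode behind the new DPD rules: the system narrows a list of candidates but cannot, by itself, establish who was actually present.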
