DHS Facial Scans: New Street Tech Worries Critics

The deployment of the Mobile Fortify app marks a significant shift in federal law enforcement, moving facial recognition from a “back-office” investigative tool to a real-time, street-level utility. As of February 2026, this technology is being used not just at borders, but in residential neighborhoods and at public protests across the U.S.

How “Mobile Fortify” Works

Developed primarily for Customs and Border Protection (CBP) and Immigration and Customs Enforcement (ICE), Mobile Fortify turns a standard smartphone into a powerful biometric scanner.

  • Real-Time Biometrics: Agents point their phones at a person’s face or even their fingers (for “contactless fingerprints”).
  • The “Super Query”: The app instantly cross-references the scan against massive databases:
    • IDENT/HART: DHS's primary biometric database, containing over 270 million identities.
    • Traveler Verification Service: Photos from passports, visas, and flight manifests.
    • Automated Targeting System (ATS): A system that tracks “high-risk” travelers and individuals.
  • Data Retention: Even if no match is found, or if the person is a U.S. citizen, the photo is often stored in the Automated Targeting System for 15 years.

The Legal and Civil Rights Standoff

The aggressive use of this technology has triggered a wave of litigation and legislative pushback centered on the Fourth Amendment (protection against unreasonable searches).

  • The “Definitive” Match: Lawmakers, including Rep. Bennie Thompson, have raised alarms that ICE agents are instructed to treat a Mobile Fortify “match” as a definitive determination of immigration status, even when the individual presents physical proof of U.S. citizenship.
  • The 100,000+ Scans: A lawsuit filed by the State of Illinois and the City of Chicago alleges the app has been used over 100,000 times in the field since 2025, including on children, without warrants or consent.
  • Intimidation Tactics: Activists in Minneapolis and Portland report that agents use the app to scan the faces of “constitutional observers” and protesters. On February 3, 2026, Senator Ed Markey demanded clarity on reports of a “domestic terrorist” database used to track these individuals.

Accuracy and Bias Concerns

A primary criticism from civil liberties groups like the ACLU is the technology’s documented disparity in error rates across demographic groups.

| Demographic Group | Error Rate (Avg) | Risks Identified |
| --- | --- | --- |
| Light-skinned men | 0.8% | Lowest risk of false identification. |
| Middle-aged white men | ~1.0% | The “baseline” for most training datasets. |
| Darker-skinned women | 34.7% | Highest risk of “false positive” matches. |
| Children/elderly | High | Changes in facial structure lead to frequent misidentification. |
