ICE and CBP’s Face-Recognition App Can’t Actually Verify Who People Are

The US Department of Homeland Security (DHS) has deployed a facial recognition app, Mobile Fortify, intended to identify immigrants and citizens alike during federal operations. The technology does not actually verify identities, however; it returns candidate matches that a human must then confirm.
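
In practice, a face search of this kind compares a probe photo against a gallery of enrolled images and returns a ranked shortlist of lookalike candidates rather than a yes-or-no identification. The sketch below is a minimal illustration of that idea only; it is not Mobile Fortify's or NEC's actual pipeline, and the embeddings, threshold, and function names are hypothetical.

```python
# Minimal 1:N face-search sketch (illustrative only; not Mobile Fortify's
# or NEC's actual code). A probe embedding is compared against a gallery
# of enrolled embeddings and the best-scoring candidates are returned.
import numpy as np

def candidate_matches(probe, gallery, threshold=0.75, top_k=3):
    """Return up to top_k (person_id, score) pairs whose enrolled
    embeddings are most similar to the probe, above the threshold."""
    scores = []
    for person_id, embedding in gallery.items():
        # Cosine similarity between the probe face and this enrolled face.
        sim = float(np.dot(probe, embedding) /
                    (np.linalg.norm(probe) * np.linalg.norm(embedding)))
        scores.append((person_id, sim))
    ranked = sorted(scores, key=lambda s: s[1], reverse=True)
    # A shortlist of "maybes" with scores -- not a confirmed identity.
    return [(pid, sim) for pid, sim in ranked[:top_k] if sim >= threshold]
```

The output is a list of scored possibilities; deciding whether any of them is actually the person standing in front of the agent is the human step the software cannot perform.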

According to records reviewed by WIRED, Mobile Fortify has been used over 100,000 times since its launch in May 2025. The app was designed to determine or verify the identities of individuals stopped or detained during federal operations and was initially approved without public scrutiny.

The technology relies on algorithms developed by NEC Corporation of America, which have been shown to struggle with poor framing and head tilt when images are captured outside controlled settings such as the immigration lanes at ports of entry. Street photos snapped on cell phones lack the controls of the tightly framed visa photos used at those ports.

DHS officials have claimed that Mobile Fortify is designed to operate at scale under imperfect conditions, and NEC's own patents acknowledge a core trade-off between speed, scale, and accuracy. Experts say this approach invites false positives, and they criticize the lack of transparency around how the technology operates and how it is used.
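
The trade-off is easy to see in miniature: the looser the match threshold, the more "hits" a system produces under degraded street conditions, and the more innocent lookalikes it flags along the way. The simulation below uses made-up score distributions purely to illustrate that relationship; it says nothing about the actual accuracy of NEC's algorithms or Mobile Fortify.

```python
# Toy threshold sweep (illustrative only; the distributions are invented,
# not measurements of NEC's algorithms). Genuine pairs are photos of the
# same person; impostor pairs are photos of different people.
import numpy as np

rng = np.random.default_rng(0)
genuine = rng.normal(loc=0.80, scale=0.08, size=10_000)
impostor = rng.normal(loc=0.55, scale=0.10, size=10_000)

for threshold in (0.75, 0.70, 0.65):
    hit_rate = (genuine >= threshold).mean()               # true matches returned
    false_positive_rate = (impostor >= threshold).mean()   # wrong people flagged
    print(f"threshold={threshold:.2f}  hit rate={hit_rate:.1%}  "
          f"false-positive rate={false_positive_rate:.1%}")
```

Lowering the threshold raises the hit rate, but the false-positive rate climbs with it, and even a small percentage applied across more than 100,000 scans would surface a large number of people as someone they are not.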

The app has been used to scan the faces not only of targeted individuals but also of people later confirmed to be US citizens and of others who were observing or protesting enforcement activity. Reporting has documented federal agents telling citizens that they were being recorded with facial recognition and that their faces would be added to a database, all without their consent.

DHS has declined to detail the methods and tools its agents use, despite repeated calls from oversight officials and nonprofit privacy watchdogs. The department's use of Mobile Fortify is part of a broader shift toward low-level street encounters followed by biometric capture such as face scans, with little public documentation of how the tool works or when it may be used.

Experts argue that the lack of safeguards and oversight has real-world consequences for privacy, civil liberties, and civil rights. Senator Ed Markey says DHS has deployed an "arsenal of surveillance technologies" to monitor citizens and non-citizens alike, calling it "the stuff of nightmares."

The use of facial recognition technology has been criticized as a threat to free expression and civil liberties, with some arguing that privacy safeguards are essential to preventing wrongful targeting by unvetted tools like Mobile Fortify.
 
🤔 this whole thing is just so...human, you know? We're all caught up in the idea of security, of knowing who's who, but at what cost? 🤷‍♂️ it's like we think we can control everything with these fancy tech tools, but really, we're just playing a game of cat and mouse. 💻 they deploy this app, and suddenly, our faces are being scanned, recorded, and stored without our consent. that's not freedom, that's surveillance. 🚫 and what about the false positives? do we really want to live in a world where an algorithm's guess carries more weight than human judgment? 🤯 it's like we're trading our humanity for convenience. 📊 we need to slow down, take a step back, and think about what we're getting ourselves into with this kind of technology. are we just giving up on our right to be unknown? 🤔
 
I'm literally shook by this news 🤯. The idea that our faces can be scanned without consent just to identify people during federal operations is so messed up 🚫. I mean, what's next? Random biometric scans at the grocery store or something 😂? No, seriously though, this raises some major concerns about surveillance and civil liberties. We already live in a world where our data is being collected and used for all sorts of purposes, but facial recognition tech is just another way to get under people's skin 🤔.

I also don't like that the app was deployed without public scrutiny 🔍. What if there are issues with accuracy or bias? We can't just assume it's perfect because some fancy algorithms were used 🤷‍♀️. And the fact that it's being used to scan not just targeted individuals but also people who were just observing protests is just creepy 👻.

I'm all for security and safety, but we need to make sure our rights aren't getting trampled in the process 💪. We should be having a national conversation about this stuff, not just ignoring it behind closed doors 🗣️. This is exactly why we need more oversight and transparency around biometric tech 👊
 
This whole thing is super unsettling 😬. I mean, how much power do we give the government when it comes to tracking our faces? It's one thing to use facial recognition for national security purposes, but this feels like a whole different story 🤔. The fact that they're using an app that can't even accurately identify people in real-world conditions is just crazy 🚨. And now we know that they're using it on US citizens too? That's just not right 👊. I think we need some serious reform and more transparency around how these technologies are being used 💡. We need to make sure our civil liberties aren't being trampled in the name of "security" 🤝. This is a major wake-up call for us all 🚨.
 
I'm not totally sure about this whole thing... I mean, on one hand, it's kinda cool that they're trying to use tech to make immigration ops more efficient 🤖. But on the other hand, I'm all for some serious oversight and transparency when it comes to biometric data and surveillance tools 💡. 100k uses since May is a pretty big number, and if it's struggling with just basic image quality (poor framing, head tilt, etc.), how can we trust it in real-world scenarios? 🤔 It sounds like they're playing with fire here, relying on algorithms that might lead to false positives or wrongful targeting... it's not worth the risk, imo 😬. And what's up with scanning people who are just trying to exercise their rights? 🚫 That's some pretty sketchy stuff right there 👮‍♂️.
 
man I'm so worried about this... the whole thing is just too suspicious 🤔 mobile fortify sounds like something out of a sci-fi movie where they're scanning people's faces left and right without any oversight 📺 it's like they're treating us like suspects or something

I remember when I was traveling in asia and someone scanned my face at a visa port of entry... it was such a weird experience 😳 but at least it was controlled and with some explanation. this mobile fortify thing is just using algorithms to make educated guesses out on the street and leaving it to agents to decide whether the guess is right 🤖 it's like they're playing a high-stakes game of "guess who"

and what really freaks me out is that the government isn't being transparent about how this tech is being used... or even what the rules are 🙅‍♂️ I mean, i get it, security is important but we need to make sure our rights aren't being trampled in the process too 🤝
 
😒 I think we're playing right into the hands of tech giants here... Facial recognition apps like Mobile Fortify might seem convenient for law enforcement, but it's just a slippery slope to mass surveillance 🚨. We need more transparency and oversight, not less 🤔. The fact that agents are scanning faces without consent is just not cool 🙅‍♂️. What's next? Facial recognition in everyday life, like on the street or in stores? No thanks 👎.
 
🤔 I'm getting super uneasy about this Mobile Fortify app being used by the DHS... it's like they're just winging it with no real checks on how it works and who gets scanned. 🚨 I mean, we know it's not perfect, but 100k+ uses already? That's a lot of people who could be misidentified or wrongly targeted. And what about consent? 🤷‍♀️ They're basically saying "we'll scan your face and add you to the database without asking". Like, no thanks! 😒
 
omg this is soooo concerning 🤯 the fact that they're deploying this facial recog app in immigration lanes is giving me the heebie-jeebies 💀 and it's not even verifying identities but just providing matches for humans to confirm? that's, like, super unreliable 🤦‍♀️ i mean, what if someone gets a bad angle or has poor lighting on their face? or what about those who don't want to be scanned but are detained anyway? it's all so shady 😏 and the lack of transparency from DHS is, like, totally unacceptable 🙅‍♂️
 
I'm so worried about this 🤕 Facial recognition tech is already sketchy enough, but the fact that it's being used without proper consent from citizens... that just makes me really uncomfortable 😬 And the fact that people can be misidentified or matched to someone else's record when they aren't even in the database themselves? That's like something out of a sci-fi movie 🚀 How are we supposed to trust these algorithms when they're not transparent about how they work?

And what really gets my blood boiling is the lack of accountability from DHS 🤯 They're just shrugging it off while building up what Senator Markey calls an "arsenal of surveillance technologies"... but at what cost? Our freedoms, our anonymity... are we really willing to trade that for a few extra security measures? 🤔 I don't think so.

I know we need to stay safe, but this feels like an overreach by the government 🙅‍♂️ Can't they see how invasive and damaging these tools can be? We need to have open conversations about this stuff, not just sweep it under the rug. 💬
 
🤔 the more I think about this, the more I'm reminded of how vulnerable we are in our daily lives 🌐. I mean, who hasn't been stopped and searched or asked for their ID at some point? It's a slippery slope when it comes to consent and transparency 🙅‍♂️. The fact that they're using facial recognition tech without explicit consent from citizens is just mind-boggling 😱. And the more I read about this, the more I wonder if we're trading our freedom for convenience 💻.

I think what bothers me most is that there's no real accountability here 👮‍♂️. DHS officials are saying it's all about speed and scale, but at what cost? 🤯 We need to have a conversation about what kind of society we want to live in 🗣️. Do we value security over our personal liberties? Can we really trust these algorithms to get it right? 💡 I'm not sure the answer is clear-cut 😕.

It's like, when are we going to learn from our mistakes? 🤦‍♂️ We've seen this play out before in history – from the Red Scare to the Patriot Act. It's always about control and power 👑. But what if we're missing something more important? What if our biggest threat isn't terrorism or crime, but our own willingness to surrender our individuality? 🤔
 
I'm so worried about this new app 😱. I mean, think about it... they're using facial recognition to identify people during federal operations? That's just creepy 🤯. And the fact that it doesn't actually verify identities but just gives candidate matches is super sketchy 🚫. I don't know how many times I've seen those old "Wanted" posters with pictures of people and their names, but this app is like something out of a sci-fi movie 🎥.

And have you heard that it's been used to scan the faces of people who were just minding their own business? Like, innocent bystanders being recorded without consent? That's just not right 👊. I know we're living in times where surveillance is becoming more common, but this takes it to a whole new level 🚨.

I'm with Senator Ed Markey on this one - it does seem like an "arsenal of surveillance technologies" and it gives me the heebie-jeebies 😳. We need some serious safeguards in place before we start relying on these kinds of tools. Our civil liberties are worth protecting, you know? 🙏
 
IT'S WILD HOW MUCH TRUST WE'RE SUPPOSED TO HAVE IN THESE FACIAL RECOGNITION APPS! I MEAN, 100K USES SINCE MAY 2025 IS JUST CRAZY AND NOT EVEN A FULL YEAR IN! IT'S LIKE THEY WANT US TO BELIEVE THIS IS PERFECTLY SAFE AND SECURE WHEN REALLY IT'S JUST A TOOL FOR MASS SURVEILLANCE. AND THE WORST PART IS, IT'S GETTING USED ON CITIZENS WHO AREN'T EVEN SUSPECTED OF DOING ANYTHING WRONG!

AND CAN WE TALK ABOUT HOW INACCURATE THIS TECHNOLOGY IS?! I'VE SEEN STORIES WHERE PEOPLE HAVE HAD THEIR FACES RECOGNIZED INCORRECTLY BECAUSE THEY HAD A POOR HEAD TILT OR THE LIGHTING WAS OFF. IT'S LIKE, COME ON, CAN'T WE GET SOMETHING BETTER THAN THIS?!

I FEEL LIKE OUR LEADERS ARE JUST PASSING THE BUCK WHEN IT COMES TO EXPLAINING HOW THESE APPS WORK AND WHO'S ALLOWED TO USE THEM. IT'S LIKE THEY WANT US TO JUST TRUST THEM WITHOUT ASKING QUESTIONS. WELL, I'M NOT buying it!
 
🤣 I mean, what's next? A government ID app that can scan your mood 🤪? But seriously, this whole thing is a bit of a mess. Like, who thought it was a good idea to give agents carte blanche to scan people's faces without consent? It's like they're trying to create a digital version of the Sopranos' "fuggetaboutit" 😂.

And don't even get me started on the whole "scale under imperfect conditions" thing. That just sounds like code for "we'll get it right... eventually?" 🤦‍♂️. I mean, come on, if they can't trust their own technology to give accurate results in real-world settings, how are we supposed to trust them with our biometric data? Not to mention the whole issue of false positives and a lack of transparency... it's like they're trying to solve a mystery movie plot 🎃.

But hey, at least it's not just us immigrants who are getting caught up in this facial recognition madness 👀. It seems like anyone can become a "person of interest" if the right agents come knocking 😳. I guess that's what we get when you mix government surveillance with Silicon Valley tech 🤖... not exactly the kind of innovation we need, if you ask me 😒.
 
🤔 I'm not buying the 'imperfect conditions' excuse for this facial recognition app. Like, what's next? Deploying surveillance drones over schools or something? 🚁 The fact that it's been used on people who aren't even targets but were just in the wrong place at the wrong time is super sketchy. And the lack of transparency from DHS is wild – like, how can you expect people to trust an app when they don't know what's going on behind the scenes? 🤫 It's all about 'scale' and 'speed' for these folks, but I'd rather have accurate info than some 'efficient' solution that just invites mistakes. 😒
 
I'm not sure I buy the whole 'operating at scale under imperfect conditions' narrative 🤔. This tech is being used to scan faces of people who might be US citizens, and even those who aren't doing anything wrong. It's just too convenient for law enforcement agencies to have this kind of data lying around without proper oversight 👮‍♂️. And what about the false positives? I mean, you can't just chalk it up to 'imperfect conditions' when innocent people are being wrongly identified and added to a database 🤦‍♂️. We need more transparency and accountability here, not just a band-aid solution 📝.
 
I'm getting really uneasy about this Mobile Fortify app 🤖. I mean, who needs consent from citizens when you're taking their face scan? It's like they're saying 'trust us' without even asking 😒. And what's up with the 100k+ times it's been used already? That's a lot of potential for mistakes or false positives... not to mention the lack of transparency from DHS 🤐. I get that speed and scale are important, but at what cost to our civil liberties? 🚨 It's like they're playing with fire, using this tech without proper safeguards in place 🔥. We need more oversight and accountability here 👮‍♀️
 
omg is this real?? 🤯 they're using this app on ppl just walking down the street? that's like something out of a movie! i dont get why they need to scan faces all the time, cant they just ask ppl for id or somethin? and whats with all these false positives? that sounds so unfair 😩 also what if someone takes a selfie with their phone and its not them in it? 📸🤔
 