
The Tech Company Bringing Surveillance Dystopia to Your Town

Flock claims that its technology will literally eliminate crime. But what it’s unleashing may be just as insidious.

Photo illustration of the Flock Safety logo displayed on a mobile phone screen. Osmancan Gurdogan/Getty Images

The North Texas law enforcement officers knew right away there was something off about the boyfriend’s request that they search for his girlfriend. She’d taken abortion medication two weeks before, but he was only now concerned for her safety. He explained he’d needed “to process the event.” The district attorney had told them there was nothing with which they could charge her. (While selling abortion pills is illegal in Texas, taking them isn’t.) The police ran the search anyway, using surveillance tools from a company called Flock Safety.

In the required “reason” section of the search form, a sergeant entered “had an abortion.”

When the woman walked into the sheriff’s office a week later, detectives recognized her name. They assumed she was there “to tell her side of the story” and asked her to write a timeline of the events. Only after interviewing her did officers learn something that cast her boyfriend’s request in a new light. She was there not to defend her actions or explain herself but to report the boyfriend for assaulting her less than an hour after she had taken the medication: He “choked her, put a gun to her head, and made her beg for her life.”

The assault report was only uncovered after initial coverage of the event; before that happened, Johnson County Sheriff Adam King attempted a spin containing a pitiful grain of truth: “Her family was worried about her,” he told 404 Media. If he meant her parents, and not her piece of shit partner, well, they should have been. They probably should be still.

But amid this sordid story, it was another party to these events that raised eyebrows. Asked about the apparent misuse of Flock technology to find this woman, Flock Safety CEO Garrett Langley told Forbes, “When I look at [the Johnson County case], everything is working as it should be.”

The line between personal vengeance and public power, always thin, is being methodically erased by market forces, resurgent bigotry—and new and unaccountable surveillance tools brought to us by our oleaginous friends in the tech industry. The Torment Nexus is here, and it’s attached to a telephone pole in your neighborhood.

We already know that the weaponization of government can be a lot more intimate than Trump turning the Justice Department into a gangster-style enforcement cabal, and it’s not confined to single bad actors. Johnson County was not the first place where the tools of law enforcement have been put to use in service of intimate partner violence; it wasn’t even the most direct example of how that can happen: Flock technology and other surveillance networks have been used by stalkers who are themselves law enforcement officers.

But this case—and Flock’s involvement, in particular—illustrates an expansion of the horrifying capacity to meld personal obsessions and the state’s interest at a systemic level. In big and small towns across the country (upward of 6,000), police departments are adopting Flock’s growing arsenal and, more importantly, buying into Flock’s stated ambition: “Our mission is to eliminate crime. Full stop.” As Forbes put it, Langley “is convinced that America can and should be a place where everyone feels safe.” Safety is how the powerful sell control.

In interviews, Langley is somewhat less declarative about the company’s goals, but what’s telling is how often he and the technology reporters writing about him center their discussion on ending “crime” and not, say, “violence” or even “breaking the law.”

The company seeks to do this from the bottom up, marketing its tools as aids to those who have aggressive designs on cracking down on crime but lack the personnel: not just smaller police departments but homeowners associations; community organizations; schools K through college; retailers such as Simon Properties, Home Depot, and Lowe’s; corporations; and as of this month, individual owners of Ring doorbell cameras.

Flock bills itself as “the largest [relative to what, it doesn’t say] public-private safety network,” boasting, “Our intelligent platform unites communities, businesses, schools and law enforcement, combining their power to solve and deter crime together.” Flock’s pitch is a touch quaint compared to the howling brutality of the administration itself. Its framing works for suburbanites turned off by outright thuggery because it turns anxiety into authority, makes paternalism feel like public service, and dresses threat in the uniform of care.

There’s an obvious violation of intimacy when an abuser or assaulter uses Flock to track his victim. This contagion-like spread of the panopticon through the very institutions we like to believe are refuges (I’ll even count the mall) is a violation, as well, no less triggering because it’s a neighbor, the corner store, and your local cops—and not the Biggest Brother of them all.

But “crime” is a loose idea; “ending” it is a totalizing vision, necessarily slippery and dangerously oblique. That vagueness is baked into Flock’s software, which until recently required search operators only to input a “reason” for the search but didn’t demand evidence that a law had been broken, or even that a violation was suspected. Just this week, it announced that searches must include one of the FBI’s “offense categories” to proceed, as well as a case number. That doesn’t provide a bulwark against would-be stalkers: One of those categories is “missing person.”

Deliberate ambiguity about “crime” and what justifies pursuit is what allowed the Johnson County detectives to pursue a woman they well knew could not be charged with a crime. They knew she did something.

Beyond the controversy around the Texas self-managed abortion case, Flock has had to respond to evidence that local law enforcement agencies have used their data to assist Immigration and Customs Enforcement. It has now offered assurances that jurisdictions proactively banning data sharing related to immigration status or abortion seeking will be excluded from national searches, as long as the local yahoo with tactical undershorts is dumb enough to put “ICE” or “abortion” in the required reason field.

But it turns out that once you’ve built a massive distributed surveillance network, it’s hard to rein in its use. The state of Washington explicitly bans sharing data or equipment with federal officers for the purpose of immigration enforcement, yet the University of Washington found dozens of examples of exactly that. Some local departments explicitly opened up their Flock data to the feds despite the state law; others had their information siphoned off without their knowledge via an unspecified technological error.

The university study and an investigation by 404 Media found another category of information sharing that also subverted state attempts to fend off immigration overreach: federal officers just asking really nice if the local guy could run a search on their behalf and the local guy happened to use “ICE” or “ICE warrant” or “illegal immigration” in the local search (tactical undies recognizes tactical undies, you know?). Worth noting: A local officer well informed about jurisdictional data-sharing limitations would just not enter “ICE” as the reason for the search, and we have no idea how many of those cannier cops there are.

Already terrified? It gets worse: Flock is turning over more and more of its monitoring to AI, a feature that Flock (and the entire technology-media industrial complex) sells as a neutral efficiency. But the problem with AI is how deeply human it really is—trained on biased data, it can only replicate and amplify what it already knows. Misogyny and white supremacy are built into surveillance DNA, and using it to search for women seeking abortions or any other suspected “criminal” can only make the echo chamber more intense.

This month, an AI-powered security system (not Flock, surprisingly) tossed out an alarm to a school resource officer, and he called the police to the scene of a Black teenager eating chips. The teen described “eight cop cars that came pulling up to us [and] they started walking toward me with guns.” You can fault the resource officer for not clocking the chip bag; at least we know the point of failure.

Now, imagine that situation using Flock’s new “aerodrone” automated security product, designed to work with almost no oversight: It “integrates with your sensors and workflows to deliver real-time intelligence across massive footprints—all at roughly the cost of a single guard.” The same potentially fatal false alarms will occur, with fewer and fewer touches of a human hand to even pause the system.

AI pattern matching is a black box. You can’t know how a decision is made until it’s too late to unmake it. A private surveillance firm plugging AI into policing doesn’t democratize safety or create objectivity; it accelerates suspicion based on existing grievances.

Except when it’s designed to suspect nothing. Flock’s response to controversies about privacy has included supposed “transparency” features, as well as tools that it claims will enable “public audits” of searches and results. And if your small police department that’s turned to Flock as a “force multiplier” doesn’t have the staff to run audits? No worries: “To support agencies with limited resources for audit monitoring, we are developing a new AI-based tool.… This tool will help agencies maintain transparency and accountability at scale.” Using an AI to monitor an AI is a level of absurdity Philip K. Dick never quite got to. Maybe someone can write a ChatGPT prompt for a novel in his style.

I think Dick would recognize another irony: AIs surveilling AIs surveilling us sounds like a dispassionate threat from without, but the ghost in the machine is that we cannot scrub away the passions and resentments that incite the obsession to begin with. The paternalism that launches the drone for our good doesn’t curb the risk that something will go wrong. When you use sophisticated technology to pursue vengeance, you are not elevating the action to a cause. Involving an AI doesn’t make violence an abstraction. An automated vigilante isn’t impersonal, just efficient.