
Should law enforcement be using AI and cell phone data to find rioters?

The attack on the Capitol forces us to confront an existential question about privacy.

Supporters of US President Donald Trump enter the US Capitol's Rotunda on January 6, 2021, in Washington, DC. Credit: Saul Loeb/AFP via Getty Images
Key Takeaways
  • The insurrection attempt at the Capitol was captured by thousands of cell phones and security cameras.
  • Many protestors have been arrested after their identities were reported to the FBI.
  • Surveillance experts warn about the dangers of using facial recognition to monitor protests.

If ever there were a reason to wear masks, the insurrection at the Capitol last week would have been it. But many of those present believed the anti-mask rhetoric being used as a distraction from the nation’s skyrocketing death rate. In fact, the day might even prove to have been a superspreader event, with at least two congresspeople becoming infected after the siege.

Those involved in the attempted coup d’état were not concerned about a virus. Nor, apparently, were they worried about shielding themselves from the tens of thousands of hours of video recorded by thousands of phones. In a strange merging of social media and dark web chat rooms come to life, separating actual insurrectionists from revolutionary tourists could prove to be a cumbersome task. One thing is certain: identifying them is not difficult.

Instagram-worthy sieges bring us to a longstanding existential question: should law enforcement be allowed to use AI and cell phone data to prosecute offenders?

Of the many security failures that day, one stood out: the small number of arrests for a breach of outsized magnitude. As the nation gawked at an unemployed actor turned conspiracy shaman behind the speaker’s chair in real-time, scenes of horrendous violence took hours, even days, to be released. In a game of seemingly futile catch-up, federal agencies opened tip lines to identify insurrectionists who should have easily been within their grasp.

But the public responded.

Video: Brad Templeton: Today’s Surveillance Society is Beyond Orwellian (www.youtube.com)

There’s the ex-wife of a retired Air Force lieutenant colonel whose neck gaiter was pulled down; the patriotic cohort of Internet detectives crowd-sourcing information for the FBI; the director of the infamous pseudoscience film, “Plandemic,” praising the “patriots” that breached the building moments after he left the siege himself; and that unemployed actor who regularly attended QAnon events leaving the most public trail imaginable, and who is currently in custody facing serious charges.

Fish in barrels, all of them. What of the remaining thousands?

This privacy discussion is not new. Arthur Holland Michel, founder and co-director of the Center for the Study of the Drone at Bard College, warned Big Think in 2019 about the dangers of surveillance technology—specifically, in this case, a camera known as Gorgon Stare.

“Say there is a big public protest. With this camera, you can follow thousands of protesters back to their homes. Now you have a list of the home addresses of all the people involved in a political movement. If on their way home you witness them committing some crime—breaking a traffic regulation or frequenting a location that is known to be involved in the drug trade—you can use that surveillance data against them to essentially shut them up. That’s why we have laws that prevent the use of surveillance technologies because it is human instinct to abuse them. That’s why we need controls.”

Late last year, University of Miami students pushed back against school administrators using facial recognition software for potentially insidious ends—a protest not limited to that campus. Can you equate students refusing to attend classes during a pandemic with armed insurrectionists attempting to overturn the results of a democratic election? Not even close. More to the point, however, we should leave political leanings out of the equation when deciding who we think should be monitored.

Protesters enter the U.S. Capitol Building on January 06, 2021 in Washington, DC. Congress held a joint session today to ratify President-elect Joe Biden’s 306-232 Electoral College win over President Donald Trump. Credit: Win McNamee/Getty Images

Shortly after the siege, the New Yorker’s Ronan Farrow helped reveal the identity of the aforementioned lieutenant colonel, while conservatives claimed the rioters were actually antifa—a conspiracy theory that’s been peddled before. Politics simply can’t be avoided in this age. Still, Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, doesn’t believe the insurrection attempt justifies an uptick in facial recognition technology.

“We don’t need a cutting-edge surveillance dragnet to find the perpetrators of this attack: They tracked themselves. They livestreamed their felonies from the halls of Congress, recording each crime in full HD. We don’t need facial recognition, geofences, and cell tower data to find those responsible, we need police officers willing to do their job.”


The New Orleans City Council recently banned similar surveillance technologies due to fears that they would unfairly target minorities. San Francisco was the first city to outright ban facial recognition nearly two years ago. Cahn’s point is that the FBI shouldn’t be using AI to cover for the government’s failure to protect the Capitol. Besides, the insurrectionists outed themselves on their own social media feeds.

When Pandora’s box cracks open, it’s hard to push the monster back in. Naomi Klein detailed the corporate takeover of New Orleans after Hurricane Katrina in “The Shock Doctrine.” Real estate brokers, charter school companies, and government agencies didn’t cause the flood, but they certainly profited from it. The fear is that companies like Clearview AI, which saw a 26 percent spike in usage of its facial recognition service following the attack, will be incentivized, as will police departments, to use such technology for whatever ends they choose.

Cahn comes to a similar conclusion: don’t expose American citizens to the “anti-democratic technology” known as facial recognition. New Yorkers had to endure subway backpack checks for nearly a decade after 9/11; this slope is even slipperier.

As the US braces for further “armed protests” in all 50 states over the coming week, phones need to keep capturing footage. Bystanders need to remain safe, of course. But if last week was any indication, the insurrectionists have difficulty distinguishing between social media and real life. Their feeds should reveal enough.

Stay in touch with Derek on Twitter and Facebook. His most recent book is “Hero’s Dose: The Case For Psychedelics in Ritual and Therapy.”

