The FBI Wants AI Surveillance Drones With Facial Recognition

The FBI is looking for ways to incorporate artificial intelligence into drones, according to federal procurement documents.

On Thursday, the FBI put out a call to potential vendors of AI and machine learning technology for use in unmanned aerial systems, issuing a so-called “request for information,” in which government agencies ask companies to submit preliminary information ahead of a forthcoming contract opportunity.

The FBI is seeking technology that would allow drones to perform facial recognition, license plate recognition, and weapons detection, among other functions, according to the document.

The pitch from the FBI immediately raised concerns among civil libertarians, who warned that equipping FBI drones with artificial intelligence could deepen the chilling effect of surveillance on activities protected by the First Amendment.

“By their very nature, these technologies are not built to spy on a specific person who is under criminal investigation,” said Matthew Guariglia, a policy analyst at the Electronic Frontier Foundation. “They are built to do indiscriminate mass surveillance of all people, leaving people that are politically involved and marginalized even more vulnerable to state harassment.”

The FBI did not immediately respond to a request for comment.

Law enforcement agencies at local, state, and federal levels have increasingly turned to drone technology in efforts to combat crime, respond to emergencies, and patrol areas along the border.

The use of drones to surveil protesters and others taking part in activities ostensibly protected under the Constitution frequently raises concerns.

In New York City, the use of drones by the New York Police Department soared in recent years, with little oversight to ensure that their use falls within constitutional limits, according to a report released this week by the Surveillance Technology Oversight Project.

In May 2020, as protests raged in Minneapolis over the murder of George Floyd, the Department of Homeland Security deployed unmanned vehicles to record footage of protesters and later expanded drone surveillance to at least 15 cities, according to the New York Times. When protests spread, the U.S. Marshals Service also used drones to surveil protesters in Washington, D.C., according to documents obtained by The Intercept in 2021.

“Technically speaking, police are not supposed to conduct surveillance of people based solely on their legal political activities, including attending protests,” Guariglia said, “but as we have seen, police and the federal government have always been willing to ignore that.”

“One of our biggest fears in the emergence of this technology has been that police will be able to fly a face recognition drone over a protest and in a few passes have a list of everyone who attended. It’s essentially technology tailor-made for political retribution and harassment,” he said.

In addition to the First Amendment concerns, the use of AI-enabled drones to identify weapons could escalate standoffs between police and civilians and other delicate situations. In that scenario, the danger would come not from the effectiveness of the AI but from its limitations, Guariglia said. Government agencies, including school districts, have forked over cash to companies running AI weapons detection systems, one of the specific uses cited in the FBI’s request for information, but the products have been riddled with problems and dogged by criticism of their ineffectiveness.

“No company has yet proven that AI firearm detection is a viable technology,” Guariglia told The Intercept. “On a drone whirling around the sky at an awkward angle, I would be even more nervous that armed police will respond quickly and violently to what would obviously be false reports of a detected weapon.”
