Axon, America’s largest manufacturer of police body cameras, may add facial recognition to its cameras, a move that has sparked concern among civil libertarians.
The company, formerly known as Taser, is an industry leader in electroshock weapons and body cameras, with more than 18,000 law enforcement customers in more than 100 countries. Within the United States, 38 out of the 68 major law enforcement agencies have purchased Axon body cameras.
On Thursday, Axon announced that it has created an artificial intelligence ethics board dedicated to developing responsible AI technologies. The board will meet twice a year to “guide the development of Axon’s AI-powered devices and services,” with a focus on how the products will impact communities. While the announcement did not directly mention facial recognition, company founder Rick Smith tells The Washington Post that such technologies are “under active consideration.”
Body cameras with facial recognition could identify individuals in real time using biometric data such as facial features, retina scans, and other identifiers. Every person an officer encounters could be flagged and identified. Artificial intelligence would make sense of the footage, tracking suspects, scanning for wanted individuals, and identifying people onscreen.
While acknowledging the technology’s potential for “bias and misuse,” Smith argues that the tech’s benefits cannot be ignored. “I don’t think it’s an optimal solution, the world we’re in today, that catching dangerous people should just be left up to random chance, or expecting police officers to remember who they’re looking for,” he says in the Post. “It would be both naive and counterproductive to say law enforcement shouldn’t have these new technologies. They’re going to, and I think they’re going to need them. We can’t have police in the 2020s policing with technologies from the 1990s.”
Cameras with real-time facial recognition are already being used in China and the U.K.
Currently, photographs of 117 million Americans—nearly half of the country’s adult population—are stored in a facial recognition database that can be accessed by the FBI.
According to a report from the Government Accountability Office, the FBI’s use of facial recognition technology has scant oversight, and the bureau does little to test for false positives or racial bias when searching for suspects, even though facial recognition systems often misidentify members of some ethnic groups at higher rates.
Forty-two groups, including the American Civil Liberties Union, the National Association for the Advancement of Colored People, and the National Urban League, raised these concerns in an open letter to Axon’s ethics board, calling body cameras with real-time facial recognition “categorically unethical to deploy”:
Axon has a responsibility to ensure that its present and future products, including AI-based products, don’t drive unfair or unethical outcomes or amplify racial inequities in policing. Axon acknowledges this responsibility—the company states that it “fully recognize[s] the complexities and sensitivities around technology in law enforcement, and [is] committed to getting it right.”
Certain products are categorically unethical to deploy. Chief among these is real-time face recognition analysis of live video captured by body-worn cameras. Axon must not offer or enable this feature. Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests…. Real-time face recognition could also prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires. No policy or safeguard can mitigate these risks sufficiently well for real-time face recognition ever to be marketable.