An increasing number of police departments across the country are requiring officers to wear body cameras in an attempt to improve officer accountability. Bodycams have already succeeded in providing evidence in police-involved shootings, and cops have even been caught committing crimes on their own cameras.
Taser, one of the primary manufacturers of police body cameras, recently announced that it had acquired the artificial intelligence startup Dextro Inc. The company's self-titled software is a video analysis tool that can be trained to recognize objects in camera footage. Once it has identified objects or movements in a video, it creates a timeline of when they appear in the footage.
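Conceptually, that timeline step amounts to collapsing per-frame detections into intervals of when each object is on screen. A minimal sketch of the idea (the input format, labels, and function name here are hypothetical illustrations, not Dextro's actual software):

```python
def detections_to_timeline(frame_detections, fps=30.0):
    """Collapse per-frame object labels into (label, start_s, end_s) intervals.

    frame_detections: a list where index = frame number and each entry is
    the set of labels detected in that frame (a hypothetical input format).
    """
    intervals = []
    open_spans = {}  # label -> frame index where its current run began
    for frame, labels in enumerate(frame_detections):
        # Open a span for any label that just appeared.
        for label in labels:
            open_spans.setdefault(label, frame)
        # Close spans for labels that just disappeared.
        for label in list(open_spans):
            if label not in labels:
                start = open_spans.pop(label)
                intervals.append((label, start / fps, frame / fps))
    # Close any spans still open at the end of the clip.
    for label, start in open_spans.items():
        intervals.append((label, start / fps, len(frame_detections) / fps))
    return sorted(intervals)

# Toy example at 1 frame per second: a person is visible throughout,
# a phone appears briefly in the second frame.
frames = [{"person"}, {"person", "phone"}, {"person"}, set()]
timeline = detections_to_timeline(frames, fps=1.0)
# timeline: [('person', 0.0, 3.0), ('phone', 1.0, 2.0)]
```

The output intervals are what make the footage searchable: a query for "phone" can jump straight to the seconds where one was detected.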
When police release bodycam videos to the public, all faces and visible tattoos must be blurred. That is a time-consuming process when done by hand, but one the new technology can dramatically speed up. Dextro also allows police to pinpoint the exact times that objects appear in videos. For example, in a situation where a cop shoots someone after mistaking a toy or phone for a gun, the software can identify the exact moment that mistake was made.
While this all sounds well and good, activists are concerned that Dextro’s technology is ripe for abuse as a mass surveillance tool. “This gives police departments as a whole massive search capabilities that could be used to turn these tools into surveillance cameras,” said Jay Stanley, senior policy analyst with the ACLU. “Police body cameras capture a lot of video about a lot of people that’s not in the public interest...and for privacy reasons should not be indexed in the way this AI proposes to do so.”
The Dextro software would essentially allow a police department to turn its entire backlog of bodycam footage into a searchable database. Anyone who happens to be caught on a cop's camera, even someone completely innocent of any crime, could become part of that database. Republican lawmakers are pushing legislation in 18 states that would criminalize certain forms of protest, and this technology could allow police to retroactively identify faces in a crowd.
“Taxpayers wanted an accountability tool, not a surveillance tool,” said Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law.