San Francisco police can now use robots to kill

Last week, we talked about killer robots. That piece was inspired by a proposal that would allow San Francisco police to use robots for killing “when risk of loss of life to members of the public or officers is imminent and outweighs any other force option available to SFPD.” Last night, that proposal passed the city’s board of supervisors with an 8-3 vote.

The language was included in a new “Law Enforcement Equipment Policy” filed by the San Francisco Police Department in response to California Assembly Bill 481, which requires a written inventory of the military equipment utilized by law enforcement. The document submitted to the board of supervisors includes -- among other things -- the Lenco BearCat armored vehicle, flash-bang grenades and 15 submachine guns.

The inventory also names 17 robots owned by the SFPD -- 12 of which are fully functioning. None are designed specifically for killing. They’re mostly used to detect and dispose of bombs -- something police departments have been doing for years. The language included in the proposal effectively allows for these -- or other -- robots to kill in order to save the lives of officers or the public.

As we noted last week, the proposal seems to fit the definition of “justified” deadly force. Police in the U.S. are authorized to shoot when a situation meets a number of criteria, including self-defense and cases where others are facing death or serious bodily harm. A robot is not a gun, of course (though we are now aware of robots that sport guns), but the 8-3 vote effectively approves the weaponization of robots in these sorts of cases.

“Robots equipped in this manner would only be used in extreme circumstances to save or prevent further loss of innocent lives,” Allison Maxie, a spokesperson for the SFPD, said in a statement. Maxie added that robots could be armed with explosives, “to contact, incapacitate, or disorient violent, armed or dangerous suspect.”

Such applications certainly appear to run counter to the purpose for which these robots were both built and acquired. There is precedent for this, however. In July 2016, the Dallas Police Department killed a suspect using a robot armed with a bomb for what’s believed to be the first time in U.S. history. “We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was,” police chief David Brown told the press after the incident.

As more robots are developed for military applications, it’s easy to see how such language could open the door to the acquisition of systems that are weaponized out of the box. Military equipment has become commonplace in U.S. police departments in the wake of the National Defense Authorization Act for Fiscal Year 1997. Section 1033 of the bill allows for the military’s “transfer of excess personal property to support law enforcement activities” for the sake of drug enforcement. Maxie says the SFPD currently has no plans to put guns on robots.

Last year, the Electronic Frontier Foundation warned of “mission creep” with regard to the use of armed robots, noting:

Time and time again, technologies given to police to use only in the most extreme circumstances make their way onto streets during protests or to respond to petty crime. For example, cell site simulators (often called “Stingrays”) were developed for use in foreign battlefields, brought home in the name of fighting “terrorism,” then used by law enforcement to catch immigrants and a man who stole $57 worth of food. Likewise, police have targeted BLM protesters with face surveillance and Amazon Ring doorbell cameras.

The proposal’s approval appears to run counter to San Francisco’s image as one of America’s most liberal cities. The debate around the issue was lively, running more than two hours. It comes at a time when many left-leaning politicians are concerned about appearing antagonistic toward police.

“I think there’s larger questions raised when progressives and progressive policies start looking to the public like they are anti-police,” board member Rafael Mandelman noted during the meeting. “I think that is bad for progressives. I think it’s bad for this Board of Supervisors. I think it’s bad for Democrats nationally.”

SF’s Board of Supervisors Rules Committee chair Aaron Peskin had previously attempted to insert language condemning the use of robots for killing. The line, “Robots shall not be used as a Use of Force against any person” was reportedly crossed out by the SFPD.

Last month, Oakland fought a similar battle across the bay, and its debate ended differently. Following public backlash, the police department wrote:

The Oakland Police Department (OPD) is not adding armed remote vehicles to the department. OPD did take part in ad hoc committee discussions with the Oakland Police Commission and community members to explore all possible uses for the vehicle. However, after further discussions with the Chief and the Executive Team, the department decided it no longer wanted to explore that particular option.

San Francisco Board of Supervisors President Shamann Walton used the city’s own debate to warn of the impact such a proposal would have on people of color. “We continuously are being asked to do things in the name of increasing weaponry and opportunities for negative interaction between the police department and people of color,” he noted during the meeting. “This is just one of those things.”