'A Certain Dangerous Engine': Private Security Robots, Artificial Intelligence, and Deadly Force

17 Pages · Posted: 6 Oct 2017


Elizabeth E. Joh

University of California, Davis - School of Law

Date Written: October 5, 2017

Abstract

Robots equipped with artificial intelligence will transform existing notions of work in fields as diverse as fast food, health care, manufacturing, and the military. The recent use of a remote-controlled robot to deliver lethal force has raised the question of how police might use robots to supplement or replace existing police work. The same questions apply equally to private individuals who will want security robots, some of which are already in development or available for lease. What about a future in which people employ autonomous and lethally armed security robots to protect their homes and themselves? How would courts characterize security robots? One possibility involves spring guns. While spring guns may not ultimately be well suited as an analogy for security robots, many of the questions considered in the spring gun cases may form the beginnings of the framework that courts will have to develop as robots enter the security business.

Keywords: Police, Robots, Artificial Intelligence, Robotics, Big Data, Criminal Justice, Fourth Amendment, Algorithm, Private Security, Private Policing

Suggested Citation

Joh, Elizabeth E., 'A Certain Dangerous Engine': Private Security Robots, Artificial Intelligence, and Deadly Force (October 5, 2017). UC Davis Law Review, forthcoming 2017. Available at SSRN: https://ssrn.com/abstract=3048394

Elizabeth E. Joh (Contact Author)

University of California, Davis - School of Law

400 Mrak Hall Drive
Davis, CA 95616-5201
United States


Paper statistics

Downloads: 449
Abstract Views: 2,941
Rank: 119,245