Auditing Human-Machine Communication Systems Using Simulated Humans
Handbook of Human-Machine Communication
20 Pages
Posted: 2 Sep 2021
Date Written: August 31, 2021
Abstract
Audit methods have emerged as a powerful approach for describing and understanding the outputs of algorithmic systems. In this chapter we examine the methodological considerations and limitations associated with simulating humans in order to systematically audit the behavior of Human-Machine Communication (HMC) systems. We first detail two recent audits we undertook—one of the Amazon Alexa voice assistant and the other of the Twitter algorithmic timeline—in which we developed techniques for simulating users to systematically gather data and assess various aspects of each system. We then elaborate on methodological considerations relating to system access, audit scope, simulation fidelity, and ethical issues, drawing on our case studies and other experiences to offer concrete illustrations and advice for researchers interested in applying audit methods to HMC systems. New opportunities for methodological development are also discussed.
Keywords: Auditing, Algorithm Auditing, Algorithmic Curation, Twitter, Alexa, Simulation, Algorithms