Quality-Based Pricing for Crowdsourced Workers

46 Pages Posted: 24 Jun 2013

Jing Wang

Hong Kong University of Science & Technology

Panagiotis G. Ipeirotis

New York University - Leonard N. Stern School of Business

Foster Provost

New York University

Date Written: June 2013

Abstract

The emergence of online paid crowdsourcing platforms, such as Amazon Mechanical Turk (AMT), presents huge opportunities to distribute tasks to human workers around the world, on-demand and at scale. In such settings, online workers can come and complete tasks posted by a company, and work for as long or as little as they wish. Given this absolute freedom of choice, crowdsourcing eliminates the overhead of the hiring (and dismissal) process. However, this flexibility introduces a different set of inefficiencies: verifying the quality of every submitted piece of work is an expensive operation, which often requires the same level of effort as performing the task itself. Many research challenges emerge in this paid-crowdsourcing setting. How can we ensure that the submitted work is accurate? How can we estimate the quality of the workers, and the quality of the submitted results? How should we pay online workers who have imperfect quality? We present a comprehensive scheme for managing the quality of crowdsourcing processes. First, we present an algorithm for estimating the quality of the participating workers and, by extension, of the generated data. We show how we can separate systematic worker biases from unrecoverable errors and how to generate an unbiased "worker quality" measurement that can be used to objectively rank workers according to their performance. Next, we describe a pricing scheme that identifies the fair payment level for a worker, adjusting the payment level according to the information contributed by each worker. Our pricing policy, which pays workers based on their expected quality, reservation wage, and expected lifetime, not only estimates the payment level but also accommodates measurement uncertainties and allows the workers to receive a fair wage, even in the presence of temporary incorrect estimations of quality.
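The distinction between systematic bias and unrecoverable error can be illustrated with a small sketch (an illustration of the general idea, not the paper's exact algorithm; the function name, cost matrix, and priors are assumptions). A worker whose confusion matrix is known can have a systematic bias corrected away, so the expected cost of decisions based on their labels can be low even if their raw accuracy is terrible; a worker who answers at random carries no recoverable information:

```python
import numpy as np

def soft_label_cost(confusion, priors, costs):
    """Expected decision cost of a worker's labels after correcting for bias.

    confusion[i, j] = P(worker reports class j | true class i)
    priors[i]       = P(true class i)
    costs[i, k]     = cost of deciding class k when the truth is class i
    """
    n = len(priors)
    total = 0.0
    for j in range(n):  # each label the worker may report
        joint = priors * confusion[:, j]      # P(true = i, reported = j)
        p_report = joint.sum()
        if p_report == 0:
            continue
        posterior = joint / p_report          # P(true = i | reported = j)
        # take the cost-minimizing decision under the posterior
        decision_cost = min(posterior @ costs[:, k] for k in range(n))
        total += p_report * decision_cost
    return total

priors = np.array([0.5, 0.5])
costs = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

flipper = np.array([[0.0, 1.0],
                    [1.0, 0.0]])    # always wrong, but perfectly systematic
random_w = np.array([[0.5, 0.5],
                     [0.5, 0.5]])   # pure noise

print(soft_label_cost(flipper, priors, costs))   # 0.0: bias is recoverable
print(soft_label_cost(random_w, priors, costs))  # 0.5: error is unrecoverable
```

The "flipper" has 0% raw accuracy yet zero expected cost once the bias is inverted, which is why a bias-corrected quality measure ranks workers more fairly than raw agreement rates.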
Our experimental results demonstrate that the proposed pricing strategy performs better than the commonly adopted uniform-pricing strategy. We conclude the paper by describing strategies that build on our quality control and pricing framework to construct crowdsourced tasks of increasing complexity, while still maintaining tight quality control of the process, even when participants of unknown quality are allowed to join.
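The contrast with uniform pricing can be sketched as follows (a minimal illustration under assumed numbers, not the paper's formula: it assumes per-task pay scales linearly with estimated quality, anchored so that a perfect worker earns exactly the reservation wage):

```python
def per_task_payment(quality, reservation_wage, tasks_per_hour):
    """Illustrative quality-based pricing: a worker of estimated quality q
    (in [0, 1]) is paid q times what a perfect worker would earn per task,
    where the perfect worker's hourly earnings equal the reservation wage.
    """
    perfect_rate = reservation_wage / tasks_per_hour
    return perfect_rate * max(0.0, min(1.0, quality))

# A worker estimated at 80% quality, 60 tasks/hour, $6/hour reservation wage:
print(per_task_payment(0.8, 6.0, 60))  # 0.08 dollars per task
```

Under uniform pricing every worker would receive the same per-task amount regardless of quality, which overpays noisy workers and underpays accurate ones; tying the rate to estimated quality removes that distortion.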

Suggested Citation

Wang, Jing and Ipeirotis, Panagiotis G. and Provost, Foster, Quality-Based Pricing for Crowdsourced Workers (June 2013). NYU Working Paper No. 2451/31833, Available at SSRN: https://ssrn.com/abstract=2283000

Jing Wang

Hong Kong University of Science & Technology ( email )

Lee Shau Kee Business Building
Clearwater Bay
Kowloon
Hong Kong

HOME PAGE: http://www.bm.ust.hk/isom/faculty-and-staff/directory/jwang

Panagiotis G. Ipeirotis

New York University - Leonard N. Stern School of Business ( email )

44 West Fourth Street
Ste 8-84
New York, NY 10012
United States
+1-212-998-0803 (Phone)

HOME PAGE: http://www.stern.nyu.edu/~panos

Foster Provost

New York University ( email )

44 West Fourth Street
New York, NY 10012
United States

Paper statistics

Downloads: 305
Abstract Views: 1,699
Rank: 181,711