Examining the Presence of Gender Bias in Customer Reviews Using Word Embedding
32 Pages Posted: 24 Feb 2019
Date Written: January 31, 2019
Abstract
Humans have entered the age of algorithms. Every minute, algorithms shape countless preferences, from suggesting a product to recommending a potential life partner. In the marketplace, algorithms are trained to learn consumer preferences from customer reviews because user-generated reviews are considered the voice of the customer and a valuable source of information for firms. Insights mined from reviews play an indispensable role in several business activities, ranging from product recommendations and targeted advertising to promotions and segmentation. In this research, we question whether reviews might hold stereotypic gender bias that algorithms learn and propagate. Utilizing data from millions of observations and a word embedding approach, GloVe, we show that algorithms designed to learn from human language output also learn gender bias. We also examine why such biases occur: whether the bias is caused by a negative bias against females or a positive bias for males. We examine the impact of gender bias in reviews on choice and conclude with policy implications for female consumers, especially when they are unaware of the bias, and the ethical implications for firms.
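To illustrate the kind of measurement the abstract describes, the sketch below computes a simple gender-bias score as the projection of a word vector onto a "he minus she" gender direction. The four-dimensional vectors and the word list are hypothetical toy values for illustration only; the paper itself trains GloVe embeddings on millions of review observations.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical toy embeddings; real work would use GloVe vectors
# trained on the review corpus, not these hand-picked values.
emb = {
    "he":        np.array([ 0.9, 0.1, 0.0, 0.2]),
    "she":       np.array([-0.9, 0.1, 0.0, 0.2]),
    "brilliant": np.array([ 0.6, 0.7, 0.1, 0.0]),
    "helpful":   np.array([-0.5, 0.6, 0.2, 0.1]),
}

# Gender direction: difference between the male and female anchor words.
gender_dir = emb["he"] - emb["she"]

def gender_bias(word):
    """Positive score -> word leans toward 'he';
    negative score -> word leans toward 'she'."""
    return cosine(emb[word], gender_dir)

for w in ("brilliant", "helpful"):
    print(w, round(gender_bias(w), 3))
```

With these toy vectors, "brilliant" projects toward the male pole and "helpful" toward the female pole, which is the shape of stereotypic association the paper tests for at scale.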
Keywords: Gender Bias, Natural Language Processing, Customer Reviews, Text Analysis, Word Embedding