On the Spillover Effects of Online Product Reviews on Purchases: Evidence from Clickstream Data

Author(s):
Young Kwark
Gene Moo Lee
Paul A. Pavlou
Liangfei Qiu

We study the spillover effects of the online reviews of other covisited products on the purchases of a focal product using clickstream data from a large retailer. The proposed spillover effects are moderated by (a) whether the related (covisited) products are complementary or substitutive, (b) the media channel used (mobile or personal computer (PC)), (c) whether the related products are from the same or a different brand, (d) consumer experience, and (e) the variance of the review ratings. To identify complementary and substitutive products, we develop supervised machine-learning models based on product characteristics, such as product category and brand, and novel text-based similarity measures. We train and validate the machine-learning models using product-pair labels from Amazon Mechanical Turk. Our results show that the mean rating of substitutive (complementary) products has a negative (positive) effect on purchases of the focal product. Interestingly, the magnitude of the spillover effects of the mean ratings of covisited (substitutive and complementary) products is significantly larger than the effect of the focal product's own ratings, especially for complementary products. The spillover effect of ratings is stronger for consumers who use mobile devices rather than PCs. We find that the negative effect of the mean ratings of substitutive products from a different brand on purchases of a focal product is significantly stronger than that of substitutive products from the same brand. Lastly, the effect of the mean ratings is stronger for less experienced consumers and for ratings with lower variance. We discuss implications for leveraging the spillover effects of the online product reviews of related products to encourage online purchases.
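The complement/substitute identification step can be illustrated with a minimal sketch: product pairs carry labels (as the Mechanical Turk workers would assign them), each pair is represented by a text-based title similarity plus same-category and same-brand indicators, and a simple nearest-neighbor rule stands in for the paper's supervised models. All product data, feature choices, and the classifier below are illustrative assumptions, not the authors' actual pipeline.

```python
def jaccard(a, b):
    """Token-overlap similarity between two product titles
    (a toy stand-in for the paper's text-based similarity measures)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def features(p, q):
    """Feature vector for a product pair: title similarity,
    same-category flag, same-brand flag."""
    return (jaccard(p["title"], q["title"]),
            float(p["category"] == q["category"]),
            float(p["brand"] == q["brand"]))

def predict(train, pair_feats):
    """1-nearest-neighbor over labeled pairs (toy stand-in for the
    supervised machine-learning models in the abstract)."""
    def dist(f, g):
        return sum((x - y) ** 2 for x, y in zip(f, g))
    return min(train, key=lambda ex: dist(ex[0], pair_feats))[1]

# Hypothetical products and labeled training pairs (illustrative only).
tv = {"title": "55 inch 4K smart TV", "category": "TV", "brand": "A"}
tv2 = {"title": "55 inch 4K smart TV slim", "category": "TV", "brand": "B"}
mount = {"title": "TV wall mount bracket", "category": "Mounts", "brand": "C"}
train = [(features(tv, tv2), "substitute"),
         (features(tv, mount), "complement")]

tv3 = {"title": "50 inch smart TV", "category": "TV", "brand": "A"}
stand = {"title": "TV stand for living room", "category": "Furniture", "brand": "D"}
print(predict(train, features(tv, tv3)))    # substitute
print(predict(train, features(tv, stand)))  # complement
```

A similar pair, high in title similarity and sharing a category, lands near the "substitute" example; a dissimilar cross-category pair lands near the "complement" example.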


2018
Vol 51 (1-3)
pp. 25-49
Author(s):
Ravi Kumar
Teja Santosh Dandibhotla
Vishnu Vardhan Bulusu

2018
Vol 13 (4)
pp. 192
Author(s):
Li Yang

It is well established that positive online word-of-mouth (WOM) can boost sales and that negative online WOM can harm sales. But does greater positivity or negativity in the text of online product reviews have a greater impact on product sales? This research tackles this overlooked question, and the answer is counter-intuitive: it depends on how positive or negative the reviews are. The results of a two-way fixed-effects panel data analysis, based on data from the tablet market on Amazon and a novel sentiment analysis technique, demonstrate that the most and least polarized online product reviews have no effect on sales; only moderately positive or negative reviews affect sales. These effects can be explained by optimal arousal theory and attribution theory. Based on the findings, three strategies for user-generated content (UGC) management are proposed.
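The two-way fixed-effects design mentioned above can be sketched with the standard within transformation: demean the outcome and the regressor by unit (product) means and period (time) means, add back the grand mean, and regress the residuals. The estimator below is a minimal pure-Python illustration on a synthetic balanced panel; the data and variable names are assumptions for illustration, not the paper's.

```python
def two_way_fe_slope(rows):
    """rows: list of (unit, period, x, y) tuples from a balanced panel.
    Applies the two-way within transformation (subtract unit and period
    means, add back the grand mean) and returns the OLS slope of the
    demeaned y on the demeaned x."""
    def mean(vals):
        return sum(vals) / len(vals)

    gx = mean([x for _, _, x, _ in rows])
    gy = mean([y for _, _, _, y in rows])
    units = {u for u, _, _, _ in rows}
    periods = {t for _, t, _, _ in rows}
    ux = {u: mean([x for u2, _, x, _ in rows if u2 == u]) for u in units}
    uy = {u: mean([y for u2, _, _, y in rows if u2 == u]) for u in units}
    px = {t: mean([x for _, t2, x, _ in rows if t2 == t]) for t in periods}
    py = {t: mean([y for _, t2, _, y in rows if t2 == t]) for t in periods}

    num = den = 0.0
    for u, t, x, y in rows:
        xd = x - ux[u] - px[t] + gx   # demeaned regressor
        yd = y - uy[u] - py[t] + gy   # demeaned outcome
        num += xd * yd
        den += xd * xd
    return num / den

# Synthetic balanced panel constructed so y = 2*x + unit effect + period
# effect; the two-way FE slope should recover 2.0.
panel = [
    ("A", 1, 1, 2), ("A", 2, 2, 5), ("A", 3, 3, 8),
    ("B", 1, 2, 9), ("B", 2, 1, 8), ("B", 3, 4, 15),
]
print(two_way_fe_slope(panel))
```

Because unit and period effects enter additively, the within transformation removes them exactly on a balanced panel, which is what makes this design suitable for separating review effects from product- and time-specific shocks.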


2021
Author(s):  
Joanne DiNova

This paper examines the use of language in user-generated online product reviews on the website Yelp.ca. Drawing on both Relevance Theory and the Co-operative Principle, the study identifies nine linguistic devices to analyze within restaurant reviews on the site. Yelp.ca administrators designate some reviewers as "Elite Reviewers." The study contrasts twenty-five Elite reviews with twenty-five Non-Elite reviews to determine which linguistic devices are more prevalent in Elite reviews. The findings show concrete differences between the two types of reviews. Assuming that Elite reviews are in fact more persuasive, these findings suggest that there may be concrete attributes of a review that make it more persuasive in an online, user-generated context.
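The device-counting comparison described above can be sketched as follows. Since the abstract does not list the nine linguistic devices, the two device lexicons here (intensifiers and hedges) are purely illustrative assumptions, as are the sample reviews.

```python
# Illustrative lexicons -- NOT the paper's actual device inventory.
INTENSIFIERS = {"absolutely", "incredibly", "really", "very"}
HEDGES = {"maybe", "perhaps", "somewhat", "kind"}

def device_counts(review):
    """Tally occurrences of each device type in one review."""
    tokens = review.lower().replace(",", " ").replace(".", " ").split()
    return {"intensifiers": sum(t in INTENSIFIERS for t in tokens),
            "hedges": sum(t in HEDGES for t in tokens)}

def group_means(reviews):
    """Average device counts per review across a group of reviews."""
    counts = [device_counts(r) for r in reviews]
    return {k: sum(c[k] for c in counts) / len(counts)
            for k in counts[0]}

# Hypothetical Elite vs Non-Elite samples (invented for illustration).
elite = ["The pasta was absolutely incredible, really fresh.",
         "Service was very attentive and the room absolutely charming."]
non_elite = ["Food was maybe okay, kind of bland.",
             "Perhaps a bit pricey, somewhat slow service."]

print(group_means(elite))
print(group_means(non_elite))
```

Comparing the per-group averages device by device is the kind of contrast the study draws between its Elite and Non-Elite samples.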

