Published 2023 | Version v2
Journal article

Measuring the Impact of Explanation Bias: A Study of Natural Language Justifications for Recommender Systems

Description

Despite the potential impact of explanations on decision making, there is a lack of research on quantifying their effect on users' choices. This paper presents an experimental protocol for measuring the degree to which positively or negatively biased explanations can lead users to choose suboptimal recommendations. Key elements of this protocol include a preference elicitation stage to allow for personalizing recommendations, manual identification and extraction of item aspects from reviews, and a controlled method for introducing bias through the combination of both positive and negative aspects. We study explanations in two different textual formats: as a list of item aspects and as fluent natural language text. Through a user study with 129 participants, we demonstrate that explanations can significantly affect users' selections and that these findings generalize across explanation formats.
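As an illustration of the bias-introduction step described above, the sketch below (not code from the paper) composes an explanation for a recommended item by mixing positive and negative aspects under a chosen bias condition, rendering the result either as an aspect list or as simple templated text. The mixing ratios, the example aspects, and the rendering templates are assumptions made purely for illustration.

    import random

    def compose_explanation(positive_aspects, negative_aspects, bias, fmt="list", seed=0):
        # Hypothetical mixing scheme: the bias condition skews how many positive
        # vs. negative aspects are included (the ratios are assumptions, not the paper's).
        counts = {"positive": (3, 1), "neutral": (2, 2), "negative": (1, 3)}[bias]
        rng = random.Random(seed)
        pos = rng.sample(positive_aspects, min(counts[0], len(positive_aspects)))
        neg = rng.sample(negative_aspects, min(counts[1], len(negative_aspects)))
        if fmt == "list":
            # Explanation as a list of item aspects.
            return "\n".join([f"+ {a}" for a in pos] + [f"- {a}" for a in neg])
        # Explanation as templated text; a simple stand-in for fluent
        # natural language generation.
        return (f"Reviewers praised the {', '.join(pos)}, "
                f"though some mentioned the {', '.join(neg)}.")

    if __name__ == "__main__":
        print(compose_explanation(
            ["location", "breakfast", "friendly staff"],
            ["street noise", "small rooms"],
            bias="positive", fmt="text"))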

Details

Title Measuring the Impact of Explanation Bias: A Study of Natural Language Justifications for Recommender Systems
Authors
  • Balog, Krisztian
  • Radlinski, Filip
  • Petrov, Andrey
Publisher Preprint
Year of publication 2023