Academic Paper

Explaining Search Result Stances to Opinionated People
Document Type
Working Paper
Subject
Computer Science - Information Retrieval
Computer Science - Artificial Intelligence
Computer Science - Machine Learning
Language
English
Abstract
People use web search engines to find information before forming opinions, which can then lead to practical decisions with varying degrees of impact. The cognitive effort of search can leave opinionated users vulnerable to cognitive biases, e.g., confirmation bias. In this paper, we investigate whether stance labels and their explanations can help users consume more diverse search results. We automatically classify and label search results on three topics (i.e., intellectual property rights, school uniforms, and atheism) as against, neutral, or in favor, and generate explanations for these labels. In a user study (N = 203), we then investigate whether search result stance bias (balanced vs. biased) and the level of explanation (plain text, label only, label and explanation) influence the diversity of the search results clicked. We find that stance labels and explanations lead to more diverse search result consumption. However, we do not find evidence for systematic opinion change among users in this context. We believe these results can help search engine designers make more informed design decisions.
Comment: 24 pages, 6 figures (World Conference on eXplainable Artificial Intelligence xAI 2023)