
The Impact of AI’s Explanation Tone on Human Decision-Making

A study conducted by Ayano Okoso and colleagues in collaboration with the University of Tokyo has been accepted for presentation at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2025).

AI systems are increasingly assisting human decision-making, from daily recommendations to complex domains like law and finance. In such contexts, how an AI system explains its suggestions is becoming critical to building user trust and ensuring usability. While previous research has focused on the accuracy and rationale of AI's suggestions, the influence of explanation tone on user decisions remains largely unexplored.

This study investigates how people's decisions change when AI presents the same content in different tones, such as casual, formal, or authoritative. Through large-scale user studies, the researchers found that tone not only affects decision-making but that its impact also varies with user attributes such as age and personality.

These findings suggest that when AI supports human decisions, it must not only provide accurate suggestions but also take care in how it conveys them. This research provides valuable insights for designing AI systems that support human decision-making in a more reliable and user-centered manner.

Title: Do Expressions Change Decisions? Exploring the Impact of AI's Explanation Tone on Decision-Making
Authors: Okoso, A., Yang, M., Baba, Y.
Conference: ACM CHI Conference on Human Factors in Computing Systems (CHI 2025)
Published: April 28, 2025
DOI: https://doi.org/10.1145/3706598.3713744
