Expected Utility with Relative Loss Reduction: A Unifying Decision Model for Resolving Four Well-Known Paradoxes

WenJun Ma, Yuncheng Jiang, Weiru Liu, Xudong Luo, Kevin McAreavey

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)



Some well-known paradoxes in decision making (e.g., the Allais paradox, the St. Petersburg paradox, the Ellsberg paradox, and the Machina paradox) reveal that the choices predicted by conventional expected utility theory can be inconsistent with empirical observations. Solutions to these paradoxes therefore help us understand human decision making more accurately, and they bear directly on the predictive power of a decision-making model in real-world applications. Various models have thus been proposed to address these paradoxes. However, most of them resolve only some of the paradoxes, and to do so several rely on parameter tuning without proper justification for the chosen parameter bounds. To this end, this paper proposes a new descriptive decision-making model, expected utility with relative loss reduction, which exhibits the same qualitative behaviours as those observed in experiments on these paradoxes without any additional parameter setting. In particular, we introduce the concept of relative loss reduction to reflect people's tendency to prefer guaranteeing a sufficient minimum outcome over merely maximizing expected utility in decision making under risk or ambiguity.
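As a minimal illustration of the kind of inconsistency the abstract refers to, the sketch below computes plain expected values for the standard textbook version of the Allais lotteries; the payoffs and probabilities are the common textbook figures, not taken from this paper, and linear utility is assumed for simplicity.

```python
def expected_value(lottery):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in lottery)

# Choice 1: A = $1M for sure; B = 10% $5M, 89% $1M, 1% nothing.
lottery_1a = [(1.00, 1_000_000)]
lottery_1b = [(0.10, 5_000_000), (0.89, 1_000_000), (0.01, 0)]

# Choice 2: A = 11% $1M, 89% nothing; B = 10% $5M, 90% nothing.
lottery_2a = [(0.11, 1_000_000), (0.89, 0)]
lottery_2b = [(0.10, 5_000_000), (0.90, 0)]

# Expected value ranks B above A in both choices...
print(expected_value(lottery_1b) > expected_value(lottery_1a))  # True
print(expected_value(lottery_2b) > expected_value(lottery_2a))  # True
# ...yet most people choose 1A together with 2B, a pattern no single
# utility function over outcomes can rationalize -- the inconsistency
# with expected utility theory that this paper targets.
```

Under any fixed utility function, preferring 1A to 1B while preferring 2B to 2A is contradictory, since the two choice pairs differ only by a common 89% component; this is the qualitative behaviour a descriptive model must reproduce.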
Original language: English
Title of host publication: 32nd AAAI Conference on Artificial Intelligence (AAAI'18)
Publisher: AAAI Press
Number of pages: 9
Publication status: Published - 11 Apr 2018


  • decision making
  • expected utility with relative loss reduction
  • Allais paradox
  • Ellsberg paradox
  • St. Petersburg paradox
  • Machina paradox

