Predicting Humorousness and Metaphor Novelty with Gaussian Process Preference Learning

Edwin Simpson, Erik-Lân Do Dinh, Tristan Miller, Iryna Gurevych

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Abstract

The inability to quantify key aspects of creative language is a frequent obstacle to natural language understanding. To address this, we introduce novel tasks for evaluating the creativeness of language—namely, scoring and ranking text by humorousness and metaphor novelty. To sidestep the difficulty of assigning discrete labels or numeric scores, we learn from pairwise comparisons between texts. We introduce a Bayesian approach for predicting humorousness and metaphor novelty using Gaussian process preference learning (GPPL), which achieves a Spearman’s ρ of 0.56 against gold using word embeddings and linguistic features. Our experiments show that given sparse, crowdsourced annotation data, ranking using GPPL outperforms best–worst scaling. We release a new dataset for evaluating humour containing 28,210 pairwise comparisons of 4,030 texts, and make our software freely available.
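The core idea described in the abstract is learning a latent ranking from pairwise comparisons rather than from absolute labels. As a rough illustration only, the sketch below fits latent scores to toy pairwise judgements using a Bradley–Terry-style likelihood with a zero-mean Gaussian prior (MAP gradient ascent). This is a deliberately simplified stand-in: the paper's GPPL model places a full Gaussian process kernel over text features, which is not reproduced here, and the data and variable names are hypothetical.

```python
import math

# Hypothetical pairwise judgements: (winner, loser) index pairs over
# four texts, standing in for crowdsourced "which is funnier?" answers.
pairs = [(0, 1), (0, 2), (1, 2), (0, 3), (1, 3), (2, 3), (0, 1), (1, 2)]
n_items = 4

f = [0.0] * n_items          # latent scores, zero-mean Gaussian prior
lr, prior_var = 0.1, 1.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# MAP estimation: gradient ascent on log prior + log likelihood,
# where P(w preferred over l) = sigmoid(f[w] - f[l]).
for _ in range(500):
    grad = [-fi / prior_var for fi in f]     # gradient of the Gaussian prior
    for w, l in pairs:
        p = sigmoid(f[w] - f[l])
        grad[w] += 1.0 - p                   # winner's score pushed up
        grad[l] -= 1.0 - p                   # loser's score pushed down
    f = [fi + lr * g for fi, g in zip(f, grad)]

ranking = sorted(range(n_items), key=lambda i: -f[i])
print(ranking)  # items ordered from most to least preferred
```

In this toy data item 0 wins all its comparisons and item 3 loses all of them, so the recovered ranking places 0 first and 3 last; replacing the identity-prior assumption with a kernel over word embeddings is what lets a GP-based model generalise scores to unseen texts.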
Original language: English
Title of host publication: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Editors: Preslav Nakov, Alexis Palmer
Publisher: Association for Computational Linguistics
Pages: 5716–5728
Number of pages: 13
DOIs
Publication status: Published - Jul 2019
Event: 57th Annual Meeting of the Association for Computational Linguistics (ACL), Fortezza da Basso, Florence, Italy
Duration: 28 Jul 2019 – 2 Aug 2019
http://www.acl2019.org/EN/index.xhtml

Conference

Conference: 57th Annual Meeting of the Association for Computational Linguistics (ACL)
Abbreviated title: ACL 2019
Country/Territory: Italy
City: Florence
Period: 28/07/19 – 2/08/19
Internet address: http://www.acl2019.org/EN/index.xhtml
