Abstract
Using a single model across various tasks is beneficial for training and applying deep neural sequence models. We address the problem of developing generalist representations of text that can be used to perform a range of different tasks rather than being specialised to a single application. We focus on processing short questions and on developing an embedding for these questions that is useful across a diverse set of problems, such as question topic classification, equivalent question recognition, and question answering. This paper introduces QBERT, a generalist model for processing questions. With QBERT, we demonstrate how to train a single multi-task network that performs all of these question-related tasks and achieves performance comparable to that of the corresponding single-task models.
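For illustration, here is a minimal sketch of the kind of multi-task setup the abstract describes: one shared transformer encoder produces a question embedding, and small task-specific heads consume it for topic classification, equivalent-question recognition, and question answering. The class and head names, the `bert-base-uncased` backbone, the use of the [CLS] state as the embedding, and the pair-concatenation heads are all assumptions made for this sketch, not the authors' actual QBERT architecture.

```python
# Hypothetical sketch (not the authors' QBERT code): a shared BERT
# encoder with one lightweight head per question task.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class MultiTaskQuestionModel(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_topics=10, hidden_size=768):
        super().__init__()
        # Shared encoder: every task reads the same question embedding.
        self.encoder = AutoModel.from_pretrained(model_name)
        self.topic_head = nn.Linear(hidden_size, num_topics)   # question topic classification
        self.equiv_head = nn.Linear(2 * hidden_size, 2)        # are two questions equivalent?
        self.answer_head = nn.Linear(2 * hidden_size, 2)       # does a candidate text answer the question?

    def embed(self, input_ids, attention_mask):
        # Use the [CLS] hidden state as the question embedding (an assumption).
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]

    def forward(self, task, input_ids, attention_mask, pair_ids=None, pair_mask=None):
        q = self.embed(input_ids, attention_mask)
        if task == "topic":
            return self.topic_head(q)
        # Pairwise tasks encode the second text with the same shared encoder.
        p = self.embed(pair_ids, pair_mask)
        pair = torch.cat([q, p], dim=-1)
        return self.equiv_head(pair) if task == "equiv" else self.answer_head(pair)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = MultiTaskQuestionModel()
batch = tokenizer(["How do I reset my password?"], return_tensors="pt")
print(model("topic", batch["input_ids"], batch["attention_mask"]).shape)  # torch.Size([1, 10])
```

In training, such a model would typically alternate mini-batches across tasks so that gradients from every head update the shared encoder, which is what allows one embedding to serve all tasks.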
Original language | English |
---|---|
Pages | 472-483 |
Number of pages | 12 |
DOIs | |
Publication status | Published - Apr 2023 |
Bibliographical note
Publisher Copyright: © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
Keywords
- Multi-task Learning
- Deep Learning
- Text Processing
Student theses
- Processing questions with multi-task sentence embedding
  Xu, Z. (Author), Cristianini, N. (Supervisor), Lewis, M. (Supervisor) & Damen, D. (Supervisor), 9 May 2023
  Student thesis: Doctoral Thesis › Doctor of Philosophy (PhD)