TY - JOUR
T1 - Out of distribution detection with attention head masking for multimodal document classification
AU - Constantinou, Christos
AU - Ioannides, Georgios
AU - Chadha, Aman
AU - Elkins, Aaron
AU - Simpson, Edwin
N1 - © The Author(s) 2026.
PY - 2026/1/20
Y1 - 2026/1/20
N2 - Detecting out-of-distribution (OOD) data is critical for ensuring the reliability and safety of deployed machine learning systems by mitigating model overconfidence and misclassification. While existing OOD detection methods primarily focus on uni-modal inputs, such as images or text, their effectiveness in multi-modal settings, particularly documents, remains underexplored. Moreover, most approaches prioritize decision mechanisms over optimizing the underlying dense embedding representations for optimal separation. In this work, we introduce Attention Head Masking (AHM), a novel technique applied to Transformer-based models for both uni-modal and multi-modal OOD detection. Our empirical results demonstrate that AHM enhances embedding quality, significantly improving the separation between in-distribution and OOD data. Notably, our method reduces the false positive rate (FPR) by up to 10%, outperforming state-of-the-art approaches. Furthermore, AHM generalizes effectively to multi-modal document data, where textual and visual information are jointly modeled within a Transformer architecture. To encourage further research in this area, we introduce FinanceDocs, a high-quality, publicly available document AI dataset tailored for OOD detection. Our code and dataset are available at https://github.com/constantinouchristos/OOD-AHM.
U2 - 10.1038/s41598-025-32328-9
DO - 10.1038/s41598-025-32328-9
M3 - Article (Academic Journal)
C2 - 41484199
SN - 2045-2322
VL - 16
JO - Scientific Reports
JF - Scientific Reports
IS - 1
M1 - 2449
ER -