TY - GEN
T1 - CRAFT
T2 - 2025 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2025
AU - Karunanayake, Naveen
AU - Seneviratne, Suranga
AU - Chawla, Sanjay
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
AB - Out-of-distribution (OOD) detection remains a key challenge preventing the mainstream rollout of key AI technologies such as autonomous vehicles, as classifiers trained on in-distribution (ID) data are unable to gracefully handle OOD data. While OOD detection remains an active area of research, current post-hoc methods often suffer from limited separability between ID and OOD data, and outlier exposure-based methods lack generalisation to unseen outlier types. We present CRAFT, a fine-tuning approach for arming pre-trained classifiers against OOD inputs without requiring access to outliers. The key insight that underpins our approach is that, during pre-training, classifiers implicitly learn a ranking across the ID classes that is not respected by OOD data. Therefore, fine-tuning a pre-trained classifier without outliers can sharpen the rank order of the classes, making the model sensitive to the presence of OOD data. Furthermore, fine-tuning does not impact the classifier's ability to correctly classify ID inputs to their respective classes. Experiments on CIFAR-10, CIFAR-100, and ImageNet-200 demonstrate that CRAFT outperforms 33 existing methods, particularly in the more challenging near-OOD detection setting, as well as in overall OOD detection consistency and ID classification accuracy.
KW - deep neural networks
KW - out-of-distribution detection
UR - http://www.scopus.com/inward/record.url?scp=105003626733&partnerID=8YFLogxK
U2 - 10.1109/WACV61041.2025.00405
DO - 10.1109/WACV61041.2025.00405
M3 - Conference contribution
AN - SCOPUS:105003626733
SN - 979-8-3315-1084-8
T3 - IEEE Winter Conference on Applications of Computer Vision
SP - 4119
EP - 4128
BT - 2025 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 28 February 2025 through 4 March 2025
ER -