Abstract
We describe the fifth edition of the CheckThat! Lab, part of the 2022 Conference and Labs of the Evaluation Forum (CLEF). The lab evaluates technology supporting three tasks related to factuality and covers seven languages: Arabic, Bulgarian, Dutch, English, German, Spanish, and Turkish. Here, we present Task 2, which asks systems to detect previously fact-checked claims (in two languages). A total of six teams participated in this task and submitted 37 runs, and most submissions achieved sizable improvements over the baselines using transformer-based models such as BERT and RoBERTa. In this paper, we describe the data collection process and the task setup, including the evaluation measures, and we give a brief overview of the participating systems. Last but not least, we release to the research community all datasets from the lab as well as the evaluation scripts, which should enable further research in detecting previously fact-checked claims.
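To make the task concrete, the sketch below shows one common way to approach verified claim retrieval with a transformer encoder: embed the input claim and a collection of previously fact-checked claims, then rank the collection by cosine similarity. This is a minimal illustration only, assuming a sentence-transformers bi-encoder; the model name and the toy claims are placeholders, not the lab's official baseline or data.

```python
# Minimal sketch (not the official lab baseline): rank previously
# fact-checked claims for an input claim with a bi-encoder.
# The model name and the example claims are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed general-purpose encoder

# Toy collection of already verified claims (placeholders, not lab data).
verified_claims = [
    "5G networks do not spread COVID-19.",
    "Drinking bleach does not cure COVID-19.",
    "Masks reduce the spread of respiratory viruses.",
]
input_claim = "A viral post says 5G towers cause coronavirus."

# Encode the input claim and the verified-claim collection.
claim_emb = model.encode(input_claim, convert_to_tensor=True)
corpus_emb = model.encode(verified_claims, convert_to_tensor=True)

# Rank verified claims by cosine similarity; the top hits are the
# candidates for "this claim has already been fact-checked".
scores = util.cos_sim(claim_emb, corpus_emb)[0]
for idx in scores.argsort(descending=True):
    print(f"{scores[idx].item():.3f}  {verified_claims[idx]}")
```

In practice, participants typically rerank the top candidates returned by such a retriever with a stronger cross-encoder, but the bi-encoder step above already captures the core retrieval formulation of the task.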
| Original language | English |
| --- | --- |
| Pages (from-to) | 393-403 |
| Number of pages | 11 |
| Journal | CEUR Workshop Proceedings |
| Volume | 3180 |
| Publication status | Published - 2022 |
| Event | 2022 Conference and Labs of the Evaluation Forum, CLEF 2022 - Bologna, Italy. Duration: 5 Sept 2022 → 8 Sept 2022 |
Keywords
- COVID-19
- Check-Worthiness Estimation
- Computational Journalism
- Detecting Previously Fact-Checked Claims
- Fact-Checking
- Social Media Verification
- Veracity
- Verified Claims Retrieval