Overview of the CLEF-2022 CheckThat! Lab Task 2 on Detecting Previously Fact-Checked Claims

Preslav Nakov*, Giovanni Da San Martino, Firoj Alam, Shaden Shaar, Hamdy Mubarak, Nikolay Babulkov

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

We describe the fourth edition of the CheckThat! Lab, part of the 2022 Conference and Labs of the Evaluation Forum (CLEF). The lab evaluates technology supporting three tasks related to factuality, covering seven languages: Arabic, Bulgarian, Dutch, English, German, Spanish, and Turkish. Here, we present Task 2, which asks systems to detect previously fact-checked claims (in two languages). A total of six teams participated in this task, submitting 37 runs, and most submissions achieved sizable improvements over the baselines using transformer-based models such as BERT and RoBERTa. In this paper, we describe the process of data collection and the task setup, including the evaluation measures, and we give a brief overview of the participating systems. Last but not least, we release to the research community all datasets from the lab as well as the evaluation scripts, which should enable further research in detecting previously fact-checked claims.
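The task described in the abstract is a ranking problem: given an input claim, rank a collection of previously fact-checked claims by relevance. The following minimal sketch illustrates that setting with a simple token-overlap (Jaccard) score instead of the transformer-based models (BERT, RoBERTa) the participating systems used; the function names and claim texts are illustrative and not taken from the lab's dataset.

```python
import re

def tokenize(text):
    """Lowercase a claim and split it into a set of word tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def jaccard(a, b):
    """Jaccard similarity between two token sets (0.0 for two empty sets)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_verified_claims(input_claim, verified_claims):
    """Return verified claims sorted by descending similarity to the input claim."""
    query = tokenize(input_claim)
    scored = [(claim, jaccard(query, tokenize(claim))) for claim in verified_claims]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Illustrative collection of previously fact-checked claims.
collection = [
    "Drinking hot water cures COVID-19.",
    "The Eiffel Tower was sold for scrap in 1925.",
    "5G towers spread the coronavirus.",
]
ranking = rank_verified_claims("Do 5G towers really spread coronavirus?", collection)
```

A real system would replace the lexical score with dense embeddings from a fine-tuned transformer, but the input/output contract — a query claim in, a ranked list of verified claims out — is the same.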

Original language: English
Pages (from-to): 393-403
Number of pages: 11
Journal: CEUR Workshop Proceedings
Volume: 3180
Publication status: Published - 2022
Event: 2022 Conference and Labs of the Evaluation Forum, CLEF 2022 - Bologna, Italy
Duration: 5 Sept 2022 - 8 Sept 2022

Keywords

  • COVID-19
  • Check-Worthiness Estimation
  • Computational Journalism
  • Detecting Previously Fact-Checked Claims
  • Fact-Checking
  • Social Media Verification
  • Veracity
  • Verified Claims Retrieval
