Crowdsourcing software evaluation

Nada Sherief, Nan Jiang, Mahmood Hosseini, Keith Phalp, Raian Ali

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-reviewed

9 Citations (Scopus)

Abstract

Crowdsourcing is an emerging online paradigm for problem solving that involves a large number of people, often recruited on a voluntary basis and rewarded with tangible or intangible incentives. It harnesses the power of the crowd to minimize costs and to solve problems that inherently require a large, decentralized, and diverse crowd. In this paper, we advocate the potential of crowdsourcing for software evaluation. This is especially true for complex and highly variable software systems, which operate in diverse, even unpredictable, contexts. Through their iterative feedback, the crowd can enrich the developers' knowledge about software evaluation and keep it up to date. Although this seems promising, crowdsourcing evaluation introduces a new range of challenges, mainly concerning how to organize the crowd and provide the right platforms to obtain and process their input. We focus on the activity of obtaining evaluation feedback from the crowd and conduct two focus groups to understand the various aspects of such an activity. We finally report a set of challenges that must be addressed to realize correct and efficient crowdsourcing mechanisms for software evaluation.

Original language: English
Title of host publication: 18th International Conference on Evaluation and Assessment in Software Engineering, EASE 2014
Publisher: Association for Computing Machinery
ISBN (Print): 9781450324762
DOIs
Publication status: Published - 2014
Externally published: Yes
Event: 18th International Conference on Evaluation and Assessment in Software Engineering, EASE 2014 - London, United Kingdom
Duration: 12 May 2014 - 14 May 2014

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 18th International Conference on Evaluation and Assessment in Software Engineering, EASE 2014
Country/Territory: United Kingdom
City: London
Period: 12/05/14 - 14/05/14

Keywords

  • Crowdsourcing
  • Software evaluation
  • Users' feedback
