A Conceptual Probabilistic Framework for Annotation Aggregation of Citizen Science Data

Bibliographic Details
Published in: Mathematics (Basel), Vol. 9, No. 8, p. 875
Main Authors: Cerquides, Jesus; Mülâyim, Mehmet Oğuz; Hernández-González, Jerónimo; Ravi Shankar, Amudha; Fernandez-Marquez, Jose Luis
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-04-2021
Description
Summary: Over the last decade, hundreds of thousands of volunteers have contributed to science by collecting or analyzing data. This public participation in science, also known as citizen science, has contributed to significant discoveries and led to publications in major scientific journals. However, little attention has been paid to data quality issues. In this work, we argue that being able to determine the accuracy of data obtained by crowdsourcing is a fundamental question, and we point out that, for many real-life scenarios, mathematical tools and processes for the evaluation of data quality are missing. We propose a probabilistic methodology for evaluating the accuracy of labeled data obtained by crowdsourcing in citizen science. The methodology builds on an abstract probabilistic graphical model formalism, which is shown to generalize some existing label aggregation models. We show how to make practical use of the methodology through a comparison of data obtained from different citizen science communities analyzing the earthquake that took place in Albania in 2019.
ISSN: 2227-7390
DOI: 10.3390/math9080875
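
This record does not reproduce the paper's formalism, but for context the summary above mentions that the proposed framework generalizes existing label aggregation models. Below is a minimal sketch of one such classical model, Dawid-Skene-style EM aggregation of crowdsourced labels; it is an illustrative assumption for orientation only, not the authors' implementation, and the toy data are made up.

```python
# A minimal sketch (not the authors' implementation) of Dawid-Skene-style
# label aggregation, one of the classical models that probabilistic
# aggregation frameworks of the kind described above typically generalize.
import numpy as np

def dawid_skene(annotations, n_classes, n_iter=50):
    """EM estimation of true labels from crowdsourced annotations.

    annotations: array of shape (n_items, n_annotators) with class indices,
                 or -1 where an annotator did not label an item.
    Returns per-item posterior probabilities over the true classes.
    """
    n_items, n_annotators = annotations.shape

    # Initialize posteriors with per-item vote proportions (soft majority vote).
    posteriors = np.zeros((n_items, n_classes))
    for i in range(n_items):
        for a in range(n_annotators):
            if annotations[i, a] >= 0:
                posteriors[i, annotations[i, a]] += 1
    posteriors /= posteriors.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class priors and per-annotator confusion matrices
        # (small constant avoids log(0) for unseen label combinations).
        priors = posteriors.mean(axis=0)
        confusion = np.full((n_annotators, n_classes, n_classes), 1e-6)
        for i in range(n_items):
            for a in range(n_annotators):
                if annotations[i, a] >= 0:
                    confusion[a, :, annotations[i, a]] += posteriors[i]
        confusion /= confusion.sum(axis=2, keepdims=True)

        # E-step: recompute the posterior over each item's true label.
        log_post = np.tile(np.log(priors), (n_items, 1))
        for i in range(n_items):
            for a in range(n_annotators):
                if annotations[i, a] >= 0:
                    log_post[i] += np.log(confusion[a, :, annotations[i, a]])
        log_post -= log_post.max(axis=1, keepdims=True)
        posteriors = np.exp(log_post)
        posteriors /= posteriors.sum(axis=1, keepdims=True)

    return posteriors

# Toy example: 4 items, 3 annotators, binary labels (-1 = missing annotation).
toy = np.array([[1, 1, 0],
                [0, 0, 0],
                [1, -1, 1],
                [0, 1, 0]])
print(dawid_skene(toy, n_classes=2).round(3))
```

Unlike plain majority voting, this kind of model weights annotators by their estimated reliability, which is the sort of behavior a more general probabilistic graphical model formalism for annotation aggregation would also need to capture.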