Ontological Model for Contextual Data Defining Time Series for Emotion Recognition and Analysis


Bibliographic Details
Published in: IEEE Access, vol. 9, pp. 166674-166694
Main Authors: Zawadzka, Teresa; Waloszek, Wojciech; Karpus, Aleksandra; Zapalowska, Sara; Wrobel, Michal R.
Format: Journal Article
Language: English
Published: Piscataway, NJ: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
Description
Summary: One of the major challenges facing the field of Affective Computing is the reusability of datasets. Existing affective-related datasets are not consistent with one another: they store a variety of information in different forms and formats, and the terms used to describe them are not unified. This paper proposes the Recording Ontology for Affective-related Datasets (ROAD) as a solution to this problem, formally describing the datasets and unifying the terms used. The developed ontology allows information about the origin and meaning of the data, i.e., time series representing both emotional states and features derived from biosignals, to be modeled. Furthermore, the ROAD ontology is extensible and not application-oriented, so it can be used to store data from a wide range of Affective Computing experiments. The ontology was validated by modeling data obtained from one experiment in the AMIGOS dataset (A dataset for Multimodal research of affect, personality traits and mood on Individuals and GrOupS). The approach proposed in the paper can be used both by researchers who create new datasets or want to reuse existing ones, and by those who want to process data from experiments in a more automated way.
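
Illustration: as a rough sketch of the kind of modeling the summary describes, the following Python snippet uses rdflib to express a biosignal-derived time series and one of its timestamped observations as RDF triples. The namespace URI and the class and property names (TimeSeries, Observation, derivedFrom, partOf, atTime, hasValue) are placeholders invented for this example and are not the actual ROAD vocabulary defined in the paper.

    from rdflib import Graph, Literal, Namespace, RDF, RDFS
    from rdflib.namespace import XSD

    # Placeholder namespace; the real ROAD ontology IRI is given in the paper.
    ROAD = Namespace("http://example.org/road#")

    g = Graph()
    g.bind("road", ROAD)

    # A time series of features derived from a biosignal (e.g. heart rate from ECG).
    ts = ROAD["timeSeries/ecg-hr-participant01"]
    g.add((ts, RDF.type, ROAD.TimeSeries))
    g.add((ts, ROAD.derivedFrom, ROAD.ECGSignal))
    g.add((ts, RDFS.label, Literal("Heart rate derived from ECG, participant 01")))

    # A single observation in that series: a value at a point in time.
    obs = ROAD["observation/ecg-hr-participant01-0001"]
    g.add((obs, RDF.type, ROAD.Observation))
    g.add((obs, ROAD.partOf, ts))
    g.add((obs, ROAD.atTime, Literal("2021-01-01T10:00:00", datatype=XSD.dateTime)))
    g.add((obs, ROAD.hasValue, Literal(72.5, datatype=XSD.double)))

    # Serialize the small graph to Turtle for inspection.
    print(g.serialize(format="turtle"))

The same pattern would apply to time series of annotated emotional states; only the hypothetical class of the series and the datatype of its values would change.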
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3132728