A Deep GRU-BiLSTM Network for Multi-modal Emotion Recognition from Text

Bibliographic Details
Published in: 2024 IEEE 7th International Conference on Advanced Technologies, Signal and Image Processing (ATSIP), Vol. 1, pp. 138-143
Main Authors: Yacoubi, Ibtissem; Ferjaoui, Radhia; Khalifa, Anouar Ben
Format: Conference Proceeding
Language: English
Published: IEEE, 11-07-2024
Description
Summary: The recognition of emotions from text has become an indispensable asset in the realm of Natural Language Processing, and numerous approaches have been proposed to address this challenge. In the era of social networks, blogs, forums, chatbots, and artificial intelligence, the ability to recognize emotions has become crucial for analyzing opinions, assessments, judgments, and behaviors towards a variety of goods, services, organizations, events, and specific issues. The principal objective of this paper is to introduce a novel approach, Deep GRU-BiLSTM, which leverages a technique known as late fusion. Our approach combines the scores of two deep learning models, Gated Recurrent Units (GRU) and Bidirectional LSTM (BiLSTM), to produce the final classification of emotions from a given text. Through the fusion of these models, we exploit the concept of multi-modality, thereby augmenting the resilience and efficacy of emotion classification. To evaluate the performance of the proposed model, we compared our experimental results with those of an individual GRU, an individual BiLSTM, and an intermediate-fusion method that concatenates features. Our analysis is conducted on two diverse datasets: Emotion for NLP and ISEAR.
ISSN: 2687-878X
DOI: 10.1109/ATSIP62566.2024.10638944
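To make the late-fusion scheme in the summary concrete, below is a minimal Keras sketch (not the authors' released code): a GRU classifier and a BiLSTM classifier are trained independently, and their softmax score vectors are combined at decision time. All hyperparameters (vocabulary size, sequence length, embedding and hidden dimensions) and the simple equal-weight score averaging are illustrative assumptions; the paper may use a different combination rule.

```python
# Sketch of late fusion for text emotion recognition, assuming TensorFlow/Keras.
# Hyperparameters below are assumptions for illustration, not values from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20_000   # assumed vocabulary size
MAX_LEN = 100         # assumed padded sequence length
EMBED_DIM = 128       # assumed embedding dimension
NUM_CLASSES = 6       # e.g. six emotion labels, as in the Emotion for NLP dataset

def build_branch(recurrent_layer: layers.Layer) -> models.Model:
    """Embedding -> recurrent encoder -> softmax emotion scores."""
    return models.Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        recurrent_layer,
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

# Two independent branches: a GRU encoder and a bidirectional LSTM encoder.
gru_model = build_branch(layers.GRU(64))
bilstm_model = build_branch(layers.Bidirectional(layers.LSTM(64)))

for m in (gru_model, bilstm_model):
    m.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
    # m.fit(x_train, y_train, ...)  # each branch is trained separately

def late_fusion_predict(x: np.ndarray) -> np.ndarray:
    """Average the two branches' class-score vectors, then take the argmax.

    Equal weights are assumed here; the paper's exact fusion weights are not
    reproduced in this record.
    """
    scores = (gru_model.predict(x) + bilstm_model.predict(x)) / 2.0
    return scores.argmax(axis=-1)
```

By contrast, the intermediate-fusion baseline mentioned in the summary would concatenate the two encoders' feature vectors before a single shared softmax layer, rather than combining per-class scores after both classifiers have run.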