Deep Block Transformer for Anomaly Detection


Bibliographic Details
Published in: 2024 4th International Conference on Computer Communication and Artificial Intelligence (CCAI), pp. 481–486
Main Authors: Ishaq, Muhammad Yasir; Yong, Zhou; Xue, Shaxin; Raza, Qamar; An, Zhijian; Amin, Muhammad Usama
Format: Conference Proceeding
Language: English
Published: IEEE, 24-05-2024
Description
Summary: In addressing the critical need for efficient anomaly detection within multivariate time series data, existing solutions often grapple with challenges such as the scarcity of anomaly labels, data volatility, and the demand for quick inference in real-time applications. Despite advancements in deep learning, a fully satisfactory solution remains elusive. In response, we introduce a groundbreaking approach leveraging a deep transformer network-based model. Our work emphasizes the use of attention-based sequence encoders for swift and insightful anomaly detection, bypassing traditional metrics in favor of advanced self-conditioning for superior feature extraction. By integrating adversarial training and model-agnostic meta-learning (MAML), our model not only adapts to limited data but also significantly outperforms existing methods in terms of F1 scores and training efficiency, as demonstrated in our comprehensive evaluation across six public datasets. This establishes a new benchmark in the field, offering a robust and efficient solution for anomaly detection and diagnosis in industrial applications.
DOI: 10.1109/CCAI61966.2024.10603098
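The meta-learning component the abstract mentions can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: the actual model is a deep transformer trained adversarially, whereas the toy below applies first-order MAML to scalar linear-regression tasks. Every function name, task distribution, and learning rate here is an assumption chosen purely for illustration of the inner-loop/outer-loop structure.

```python
import numpy as np

# Illustrative first-order MAML sketch (NOT the paper's transformer model).
# Each "task" is 1-D regression y = a*x with a task-specific slope a;
# the meta-loop learns an initialisation w that adapts in one gradient step.
rng = np.random.default_rng(0)

def make_task():
    """Sample one task; return (support, query) sets drawn from it."""
    a = rng.uniform(0.5, 2.0)                      # task-specific slope
    xs, xq = rng.uniform(-1, 1, 10), rng.uniform(-1, 1, 10)
    return (xs, a * xs), (xq, a * xq)

def loss_grad(w, x, y):
    """MSE and its gradient for the scalar model y_hat = w * x."""
    err = w * x - y
    return float(np.mean(err ** 2)), float(np.mean(2 * err * x))

w = 0.0                                            # meta-learned initialisation
inner_lr, outer_lr = 0.1, 0.01

for _ in range(500):
    (xs, ys), (xq, yq) = make_task()
    _, g_s = loss_grad(w, xs, ys)                  # inner loop: adapt on support set
    w_adapt = w - inner_lr * g_s
    _, g_q = loss_grad(w_adapt, xq, yq)            # outer loop: evaluate on query set
    w -= outer_lr * g_q                            # first-order meta-update

# Sanity check: one adaptation step on a fresh task should reduce its loss.
(xs, ys), _ = make_task()
base_loss, g = loss_grad(w, xs, ys)
adapted_loss, _ = loss_grad(w - inner_lr * g, xs, ys)
```

The design point this sketch captures is the one the abstract relies on: the meta-update optimises post-adaptation (query-set) loss, so the learned initialisation sits where a single inner step suffices, which is what lets a model cope with the scarce anomaly labels the paper targets.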