Enhancing Visual Perception in Novel Environments via Incremental Data Augmentation Based on Style Transfer
Main Authors:
Format: Journal Article
Language: English
Published: 15-09-2023
Summary: The deployment of autonomous agents in real-world scenarios is challenged by "unknown unknowns", i.e., novel, unexpected environments not encountered during training, such as degraded signs. While existing research focuses on anomaly detection and class imbalance, it often fails to address truly novel scenarios. Our approach enhances visual perception by leveraging the Variational Prototyping Encoder (VPE) to identify and handle novel inputs, and then incrementally augmenting the data with neural style transfer to enrich underrepresented samples. Comparing models trained solely on the original datasets with models trained on a combination of original and augmented data, we observed a notable improvement in the performance of the latter, which underscores the critical role of data augmentation in enhancing model robustness. Our findings suggest the potential benefits of incorporating generative models into domain-specific augmentation strategies.
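The summary describes a two-stage pipeline: flag inputs whose VPE latent codes lie far from every known class prototype as novel, then restyle canonical training images with those novel appearances via neural style transfer to grow the training set. The PyTorch sketch below is a minimal illustration of that flow, not the paper's released implementation; the layer sizes, the distance threshold, and the `stylize_fn` hook are hypothetical placeholders.

```python
# Illustrative sketch of the described pipeline. Architecture, threshold,
# and style-transfer backend are assumptions, not the authors' code.
import torch
import torch.nn as nn

class VPE(nn.Module):
    """Toy Variational Prototyping Encoder: maps an image to a latent code
    that should land near the code of its class prototype."""
    def __init__(self, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.LazyLinear(latent_dim)
        self.fc_logvar = nn.LazyLinear(latent_dim)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        return z, mu, logvar

def is_novel(vpe, x, prototype_codes, threshold=1.0):
    """Flag inputs whose latent codes are far from every known prototype.
    `threshold` is a hypothetical tuning parameter."""
    with torch.no_grad():
        _, mu, _ = vpe(x)
        dists = torch.cdist(mu, prototype_codes)    # [batch, num_prototypes]
        return dists.min(dim=1).values > threshold  # True -> treat as novel

def augment_with_style_transfer(content_imgs, style_imgs, stylize_fn):
    """Incrementally grow the training set by restyling canonical images with
    the appearance of inputs flagged as novel. `stylize_fn` stands in for any
    neural style transfer routine (e.g. a Gatys-style optimizer or a
    pretrained feed-forward model)."""
    augmented = []
    for content in content_imgs:
        for style in style_imgs:
            augmented.append(stylize_fn(content, style))
    return torch.stack(augmented)
```

In this sketch, images flagged by `is_novel` would supply the `style_imgs` argument, and the stylized outputs would be appended to the original training set before retraining, mirroring the original-versus-augmented comparison described in the summary.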
DOI: 10.48550/arxiv.2309.08851