ZeroScatter: Domain Transfer for Long Distance Imaging and Vision through Scattering Media
Main Authors: Zheng Shi, Ethan Tseng, Mario Bijelic, Werner Ritter, Felix Heide
Format: Journal Article
Language: English
Published: 10-02-2021
Summary: Adverse weather conditions, including snow, rain, and fog, pose a major challenge for both human and computer vision. Handling these environmental conditions is essential for safe decision making, especially in autonomous vehicles, robotics, and drones. Most of today's supervised imaging and vision approaches, however, rely on training data collected in the real world that is biased towards good weather conditions, with dense fog, snow, and heavy rain appearing only as outliers in these datasets. Without training data, let alone paired data, existing autonomous vehicles often limit themselves to good conditions and stop when dense fog or snow is detected. In this work, we tackle the lack of supervised training data by combining synthetic and indirect supervision. We present ZeroScatter, a domain transfer method for converting RGB-only captures taken in adverse weather into clear daytime scenes. ZeroScatter exploits model-based, temporal, multi-view, multi-modal, and adversarial cues in a joint fashion, allowing us to train on unpaired, biased data. We assess the proposed method on in-the-wild captures, and it outperforms existing monocular descattering approaches by 2.8 dB PSNR on controlled fog chamber measurements.
DOI: 10.48550/arxiv.2102.05847
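
The abstract describes combining synthetic supervision (rendering scattering onto clear captures to obtain pairs) with indirect adversarial supervision on unpaired real captures. The sketch below is a minimal, hypothetical illustration of such a joint objective in PyTorch; the toy networks `TranslatorG` and `DiscriminatorD`, the use of the standard Koschmieder fog model as the model-based cue, and all loss weights are assumptions made for illustration, not the paper's actual architecture or training recipe.

```python
# Minimal sketch of a joint synthetic + adversarial training objective for
# unpaired adverse-weather-to-clear domain transfer. All names, networks,
# and weights are illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class TranslatorG(nn.Module):
    """Toy adverse-weather -> clear-scene translator (placeholder)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class DiscriminatorD(nn.Module):
    """Toy discriminator over clear-weather images (placeholder)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def koschmieder_fog(clear, transmission, airlight=1.0):
    """Model-based cue: synthesize fog with the Koschmieder model
    I = J * t + A * (1 - t), yielding synthetic paired supervision."""
    return clear * transmission + airlight * (1.0 - transmission)

def total_loss(G, D, foggy_real, clear_real, t):
    # Synthetic supervision: fog rendered onto clear frames gives pairs.
    foggy_syn = koschmieder_fog(clear_real, t)
    l_syn = nn.functional.l1_loss(G(foggy_syn), clear_real)
    # Indirect adversarial cue: translations of real (unpaired) foggy
    # captures should be indistinguishable from real clear-weather images.
    fake_logits = D(G(foggy_real))
    l_adv = nn.functional.binary_cross_entropy_with_logits(
        fake_logits, torch.ones_like(fake_logits))
    # Temporal, multi-view, and multi-modal consistency terms would be
    # added in the same fashion (assumed, omitted for brevity).
    return l_syn + 0.1 * l_adv

# Smoke test with random tensors.
G, D = TranslatorG(), DiscriminatorD()
foggy = torch.rand(2, 3, 64, 64)
clear = torch.rand(2, 3, 64, 64)
t = torch.full((2, 1, 64, 64), 0.6)  # assumed constant transmission map
print(total_loss(G, D, foggy, clear, t).item())
```

The key design point this sketch captures is that neither term requires real paired data: the L1 term uses synthetically degraded clear images, while the adversarial term only needs unpaired collections of foggy and clear captures.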