Improving Stereo Matching Generalization via Fourier-Based Amplitude Transform
Published in: IEEE Signal Processing Letters, Vol. 29, pp. 1362-1366
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
Subjects:
Summary: Stereo matching CNNs suffer from performance deterioration when evaluated on distributions different from their training data. Previous domain adaptation/generalization methods struggle to maintain robust performance across different baselines and usually require difficult adversarial optimization or intricate network structures. To solve this problem, we propose the Fourier-based amplitude transform (FAT), which maps the source image to the target style without altering semantic content and requires no training to perform domain alignment. Specifically, we leverage the Fourier transform and its inverse to swap the low-frequency amplitude component of the source data with that of the target data. To map style effectively and relieve artifacts, we introduce two factors to control the replaced area: the distance between the HSV distributions of the source and target images, and the difference between the source left image and its warped left image. Experiments verify that FAT can significantly bridge domain gaps, making the source data distribution closer to the target data. Furthermore, when trained only on synthetic datasets, FAT also helps different baselines achieve competitive cross-domain generalization on real datasets.
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2022.3180306
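As a rough illustration of the amplitude-swap step described in the summary, the sketch below performs an FDA-style low-frequency amplitude replacement with NumPy. It is a minimal sketch under stated assumptions: the paper's two adaptive factors (the HSV-distribution distance and the source/warped left-image difference) are not reproduced, so the replacement area is set by a fixed hypothetical parameter `beta`, and `fourier_amplitude_transform` is an illustrative name rather than the authors' code.

```python
import numpy as np

def fourier_amplitude_transform(src, tgt, beta=0.05):
    """Swap the low-frequency amplitude of `src` with that of `tgt`.

    src, tgt: float arrays of shape (H, W, C) in [0, 1]; `tgt` is
    assumed to have been resized to the spatial size of `src`.
    beta: fraction of the spectrum (per side) treated as low frequency;
    in the paper this area is set adaptively, here it is fixed.
    """
    # 2-D FFT over the spatial dimensions of each channel.
    src_fft = np.fft.fft2(src, axes=(0, 1))
    tgt_fft = np.fft.fft2(tgt, axes=(0, 1))

    # Split into amplitude and phase; the phase carries the semantic
    # content and is kept untouched.
    src_amp, src_pha = np.abs(src_fft), np.angle(src_fft)
    tgt_amp = np.abs(tgt_fft)

    # Shift the zero-frequency component to the center so the low
    # frequencies form one contiguous square.
    src_amp = np.fft.fftshift(src_amp, axes=(0, 1))
    tgt_amp = np.fft.fftshift(tgt_amp, axes=(0, 1))

    # Replace the central low-frequency square of the source amplitude
    # with the corresponding square of the target amplitude.
    h, w = src.shape[:2]
    b = int(np.floor(min(h, w) * beta))
    ch, cw = h // 2, w // 2
    src_amp[ch - b:ch + b, cw - b:cw + b] = \
        tgt_amp[ch - b:ch + b, cw - b:cw + b]

    src_amp = np.fft.ifftshift(src_amp, axes=(0, 1))

    # Recombine the swapped amplitude with the original source phase
    # and invert the transform back to image space.
    mixed = src_amp * np.exp(1j * src_pha)
    out = np.fft.ifft2(mixed, axes=(0, 1)).real
    return np.clip(out, 0.0, 1.0)
```

Because only the amplitude spectrum is exchanged while the source phase is preserved, the output keeps the source scene's structure but takes on low-frequency appearance statistics (illumination, color cast) of the target, which is the training-free style mapping the summary describes.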