Event-based Low-illumination Image Enhancement
Published in: | IEEE Transactions on Multimedia, Vol. 26, pp. 1-12 |
Main Authors: | |
Format: | Journal Article |
Language: | English |
Published: | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01-01-2024 |
Subjects: | |
Summary: | Event cameras are bio-inspired vision sensors with a high dynamic range (140 dB for event cameras vs. 60 dB for traditional cameras) and can be used to tackle image degradation under extremely low-illumination scenarios, a problem that remains under-explored. In this paper, we propose a joint framework that combines the underexposed frames and event streams captured by an event camera to reconstruct clear images with detailed textures under near-dark conditions. A residual fusion module is proposed to reduce the domain gap between event streams and frames by using the residuals of both modalities. A multi-level reconstruction loss based on the variability of the contrast distribution is proposed to reduce the perceptual errors of the output image. In addition, we construct the first real-world low-illumination image enhancement dataset (captured mainly under 2 lux illumination), named LIE, containing event streams and frames collected in indoor and outdoor low-light scenarios together with ground-truth clear images. Experimental results on our LIE dataset demonstrate that the proposed method achieves significant improvements over existing methods. |
ISSN: | 1520-9210, 1941-0077 |
DOI: | 10.1109/TMM.2023.3290432 |
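The summary names two technical components without giving their details: a residual fusion module that narrows the domain gap between event streams and frames via the residuals of both modalities, and a multi-level reconstruction loss. The sketch below is a minimal PyTorch-style illustration of how such components might be structured; the names (`ResidualFusion`, `multi_level_loss`), the channel width, the pyramid depth, and the choice of an average-pooled L1 pyramid are assumptions for illustration, not the architecture or loss published in the paper.

```python
# Hypothetical sketch of the two components named in the abstract:
# (1) a residual fusion module mixing frame and event features via their residuals,
# (2) a multi-level (multi-scale) reconstruction loss.
# Layer widths, pyramid depth, and all names are assumptions, not the paper's design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualFusion(nn.Module):
    """Fuse frame features and event features through their cross-modal residuals."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # Project each modality's residual before mixing (assumed design choice).
        self.frame_res = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.event_res = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.mix = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, frame_feat: torch.Tensor, event_feat: torch.Tensor) -> torch.Tensor:
        # Each modality's residual against the other is projected, concatenated,
        # mixed, and added back to the frame branch as a residual connection.
        r_f = self.frame_res(frame_feat - event_feat)
        r_e = self.event_res(event_feat - frame_feat)
        fused = self.mix(torch.cat([r_f, r_e], dim=1))
        return frame_feat + fused


def multi_level_loss(pred: torch.Tensor, target: torch.Tensor, levels: int = 3) -> torch.Tensor:
    """L1 reconstruction loss accumulated over an average-pooled image pyramid."""
    loss = torch.zeros((), device=pred.device)
    for _ in range(levels):
        loss = loss + F.l1_loss(pred, target)
        pred = F.avg_pool2d(pred, 2)
        target = F.avg_pool2d(target, 2)
    return loss / levels


if __name__ == "__main__":
    fuse = ResidualFusion(channels=64)
    frame_feat = torch.randn(1, 64, 128, 128)   # features from the underexposed frame
    event_feat = torch.randn(1, 64, 128, 128)   # features from the voxelised event stream
    out = fuse(frame_feat, event_feat)
    print(out.shape, multi_level_loss(out, torch.randn_like(out)).item())
```

In this reading, fusing residuals rather than raw features lets the network attend to what each modality adds beyond the other, and the pyramid loss penalises errors at several scales so that both fine texture and global contrast of the reconstruction are supervised.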