Non-Contrastive Unsupervised Learning of Physiological Signals from Video
Main Authors: | , , , |
---|---|
Format: | Journal Article |
Language: | English |
Published: | 14-03-2023 |
Summary: | Subtle periodic signals such as blood volume pulse and respiration can be
extracted from RGB video, enabling remote health monitoring at low cost.
Advancements in remote pulse estimation -- or remote photoplethysmography
(rPPG) -- are currently driven by deep learning solutions. However, modern
approaches are trained and evaluated on benchmark datasets with associated
ground truth from contact-PPG sensors. We present the first non-contrastive
unsupervised learning framework for signal regression to break free from the
constraints of labelled video data. With minimal assumptions of periodicity and
finite bandwidth, our approach is capable of discovering the blood volume pulse
directly from unlabelled videos. We find that encouraging sparse power spectra
within normal physiological bandlimits and variance over batches of power
spectra is sufficient for learning visual features of periodic signals. We
perform the first experiments utilizing unlabelled video data not specifically
created for rPPG to train robust pulse rate estimators. Given the limited
inductive biases and impressive empirical results, the approach is
theoretically capable of discovering other periodic signals from video,
enabling multiple physiological measurements without the need for ground truth
signals. Code to fully reproduce the experiments is made available along with
the paper. |
---|---|
DOI: | 10.48550/arxiv.2303.07944 |
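
The summary sketches the core idea: with no ground truth signals, the model is trained only so that its predicted signals have power spectra that are sparse, confined to normal physiological bandlimits, and varied across a batch. Below is a minimal sketch of what such spectral loss terms could look like, assuming a PyTorch model that outputs one predicted waveform per video clip; the function names, band limits (0.66-3.0 Hz, roughly 40-180 bpm), peak-window width, and equal weighting are illustrative assumptions, not the paper's released implementation.

```python
import torch

def power_spectra(preds, fps):
    """One-sided, normalized power spectra of predicted signals.

    preds: (B, T) batch of predicted waveforms, one per video clip.
    Returns (B, F) spectra that sum to 1 per sample, and frequencies in Hz.
    """
    preds = preds - preds.mean(dim=1, keepdim=True)          # drop the DC term
    spec = torch.abs(torch.fft.rfft(preds, dim=1)) ** 2      # one-sided power
    spec = spec / (spec.sum(dim=1, keepdim=True) + 1e-8)     # normalize per sample
    freqs = torch.fft.rfftfreq(preds.shape[1], d=1.0 / fps).to(preds.device)
    return spec, freqs

def spectral_losses(preds, fps=30.0, low_hz=0.66, high_hz=3.0, peak_window_hz=0.2):
    """Bandwidth, sparsity, and batch-variance terms on the power spectra.

    The band (0.66-3.0 Hz, roughly 40-180 bpm) and peak window are
    illustrative choices, not necessarily the paper's hyperparameters.
    """
    spec, freqs = power_spectra(preds, fps)
    in_band = (freqs >= low_hz) & (freqs <= high_hz)

    # 1) Bandwidth: penalize power falling outside the physiological band.
    bandwidth_loss = spec[:, ~in_band].sum(dim=1).mean()

    # 2) Sparsity: in-band power should concentrate near each sample's peak.
    band_spec = spec[:, in_band]                              # (B, Fb)
    band_freqs = freqs[in_band]                               # (Fb,)
    peak_freq = band_freqs[band_spec.argmax(dim=1)]           # (B,)
    near_peak = (band_freqs[None, :] - peak_freq[:, None]).abs() <= peak_window_hz
    sparsity_loss = (band_spec * (~near_peak)).sum(dim=1).mean()

    # 3) Variance: push the batch-averaged in-band spectrum toward uniform,
    #    so the model cannot collapse to one frequency for every clip.
    mean_spec = band_spec.mean(dim=0)
    uniform = torch.full_like(mean_spec, 1.0 / mean_spec.numel())
    variance_loss = torch.sum((mean_spec - uniform) ** 2)

    return bandwidth_loss + sparsity_loss + variance_loss
```

In training, these terms would be summed (possibly with per-term weights) and backpropagated through the video encoder, so the spectral constraints alone shape the learned visual features.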