ADAPT: AI‐Driven Artefact Purging Technique for IMU Based Motion Capture

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 43, No. 8
Main Authors: Schreiner, P., Netterstrøm, R., Yin, H., Darkner, S., Erleben, K.
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01-12-2024
Description
Summary: While IMU-based motion capture offers a cost-effective alternative to premium camera-based systems, it often falls short of matching the latter's realism. Common distortions, such as self-penetrating body parts, foot skating, and floating, limit the usability of these systems, particularly for high-end users. To address this, we employed reinforcement learning to train an AI agent that mimics erroneous sample motion. Because our agent operates within a simulated environment and must adhere to the laws of physics, it inherently avoids generating these distortions. Impressively, the agent manages to mimic the sample motions while preserving their distinctive characteristics. We assessed our method's efficacy across various types of input data, showcasing an ideal blend of artefact-laden IMU-based data with high-grade optical motion capture data. Furthermore, we compared the configuration of observation and action spaces with other implementations, pinpointing the most suitable configuration for our purposes. All our models underwent rigorous evaluation using a spectrum of quantitative metrics complemented by a qualitative review. These evaluations were performed on a benchmark dataset of IMU-based motion data from actors not included in the training data.
ISSN: 0167-7055; 1467-8659
DOI: 10.1111/cgf.15172
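
Illustrative sketch (editorial addition, not from the paper): the core idea described in the summary, training a physically simulated character to imitate artefact-laden reference motion so that the physics simulation itself rules out self-penetration, foot skating, and floating, is commonly realised with a DeepMimic-style pose-tracking reward. The Python sketch below is a hypothetical, simplified example of such a reward; the function name, weights, and kernel widths are illustrative assumptions and do not reproduce the authors' actual formulation.

    import numpy as np

    # Hypothetical DeepMimic-style imitation reward (illustrative only).
    # The simulated character is rewarded for tracking the reference pose;
    # because it lives inside a physics simulation, it cannot reproduce
    # penetration, foot skating, or floating present in the reference clip.
    def imitation_reward(sim_joint_rot, ref_joint_rot,
                         sim_root_pos, ref_root_pos,
                         w_pose=0.7, w_root=0.3):
        """Return a reward in [0, 1]; higher when the simulated pose matches
        the reference.

        sim_joint_rot, ref_joint_rot : (J, 4) arrays of unit quaternions
        sim_root_pos, ref_root_pos   : (3,) root positions in world space
        Weights and kernel widths are made-up example values, not paper values.
        """
        # Pose term: simplified per-joint quaternion distance.
        pose_err = np.sum((sim_joint_rot - ref_joint_rot) ** 2)
        r_pose = np.exp(-2.0 * pose_err)

        # Root term: penalise drifting away from the reference trajectory.
        root_err = np.sum((sim_root_pos - ref_root_pos) ** 2)
        r_root = np.exp(-10.0 * root_err)

        return w_pose * r_pose + w_root * r_root

In a full setup, such a reward would be evaluated at every simulation step and optimised with an on-policy algorithm such as PPO; the article itself compares different observation- and action-space configurations for this kind of agent.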