Efficient Model-Based Anthropometry under Clothing Using Low-Cost Depth Sensors
Published in: Sensors (Basel, Switzerland), Vol. 24, no. 5, p. 1350
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 20-02-2024
Summary: Measuring human body dimensions is critical for many engineering and product design domains. Nonetheless, acquiring body dimension data for populations using typical anthropometric methods is challenging because manual methods are time-consuming. Three-dimensional (3D) whole-body scanning can be much faster, but it typically requires subjects to change into tight-fitting clothing, which increases time and cost and introduces privacy concerns. To address these and other issues in current anthropometry techniques, a measurement system was developed based on portable, low-cost depth cameras. Point-cloud data from the sensors are fit using a model-based method, Inscribed Fitting, which finds the most likely body shape in the statistical body shape space and provides accurate estimates of body characteristics. To evaluate the system, 144 young adults were measured manually and scanned with the system while wearing two levels of military ensembles. The results showed that prediction accuracy for the clothed scans remained at a level similar to the accuracy for the minimally clad scans. This approach will enable rapid measurement of clothed populations with reduced time compared with manual and typical scan-based methods.
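The core idea in the summary, fitting a statistical body shape space to depth-sensor point clouds, can be illustrated with a minimal sketch. This is not the authors' Inscribed Fitting algorithm (which additionally constrains the body to lie inside a clothed scan); it only shows the generic model-based step of solving for shape-space coefficients that best explain a scan, using a hypothetical PCA-style shape model with synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical statistical shape space: a body is the mean shape plus a
# linear combination of principal-component basis shapes (PCA-style).
n_points = 500       # vertices in the body model (illustrative)
n_components = 5     # retained shape components (illustrative)

mean_shape = rng.normal(size=(n_points, 3))
basis = rng.normal(size=(n_components, n_points, 3))

def synthesize(coeffs):
    """Reconstruct a body surface from shape-space coefficients."""
    return mean_shape + np.tensordot(coeffs, basis, axes=1)

# Simulated depth-camera point cloud: a "true" body plus sensor noise.
true_coeffs = np.array([1.2, -0.5, 0.3, 0.0, 0.8])
scan = synthesize(true_coeffs) + rng.normal(scale=0.01, size=(n_points, 3))

# Model-based fit: the reconstruction is linear in the coefficients, so
# the most likely shape (under Gaussian noise) is a least-squares solve.
A = basis.reshape(n_components, -1).T     # (3 * n_points, n_components)
b = (scan - mean_shape).ravel()
fit_coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.round(fit_coeffs, 2))
```

In a real system the relationship between pose, shape, and the observed surface is nonlinear and the scan covers clothing rather than skin, so the actual method must optimize iteratively under an inside-the-scan constraint; the linear solve above only conveys the shape-space estimation idea.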
ISSN: 1424-8220
DOI: 10.3390/s24051350