Deep Learning-Based Real-Time Auto Classification of Smartphone Measured Bridge Vibration Data


Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 20, No. 9, p. 2710
Main Authors: Shrestha, Ashish; Dang, Ji
Format: Journal Article
Language: English
Published: MDPI AG, Switzerland, 09-05-2020
Summary: In this study, a simple and customizable convolutional neural network (CNN) framework was used to train a vibration classification model that can be integrated into the measurement application to provide accurate, real-time bridge vibration status on mobile platforms. The inputs to the network model are the multichannel time-series signals acquired from the built-in accelerometer of a smartphone, while the outputs are predefined vibration categories. To verify the effectiveness of the proposed framework, data collected from long-term monitoring of a bridge were used to train a model, and its classification performance was evaluated on a test set comprising data from the same bridge that had not been used for training. An iOS application was developed that incorporates the trained model with its predefined classification labels, so that vibration data measured on other bridges can be classified in real time. The results demonstrate the practical feasibility of a low-latency, high-accuracy smartphone-based system in which the bottleneck of processing large amounts of data is eliminated and stable observation of structural conditions is promoted.
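The pipeline the abstract describes (multichannel accelerometer window in, vibration class out) can be sketched in NumPy. This is a minimal illustration under assumed shapes, not the paper's actual architecture: one 1-D convolution layer over a hypothetical 3-axis, 256-sample window, ReLU, global average pooling, and a softmax head over four illustrative vibration classes.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid cross-correlation along the time axis.
    x: (channels, samples); w: (filters, channels, kernel)."""
    f, c, k = w.shape
    n = x.shape[1] - k + 1
    out = np.empty((f, n))
    for i in range(n):
        # Contract each filter against the current time window
        out[:, i] = np.tensordot(w, x[:, i:i + k], axes=([1, 2], [0, 1]))
    return out

def classify(x, w, head):
    h = np.maximum(conv1d(x, w), 0.0)   # ReLU feature maps
    pooled = h.mean(axis=1)             # global average pooling over time
    logits = head @ pooled              # linear classification head
    e = np.exp(logits - logits.max())
    return e / e.sum()                  # softmax class probabilities

# Assumed dimensions: 3 accelerometer axes, 256-sample window,
# 16 filters of width 7, 4 vibration classes (all illustrative).
window = rng.standard_normal((3, 256))
w = rng.standard_normal((16, 3, 7)) * 0.1
head = rng.standard_normal((4, 16)) * 0.1
probs = classify(window, w, head)       # one probability per class
```

In the deployed system described in the abstract, weights of this kind would come from training on the long-term monitoring data and the forward pass would run on-device against live accelerometer streams; the random weights here only demonstrate the data flow and shapes.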
ISSN:1424-8220
DOI:10.3390/s20092710