A Novel Method to Generate Auto-Labeled Datasets for 3D Vehicle Identification Using a New Contrast Model
Published in: Applied Sciences, Vol. 13, no. 7, p. 4334
Main Authors:
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-04-2023
Subjects:
Summary: Auto-labeling is one of the main challenges in 3D vehicle detection. Auto-labeled datasets can be used to identify objects in LiDAR data, which is a challenging task due to the large size of the dataset. In this work, we propose a novel methodology for generating new 3D-based auto-labeled datasets with a viewpoint setup different from that used in the most widely recognized datasets (KITTI, Waymo, etc.). The performance of the methodology is further demonstrated by building our own dataset with the auto-generated labels, tested under boundary conditions with the sensor mounted in a fixed position on a bridge. The proposed methodology is based on a YOLO model trained on the KITTI dataset. Using camera-LiDAR sensor fusion, it auto-labels new datasets while maintaining the consistency of the ground truth. With respect to the manually labeled KITTI images, the model achieves F-scores of 0.957, 0.927, and 0.740 on the easy, moderate, and hard subsets of the dataset, respectively. The main contribution of this work is a novel methodology for auto-labeling autonomous driving datasets using YOLO as the main labeling system. The proposed methodology is tested under boundary conditions, and the results show that this approach can be easily adapted to a wide variety of problems when labeled datasets are not available.
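The camera-LiDAR fusion step described in the summary typically works by projecting LiDAR points into the image plane with the calibration matrices and keeping the points that fall inside a 2D detector box, so the 2D YOLO label can seed a 3D label. The record does not give the paper's exact procedure, so the following is only a minimal sketch of that standard KITTI-style projection; the matrix names (`Tr_velo_to_cam`, `R0_rect`, `P2`) follow the KITTI calibration convention, and the function names are hypothetical:

```python
import numpy as np

def project_lidar_to_image(points, Tr_velo_to_cam, R0_rect, P2):
    """Project LiDAR points (N, 3) into the camera image plane using
    KITTI-style calibration: Tr_velo_to_cam (3x4), R0_rect (3x3), P2 (3x4).
    Returns pixel coordinates (N, 2) and per-point depth in the camera frame."""
    n = points.shape[0]
    pts_h = np.hstack([points, np.ones((n, 1))])        # homogeneous (N, 4)
    cam = R0_rect @ (Tr_velo_to_cam @ pts_h.T)          # rectified camera frame (3, N)
    cam_h = np.vstack([cam, np.ones((1, n))])           # homogeneous (4, N)
    img = P2 @ cam_h                                    # image plane (3, N)
    uv = img[:2] / img[2]                               # normalize to pixels
    return uv.T, cam[2]

def points_in_box(uv, depth, box):
    """Boolean mask of points whose projection lies inside a 2D detector
    box (u1, v1, u2, v2) and which sit in front of the camera (depth > 0)."""
    u1, v1, u2, v2 = box
    return ((depth > 0)
            & (uv[:, 0] >= u1) & (uv[:, 0] <= u2)
            & (uv[:, 1] >= v1) & (uv[:, 1] <= v2))
```

In a pipeline like the one the summary describes, the mask returned by `points_in_box` would select the LiDAR points associated with each YOLO detection, from which a 3D box label can then be fitted.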
ISSN: 2076-3417
DOI: | 10.3390/app13074334 |