Lane detection in intelligent vehicle system using optimal 2-tier deep convolutional neural network

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 82, No. 5, pp. 7293-7317
Main Authors: Dewangan, Deepak Kumar; Sahu, Satya Prakash
Format: Journal Article
Language: English
Published: New York: Springer US / Springer Nature B.V., 01-02-2023
Description
Summary: Lane detection is an important module in Advanced Driver Assistance Systems (ADAS) and autonomous vehicles. Most lane detection methods focus on detecting lanes from a single image; they perform poorly under severe weather changes, and attaining high accuracy remains challenging. In this research work, a novel two-tier deep learning based lane detection framework is introduced for multiple images under different weather conditions. In both tiers, Local Vector Pattern (LVP) based texture features are extracted and an optimized Deep Convolutional Neural Network (DCNN) is used to classify road and lane regions. The weights of the second convolutional layer of the DCNN (in both tiers) are fine-tuned by a novel technique called the "Flight Straight of Moth Search (FS-MS) Algorithm", an enhanced version of the standard Moth Search (MS) Algorithm, to make the detection more accurate. The efficiency of the proposed work is compared with existing lane detection models on specific metrics. In particular, the computation time of the proposed model is 31.2%, 20.85%, 10.43%, and 4.53% higher than that of the existing MS + CNN, LA + CNN, GA + CNN, and PSO + CNN methods, respectively.
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-022-13425-7
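
Note: The summary above describes fine-tuning the second convolutional layer of the DCNN with a moth-search-style metaheuristic (FS-MS), but the record does not specify the algorithm itself. The Python sketch below only illustrates the general idea under stated assumptions: a toy linear classifier stands in for the tuned layer, a misclassification-rate fitness stands in for the detection loss, and the population size, step sizes, and the split between exploratory flights and "straight" flights toward the best moth are hypothetical choices, not the authors' FS-MS specification.

    import numpy as np

    # Toy data standing in for LVP texture features and road/lane labels.
    rng = np.random.default_rng(0)
    n_weights = 32
    X = rng.normal(size=(200, n_weights))                    # feature vectors
    y = (X @ rng.normal(size=n_weights) > 0).astype(float)   # binary labels

    def fitness(w):
        # Misclassification rate of a linear proxy for the tuned layer (lower is better).
        pred = (X @ w > 0).astype(float)
        return np.mean(pred != y)

    def moth_search(n_moths=20, n_iter=100, step=0.3):
        # Population of candidate weight vectors ("moths").
        moths = rng.normal(size=(n_moths, n_weights))
        best = min(moths, key=fitness).copy()
        for _ in range(n_iter):
            for i in range(n_moths):
                if i < n_moths // 2:
                    # Exploratory, heavy-tailed random flight.
                    cand = moths[i] + step * rng.standard_cauchy(n_weights)
                else:
                    # "Straight flight" toward the best solution found so far.
                    cand = moths[i] + step * rng.random() * (best - moths[i])
                if fitness(cand) < fitness(moths[i]):
                    moths[i] = cand                          # greedy acceptance
            best = min(moths, key=fitness).copy()
        return best, fitness(best)

    w_opt, err = moth_search()
    print(f"classification error after search: {err:.3f}")

In the actual framework, the candidate vectors would parameterize the second convolutional layer of each tier's DCNN and the fitness would be measured on road/lane classification of LVP features; the linear proxy here only keeps the sketch self-contained.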