Efficient lightweight residual network for real-time road semantic segmentation

Amine Kherraki, Muaz Maqbool, Rajae El Ouazzani

Abstract


Intelligent transportation systems (ITS) are currently among the most discussed topics in scientific research. ITS offer advanced monitoring capabilities such as vehicle counting and pedestrian detection. Lately, convolutional neural networks (CNNs) have been extensively used in computer vision tasks, including segmentation, classification, and detection. In fact, image semantic segmentation is a critical issue in computer vision applications. For example, self-driving vehicles require high accuracy with low parameter requirements to segment road scene objects in real-time. However, most related works focus on only one side, accuracy or parameter requirements, which makes their CNN models difficult to use in real-time applications. To resolve this issue, we propose the efficient lightweight residual network (ELRNet), a novel CNN model with an asymmetrical encoder-decoder architecture. In this network, we compare four variants of the proposed factorized block and three loss functions to find the best combination. In addition, the proposed model is trained from scratch using only 0.61M parameters. All experiments are evaluated on the popular public Cambridge-driving Labeled Video Database (CamVid) road scene dataset, and the obtained results show that ELRNet achieves better performance in terms of parameter requirements and precision compared to related works.
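The abstract does not detail the factorized block, but the parameter savings that motivate factorized (asymmetric) convolutions in lightweight segmentation networks can be sketched with simple arithmetic: replacing one k×k convolution with a k×1 followed by a 1×k convolution reduces the weight count from k² to 2k per channel pair. The channel count below (64) is an illustrative assumption, not a figure from the paper.

```python
def conv_params(c_in: int, c_out: int, kh: int, kw: int, bias: bool = False) -> int:
    """Parameter count of a 2D convolution layer with the given kernel size."""
    return c_out * c_in * kh * kw + (c_out if bias else 0)

# Standard 3x3 convolution on 64 channels (hypothetical width).
standard = conv_params(64, 64, 3, 3)                                 # 36864 weights
# Factorized alternative: a 3x1 convolution followed by a 1x3 convolution.
factorized = conv_params(64, 64, 3, 1) + conv_params(64, 64, 1, 3)   # 24576 weights
savings = 1 - factorized / standard                                  # one third fewer
```

Repeated across many blocks, this one-third reduction per convolution is one way such networks stay under a million parameters while preserving the 3×3 receptive field.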

Keywords


Computer vision; Convolutional neural network; Deep learning; Semantic segmentation; Autonomous driving


DOI: http://doi.org/10.11591/ijai.v12.i1.pp394-401

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938 
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
