An algorithm for training neural networks with L1 regularization
Abstract
This paper presents a new algorithm for building neural network models that automatically selects the most important features and parameters while improving prediction accuracy. Traditional neural networks often use all available input parameters, which leads to complex models that are slow to train and prone to overfitting. The proposed algorithm addresses this problem by automatically identifying and retaining only the most significant parameters during training, producing simpler, faster, and more accurate models. We demonstrate the practical benefits of the algorithm on two real-world applications: stock market forecasting using the Wilshire index and business profitability prediction based on company financial data. The results show significant improvements over conventional methods: the resulting models use fewer parameters, yielding simpler and more interpretable solutions; achieve better prediction accuracy; and require less training time. These advantages make the algorithm particularly valuable for business applications where model simplicity, speed, and accuracy are crucial. The method is especially beneficial for organizations that have limited computational resources or require fast model deployment. By automatically selecting the most relevant features, it reduces the need for manual feature engineering and helps practitioners build efficient predictive models without deep expertise in neural network optimization.
Keywords
Dropout; Inverse problem; L1 regularization; Neural network; Optimization method; Pruning
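The abstract describes parameter selection driven by an L1 penalty followed by pruning. As a rough illustration of that general idea only (not the authors' algorithm, data, or settings), the sketch below trains a small network with an L1 subgradient term and then removes near-zero weights; the layer sizes, learning rate, penalty strength, and pruning threshold are all made-up values.

```python
# Minimal sketch, assuming a one-hidden-layer regression network trained with
# gradient descent plus an L1 penalty, followed by magnitude pruning.
# All hyperparameters and data below are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                              # 200 samples, 10 candidate inputs
y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)    # only inputs 0 and 3 matter

W1 = rng.normal(scale=0.1, size=(10, 8))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(8, 1))    # hidden -> output weights
lr, lam = 0.01, 0.01                       # learning rate and L1 strength (assumed)

for _ in range(2000):
    H = np.maximum(X @ W1, 0.0)            # ReLU hidden layer
    pred = (H @ W2).ravel()
    err = pred - y                         # gradient of squared error w.r.t. pred (up to a constant)

    # Backpropagate the squared-error loss and add the L1 subgradient lam * sign(W).
    dW2 = H.T @ err[:, None] / len(y) + lam * np.sign(W2)
    dH = err[:, None] @ W2.T * (H > 0)
    dW1 = X.T @ dH / len(y) + lam * np.sign(W1)
    W1 -= lr * dW1
    W2 -= lr * dW2

# The L1 term drives irrelevant weights toward zero; pruning removes them explicitly.
W1[np.abs(W1) < 1e-3] = 0.0
kept_inputs = np.flatnonzero(np.abs(W1).sum(axis=1) > 0)
print("inputs retained:", kept_inputs)     # ideally concentrates on features 0 and 3
```

In this toy setup the count of retained inputs serves as a crude proxy for the "fewer parameters, simpler model" benefit the abstract claims; the paper's own procedure for deciding which parameters are significant may differ.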
Copyright (c) 2025 Ekaterina Gribanova, Roman Gerasimov
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES).