A hybrid approach for face recognition using a convolutional neural network combined with feature extraction techniques
Abstract
Facial recognition technology is used in many fields, such as security, biometric identification, robotics, video surveillance, health, and commerce, owing to its ease of implementation and short data-processing time. However, its performance degrades in the presence of variations such as pose, lighting, or occlusion. In this paper, we propose a new approach to improve the accuracy of face recognition under such variations and occlusions by combining feature extraction techniques, namely the histogram of oriented gradients (HOG), the scale-invariant feature transform (SIFT), Gabor filters, and the Canny edge detector, with a convolutional neural network (CNN) architecture. The CNN is tested with several combinations of activation function (Softmax and Sigmoid) and training optimizer (Adam, Adamax, RMSprop, and stochastic gradient descent (SGD)). First, preprocessing is applied to the two face databases used, ORL (Our Database of Faces) and the Sheffield face database; features are then extracted with the techniques above and passed to our CNN architecture. Our simulation results show the high performance of the SIFT+CNN combination, which reaches an accuracy of up to 100% in the presence of variations.
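To illustrate the feature-extraction stage described in the abstract, the sketch below computes a simple histogram-of-oriented-gradients (HOG) descriptor in pure numpy. This is a minimal illustration of the HOG technique in general, not the authors' exact pipeline; the cell size, bin count, and the random stand-in image are assumptions.

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Simple HOG descriptor: per-cell orientation histograms,
    weighted by gradient magnitude and L2-normalized.

    Illustrative sketch only; cell/bins values are assumptions,
    not the settings used in the paper.
    """
    # Central-difference gradients (borders left at zero).
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation in [0, 180)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # per-cell norm
    return np.concatenate(feats)

rng = np.random.default_rng(0)
face = rng.random((32, 32))   # stand-in for a preprocessed face crop
desc = hog_features(face)
print(desc.shape)             # 4x4 cells x 9 bins -> (144,)
```

The resulting descriptor vector would then be fed to the classifier (in the paper's approach, a CNN) instead of, or alongside, the raw pixels.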
Keywords
Activation function; Convolutional neural network; Deep learning; Facial recognition; Optimization algorithms; Scale invariant feature transform+convolutional neural network
Full Text: PDF
DOI: http://doi.org/10.11591/ijai.v12.i2.pp627-640
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).