Signature verification based on a proposed fast hyper deep neural network
Abstract
Handwritten signature verification systems are widely used in industries such as banking, education, legal proceedings, and criminal investigation, where verification and identification are essential. In this research, we developed an accurate offline signature verification model for the writer-independent scenario. First, the handwritten signature images pass through four preprocessing stages to make them suitable for extracting distinctive features. Then, three types of features are extracted from the signature images to build a hybrid feature vector for each image: principal component analysis (PCA) for appearance-based features, the gray-level co-occurrence matrix (GLCM) for texture features, and the fast Fourier transform (FFT) for frequency features. Finally, to classify the signature features, we designed a fast hyper deep neural network (FHDNN) architecture. Two datasets, SigComp2011 and CEDAR, are used to evaluate the model. The results demonstrate that the proposed model achieves an accuracy of 100%, outperforming several earlier approaches. In terms of precision, recall, and F-score, it delivers very good results on both datasets, reaching 1.00, 0.487, and 0.655, respectively, on the SigComp2011 dataset and 1.00, 0.507, and 0.672, respectively, on the CEDAR dataset.
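As a rough illustration of the hybrid feature vector described in the abstract, the sketch below combines PCA (appearance), GLCM (texture), and FFT (frequency) features using NumPy, scikit-learn, and scikit-image. The abstract does not specify the four preprocessing stages, the GLCM parameters, the PCA dimensionality, or which FFT coefficients are retained, so the concrete choices here (n_pca=64, a single pixel offset with four angles, the 64 largest FFT magnitudes) are illustrative assumptions, not the authors' configuration.

# Minimal sketch of a PCA + GLCM + FFT hybrid feature vector.
# Assumes preprocessed grayscale signature images as uint8 arrays.
import numpy as np
from sklearn.decomposition import PCA
from skimage.feature import graycomatrix, graycoprops

def glcm_features(img_u8):
    """Texture features from a gray-level co-occurrence matrix
    (one pixel offset, four angles -- illustrative parameters)."""
    glcm = graycomatrix(img_u8, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "dissimilarity", "homogeneity",
             "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

def fft_features(img_u8, k=64):
    """Frequency features: the k largest FFT magnitudes
    (an illustrative way to pick frequency descriptors)."""
    mag = np.abs(np.fft.fft2(img_u8))
    return np.sort(mag.ravel())[-k:]

def hybrid_features(images_u8, n_pca=64):
    """images_u8: array of shape (n_samples, H, W), dtype uint8.
    n_pca must not exceed the number of samples or pixels."""
    flat = images_u8.reshape(len(images_u8), -1).astype(float)
    pca_feats = PCA(n_components=n_pca).fit_transform(flat)        # appearance
    glcm_feats = np.stack([glcm_features(im) for im in images_u8])  # texture
    fft_feats = np.stack([fft_features(im) for im in images_u8])    # frequency
    return np.hstack([pca_feats, glcm_feats, fft_feats])

The resulting per-image vectors would then be fed to the classifier; the FHDNN architecture itself is not detailed in the abstract, so no classifier sketch is attempted here.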
Keywords
Deep learning; Fast Fourier transform; Gray-level co-occurrence matrix; Principal component analysis; Signature verification
DOI: http://doi.org/10.11591/ijai.v13.i1.pp961-973
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938