Comparison of classifiers using robust features for depression detection on Bahasa Malaysia speech
Abstract
Early detection of depression enables rapid intervention and reduces escalation of the disorder. The conventional method requires patients to seek diagnosis and treatment by visiting a trained clinician. Biosensor technology, such as automatic depression detection from speech, can assist early diagnosis by remotely screening those at risk. In this research, we focus on detecting depression in Bahasa Malaysia speech, using signals recorded remotely via subjects' personal mobile devices. Speech recordings from 43 depressed subjects and 47 healthy subjects were gathered via an online platform, with diagnoses validated according to the Malay Beck Depression Inventory II (Malay BDI-II), the Patient Health Questionnaire (PHQ-9), and subjects' declarations of a major depressive disorder (MDD) diagnosis by a trained clinician. Classifier models were compared using a time-based and spectrum-based, microphone-independent feature set with hyperparameter tuning. Random forest performed best on male reading speech with 73% accuracy, while the support vector machine performed best on male spontaneous speech and female reading speech with 74% and 73% accuracy, respectively. Automatic depression detection in Bahasa Malaysia has shown promise using machine learning and microphone-independent features, but a larger database is necessary for further validation and improved performance.
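The classifier comparison described above (random forest vs. support vector machine with hyperparameter tuning) can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the feature matrix is synthetic random data standing in for the time- and spectrum-based acoustic features, and the hyperparameter grids and train/test split are assumptions.

```python
# Hedged sketch of a classifier comparison with hyperparameter tuning,
# in the spirit of the study's random forest vs. SVM evaluation.
# All data and grids here are placeholders, not the paper's actual setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
# Synthetic stand-in: 90 recordings x 20 acoustic features
X = rng.normal(size=(90, 20))
y = rng.integers(0, 2, size=90)  # 0 = healthy, 1 = depressed (synthetic labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = {
    "random_forest": GridSearchCV(
        RandomForestClassifier(random_state=0),
        {"n_estimators": [100, 300], "max_depth": [None, 10]},
        cv=5,
    ),
    "svm": GridSearchCV(
        # SVMs are scale-sensitive, so standardize features first
        Pipeline([("scale", StandardScaler()), ("clf", SVC())]),
        {"clf__C": [0.1, 1, 10], "clf__kernel": ["rbf", "linear"]},
        cv=5,
    ),
}

scores = {}
for name, search in candidates.items():
    search.fit(X_tr, y_tr)           # grid search with 5-fold cross-validation
    scores[name] = search.score(X_te, y_te)  # held-out accuracy
print(scores)
```

With real acoustic features in place of the random matrix, the same loop would reproduce the paper's style of per-model accuracy comparison on a held-out set.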
Keywords
Classifier; Depression; Features; Robust; Speech
DOI: http://doi.org/10.11591/ijai.v11.i1.pp238-253
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).