Image-based Gramian angular field processing for pedestrian stride-length estimation using a convolutional neural network

Pham Doan Tinh, Bui Huy Hoang, Nguyen Duc Cuong


In an age when people spend most of their time indoors and smartphones have become a necessity, there is an increasing demand for determining a user's absolute position in indoor environments. While the global positioning system (GPS) performs well outdoors, its inaccuracy cannot be tolerated in places where the GPS signal is weak or barely detectable. This has led to a number of solutions that utilize the smartphone inertial measurement unit (IMU) to track user location. Most IMU-based methods track a person's trajectory using stride-length and heading estimation, so the accuracy of stride-length estimation plays a very important role in these methods. Inspired by recent successes in computer vision and machine learning, we propose an image-based stride-length estimation method that employs the Gramian angular field (GAF) to convert accelerometer data into images, which are then fed into a convolutional neural network (CNN) to predict the stride length. We evaluate the performance of our proposed method on a public dataset published by Qu Wang in his GitHub repository. The results show that, using only accelerometer data, our proposed method is more accurate than existing methods both per stride and over long walking distances.
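To make the GAF step concrete, the sketch below encodes a one-dimensional accelerometer window as a Gramian angular field image using the standard GAF construction (rescale to [-1, 1], map samples to polar angles, then form pairwise trigonometric sums or differences). The function name, normalization details, and window size are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def gramian_angular_field(x, summation=True):
    """Encode a 1-D signal as a Gramian angular field image.

    Minimal sketch of the standard GAF encoding; normalization
    choices here are assumptions, not the paper's exact pipeline.
    """
    x = np.asarray(x, dtype=float)
    # Rescale the series into [-1, 1] so arccos is well defined.
    x_min, x_max = x.min(), x.max()
    x_scaled = 2.0 * (x - x_min) / (x_max - x_min) - 1.0
    x_scaled = np.clip(x_scaled, -1.0, 1.0)
    # Represent each sample as an angle in polar coordinates.
    phi = np.arccos(x_scaled)
    if summation:
        # Gramian angular summation field: cos(phi_i + phi_j).
        return np.cos(phi[:, None] + phi[None, :])
    # Gramian angular difference field: sin(phi_i - phi_j).
    return np.sin(phi[:, None] - phi[None, :])

# Example: encode one synthetic stride-length window of 64 samples.
window = np.sin(np.linspace(0.0, 2.0 * np.pi, 64))
image = gramian_angular_field(window)
print(image.shape)  # (64, 64)
```

The resulting square image can then be treated like any single-channel input to a CNN; one image per detected stride is the natural pairing with a per-stride length label.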


Convolutional neural network, Gramian angular field, Image-based, Pedestrian stride-length

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.