Research on a Wearable Device for Motion Recognition and Position Tracking

A practical solution for indoor navigation.

Supervisor: Dr. Weiping Ni, Lecturer at the University of Nottingham

Time: May to June 2018

I carried out this project from May to June 2018 with Dr. Weiping Ni, who is also my personal tutor; she received her master's and PhD degrees in Electronics and Computer Engineering from Cornell University.

1 Overview

During the summer, we focused on motion recognition and position tracking using an inertial measurement unit (IMU) sensor and an Arduino board. Because we were not dealing with arm movement or fine finger movement for gesture recognition, data acquisition could take full advantage of a relatively low sampling rate: by the Nyquist theorem, a sampling frequency of about 50–100 Hz is sufficient for Activities of Daily Living (ADL). Data acquisition for position tracking can be further reduced based on the currently recognized activity: if the current motion is stationary, there is no need to update the location; the computation of a new location is invoked only when active motion occurs.

The gait analysis was based on ADL data collected at a sampling frequency of 100 Hz from volunteers of different ages and health conditions, with arbitrary IMU wearing orientations. MATLAB was used for movement analysis, algorithm development and verification.
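The activity-gated update described above can be sketched as follows. This is a minimal Python illustration (the project itself used MATLAB), and the variance and mean-magnitude thresholds are assumed values for the sketch, not the project's calibrated parameters: a window is treated as stationary when the acceleration magnitude is steady and close to gravity, and the dead-reckoning integration runs only otherwise.

```python
import numpy as np

G = 9.81  # gravity magnitude (m/s^2)

def is_stationary(accel_window, var_thresh=0.05, mean_tol=0.3):
    """Treat a window as stationary when the measured acceleration
    magnitude is steady (low variance) and close to gravity."""
    mag = np.linalg.norm(accel_window, axis=1)
    return np.var(mag) < var_thresh and abs(mag.mean() - G) < mean_tol

def gated_position_update(position, velocity, accel_window, dt):
    """Skip the dead-reckoning integration while stationary; integrate
    acceleration only when active motion is detected."""
    if is_stationary(accel_window):
        return position, np.zeros(3)           # zero-velocity update
    a = accel_window.mean(axis=0) - np.array([0.0, 0.0, G])
    velocity = velocity + a * dt               # first integration
    position = position + velocity * dt        # second integration
    return position, velocity
```

Resetting the velocity to zero during stationary windows also bounds the drift that otherwise accumulates from double-integrating accelerometer noise.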

An attitude and heading reference system (AHRS) was used, consisting of sensors on three axes that provide attitude information for a moving object, including roll, pitch and yaw. The key difference between an IMU and an AHRS is the addition of an on-board processing system in an AHRS that provides attitude and heading information. We are going to develop a sensor fusion algorithm based on the sensor data delivered by the IMU to realize an AHRS and test its performance. With orientation determined, an AHRS can also form part of an inertial navigation system, which uses a computer and motion/rotation/magnetic sensors to continuously calculate, by dead reckoning, the position, orientation and velocity of a moving object without the need for external references.
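One simple form such a sensor fusion algorithm can take is a complementary filter, shown below as a hedged Python sketch (the project used MATLAB, and the blend weight `alpha=0.98` is an illustrative value): the gyroscope integration is accurate over short intervals but drifts, while the accelerometer tilt is noisy but drift-free, so the two are blended.

```python
import numpy as np

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """Estimate roll and pitch by blending integrated gyroscope rates
    (accurate short-term, but drifting) with accelerometer tilt
    (noisy short-term, but drift-free).
    gyro: (N, 3) angular rates in rad/s; accel: (N, 3) in m/s^2."""
    roll, pitch = 0.0, 0.0
    angles = []
    for w, a in zip(gyro, accel):
        # Tilt inferred from the gravity direction in the accelerometer
        roll_acc = np.arctan2(a[1], a[2])
        pitch_acc = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        # Weighted blend: alpha trusts the gyro, (1 - alpha) the accel
        roll = alpha * (roll + w[0] * dt) + (1 - alpha) * roll_acc
        pitch = alpha * (pitch + w[1] * dt) + (1 - alpha) * pitch_acc
        angles.append((roll, pitch))
    return np.array(angles)
```

Yaw cannot be corrected this way, since gravity carries no heading information; that is where the magnetometer enters the fusion.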

In the next step, we will develop the stair ambulation recognition algorithm by combining information from motion accelerations, trunk tilt angular velocity, body attitude, and body elevation [1, 2]. Sensor fusion for position tracking will be done through an Extended Kalman Filter with Maximum Likelihood Estimation. We will need to validate the robustness of the developed algorithm against corner cases such as fast sitting or fast lying down, to check for faulty fall detections. This provides a practical solution for indoor navigation when the GPS signal is weak. We will study possible ways to build an indoor navigation system and evaluate its performance.
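The planned Extended Kalman Filter is nonlinear and multi-state; as a much-reduced illustration of its predict/update cycle, here is a scalar linear Kalman filter in Python (an assumed stand-in for exposition, not the project's filter; the noise parameters `q` and `r` are illustrative).

```python
def kalman_1d(z_measurements, q=1e-3, r=0.5):
    """Scalar Kalman filter with a constant-state model: process noise
    variance q, measurement noise variance r. Each step predicts
    (uncertainty grows by q) and then updates with the measurement."""
    x, p = 0.0, 1.0           # state estimate and its variance
    estimates = []
    for z in z_measurements:
        p = p + q             # predict: state constant, uncertainty grows
        k = p / (p + r)       # Kalman gain: how much to trust z
        x = x + k * (z - x)   # update with the measurement innovation
        p = (1 - k) * p       # variance shrinks after the update
        estimates.append(x)
    return estimates
```

The EKF generalizes this by linearizing a nonlinear motion/measurement model around the current estimate at every step, with vector states (position, velocity, attitude) in place of the scalar `x`.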

2 Methodology

2.1 Inertial Measurement Units (IMUs)

Inertial navigation is a self-contained navigation technique in which measurements provided by accelerometers (MEMS sensors) and gyroscopes are used to track the position and orientation of an object relative to a known starting point, orientation and velocity.

Inertial measurement units (IMUs) typically contain three orthogonal rate gyroscopes and three orthogonal accelerometers, measuring angular velocity and linear acceleration respectively. By processing the signals from these devices it is possible to track the position and orientation of the device. Sensor fusion using magnetometers can reduce the average position error obtained by the system.
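The magnetometer's contribution is chiefly to heading: once roll and pitch are known, the body-frame magnetic vector can be projected onto the horizontal plane and the yaw read off. A Python sketch, using one common roll-about-x / pitch-about-y convention (axis and sign conventions vary between references, so treat this as an assumed convention for illustration):

```python
import numpy as np

def tilt_compensated_heading(mag, roll, pitch):
    """Rotate the body-frame magnetometer vector into the horizontal
    plane using roll/pitch, then take the heading from its components.
    mag: 3-vector in the body frame (any consistent unit)."""
    mx, my, mz = mag
    # Horizontal components after undoing the tilt
    xh = mx * np.cos(pitch) + mz * np.sin(pitch)
    yh = (mx * np.sin(roll) * np.sin(pitch)
          + my * np.cos(roll)
          - mz * np.sin(roll) * np.cos(pitch))
    return np.arctan2(-yh, xh)   # heading in radians, 0 = magnetic north
```

Without this tilt compensation, any roll or pitch of the wearable leaks into the computed heading, which is one source of the position error the magnetometer fusion helps reduce.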

2.2 Inertial Navigation System

When the sensor is subjected to low acceleration, the accelerometer measurement is dominated by the gravity component, from which the tilt information can be inferred.
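Concretely, when the reading is close to pure gravity, roll and pitch follow directly from its direction. A minimal Python sketch (the project's analysis was done in MATLAB):

```python
import numpy as np

def tilt_from_accel(a):
    """Infer roll and pitch from an accelerometer reading dominated by
    gravity (|a| close to g). Yaw is not observable from gravity alone."""
    ax, ay, az = a
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return roll, pitch
```

This inference breaks down under strong linear acceleration, which is exactly why the gyroscope is trusted during dynamic motion and the accelerometer during quiet intervals.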

3 My Work During the Two Months

(1) Built and tested the AHRS based on the IMU and developed a candidate indoor navigation algorithm.

(2) Developed the hardware and software interface to correctly set up the IMU measurements and properly calibrate the IMU sensors.

(3) Performed the signal processing to build the AHRS, including sensor fusion algorithm development.

(4) Implemented the algorithms and developed a testbench for verification and performance evaluation.


[1] D. Roetenberg, H. J. Luinge, C. T. M. Baten, and P. H. Veltink, “Compensation of magnetic disturbances improves inertial and magnetic sensing of human body segment orientation”, IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 13, no. 3, 2005, pp. 395–405.

[2] B. Najafi, K. Aminian, A. Paraschiv-Ionescu, F. Loew, C. J. Bula, and P. Robert, “Ambulatory system for human motion analysis using a kinematic sensor: monitoring of daily physical activity in the elderly”, IEEE Transactions on Biomedical Engineering, vol. 50, no. 6, 2003, pp. 711–723.

Master of Science in Artificial Intelligence at Nanyang Technological University