|Title:||The development of a gait speed detection system for older adults using video-based processing|
|Abstract:||© 2019 Association for Computing Machinery. This study aimed to develop a gait speed detection system for measuring the instantaneous walking speed of older adults. The proposed system employed a standard 60 Hz camera, fixed on a tripod with a 3-way head, to capture body motion, and its instantaneous horizontal speed measurements were validated against a three-dimensional motion analysis system. A cross-sectional design was used. The proposed system consists of ten steps: (1) input video, (2) frame extraction, (3) calibration of the camera and the capture volume, (4) colour detection and filling of the body region, (5) human body region detection, (6) filtering of foreground regions from image differencing, (7) detection of the centroid of the human body, (8) identification of the human body position, (9) feature tracking of the human speed, and (10) estimation of the human speed. The system was developed and tested in MATLAB (2015a) with the Computer Vision Toolbox and the Image Processing Toolbox. Fifteen older adults (mean age 67 years, SD = 4.19) performed three walking conditions: (1) walking at a slow speed, (2) walking at usual speed, and (3) walking at a fast speed. Participants walked along a 10-metre walkway in a motion capture laboratory. The results demonstrate that the proposed system's measures correlate excellently with those of the motion analysis system, with correlation coefficients between 0.936 and 0.987. Hence, the proposed system is a useful tool for assessing instantaneous walking speed among older adults in both clinical and community settings.|
|Appears in Collections:||CMUL: Journal Articles|
Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.
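The core of the pipeline described in the abstract (steps 6 to 10: filtering foreground regions from image differencing, finding the body centroid, and estimating speed) can be sketched roughly as below. This is an illustrative Python/NumPy sketch, not the authors' implementation (which was in MATLAB with the Computer Vision and Image Processing Toolboxes); the function names, the intensity threshold, and the pixels-per-metre scale are all assumptions, with only the 60 Hz frame rate taken from the abstract.

```python
import numpy as np

FPS = 60.0                 # camera frame rate, from the abstract
PIXELS_PER_METRE = 250.0   # assumed scale from step (3) camera calibration
DIFF_THRESHOLD = 25        # assumed intensity threshold for foreground pixels

def foreground_centroid(prev_frame, frame):
    """Centroid (x, y) of the moving region between two greyscale frames,
    found by absolute frame differencing and thresholding (steps 6-7)."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    ys, xs = np.nonzero(diff > DIFF_THRESHOLD)
    if xs.size == 0:
        return None  # no motion detected between these frames
    return xs.mean(), ys.mean()

def instantaneous_speed(centroids):
    """Instantaneous horizontal speed (m/s) from successive centroid
    x-positions (steps 8-10): pixel displacement per frame, converted
    to metres and multiplied by the frame rate."""
    speeds = []
    for (x0, _), (x1, _) in zip(centroids, centroids[1:]):
        dx_metres = abs(x1 - x0) / PIXELS_PER_METRE
        speeds.append(dx_metres * FPS)
    return speeds
```

For example, a subject whose foreground centroid advances 5 pixels per frame would, under the assumed 250 px/m calibration, be walking at 5 / 250 × 60 = 1.2 m/s.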