A.R.G.! Augmented reality and gait: Cues elicit gait adaptations in AR

Abstract

INTRODUCTION Use of augmented reality (AR) technology for rehabilitation has drastically increased in recent years. Evidence has shown that visual and auditory cues alter spatial and temporal gait parameters, respectively [1]. By providing visual and auditory cues while walking, AR can be used to cue spatiotemporal gait adaptations with applications to clinical populations such as those with Parkinson's disease [2-4]. However, given the novelty of the technology, it is unknown how cues delivered through AR drive gait adaptations. The purpose of this study was to assess the feasibility of using visual and auditory cues delivered through AR to alter spatiotemporal gait outcomes in a healthy, young population.

METHODS 20 healthy participants (7 F/13 M; 25.5 ± 4 yrs) were recruited to walk for 10 steps under four cueing conditions using an AR headset: No Cues (NC) (i.e., natural gait), Visual (V), Auditory (A), and Visual + Auditory (VA). Each condition was completed three times in random order, for a total of 12 trials. An inertial measurement unit (IMU) system with integrated footswitches was used to collect spatiotemporal gait data at 200 Hz. The System Usability Scale (SUS) was administered afterwards to determine the usability of our novel application, and linear regressions were performed to determine the relationship between reported usability and gait variability.
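As a rough illustration of this analysis pipeline, the Python sketch below shows how step times, cadence, and a coefficient of variation can be derived from footswitch heel-strike times, and how a usability-versus-gait regression might be run. All numbers, and the use of scipy's linregress, are illustrative assumptions, not the study's actual code or data.

```python
import numpy as np
from scipy import stats

# Hypothetical heel-strike times (s) for one 10-step trial; in practice these
# would come from the rising edges of the IMU footswitch signal sampled at 200 Hz.
heel_strikes = np.array([0.00, 1.52, 3.01, 4.55, 6.02, 7.50, 9.03, 10.51, 12.04, 13.55])

step_times = np.diff(heel_strikes)                      # time between consecutive steps (s)
cadence = 1.0 / step_times                              # instantaneous cadence (steps/s)
cv = np.std(step_times, ddof=1) / np.mean(step_times)   # coefficient of variation

print(f"mean cadence: {cadence.mean():.2f} steps/s, CV: {cv:.3f}")

# Simple linear regression of mean stance time (s) on SUS score (0-100),
# mirroring the usability-vs-variability analysis; values are made up.
sus_scores   = np.array([62.5, 70.0, 77.5, 80.0, 85.0, 90.0])
stance_times = np.array([0.68, 0.66, 0.64, 0.63, 0.61, 0.60])

res = stats.linregress(sus_scores, stance_times)
print(f"slope = {res.slope:.4f}, R^2 = {res.rvalue**2:.3f}, p = {res.pvalue:.3f}")
```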

RESULTS AND DISCUSSION All cueing conditions exhibited a significantly faster cadence (V, A, VA = 0.67, 0.68, 0.68 steps/sec, respectively) compared to NC trials (0.63 steps/sec; p < 0.05). Surprisingly, cadence variability was significantly higher for A trials (coefficient of variation (CV) = 0.11) than for the other three conditions (NC, V, VA = 0.10, 0.09, 0.09; p < 0.05). V trials exhibited significantly shorter stride lengths (2.13 m) compared to NC (2.30 m; p < 0.05). Higher reported system usability was significantly correlated with shorter stance time across A trials (adjusted R² = 0.262, β = -0.549, p = 0.012).
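For reference, the variability and regression statistics reported above follow the standard definitions (a brief recap, not new analysis):

\[
\mathrm{CV} = \frac{\sigma}{\mu}, \qquad R^2_{\mathrm{adj}} = 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1}
\]

where σ and μ are the standard deviation and mean of the gait parameter across steps, n is the number of observations, and p is the number of predictors (here, p = 1).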

CONCLUSIONS Our findings reinforce that certain visual and auditory cues affect gait parameters, albeit in some cases in a direction opposite to what was hypothesized (e.g., greater cadence variability with auditory cues). These results provide insight into how healthy populations respond to cues delivered through AR, laying a foundation for future studies implementing AR with clinical populations such as those with Parkinson's disease.

ACKNOWLEDGEMENTS Thanks to Alex Krause, Nicholas Cando, Joshua Vicente, Rabbani Nzeza, and other members of the Move-Learn Lab (MLL) and Positive Augmented Research and Development (PAR-D) Lab for assistance with app development and data collection. This work was supported by the Department of Defense (Contract W911NF2110273).

REFERENCES [1] Vaz J.R. et al. Front. Physiol. 11:1-10, 2020. [2] Ginis P. et al. Ann. Phys. Rehabil. Med. 61:407-413, 2018. [3] Dibble L.E. et al. Gait Posture 19:215-225, 2004. [4] Wittwer J.E. et al. Gait Posture 37:219-222, 2013.

Date: Aug 3, 2023, 1:10 AM – 2:40 AM
Location: Fukuoka, JP
Authors
Wendy Pham, Master's Candidate
Gwendolyn Retzinger, Master's Candidate
Borna Golbarg, Undergraduate Student
Kyle Dang, Undergraduate Student
Joshua Vicente, Adjunct Professor and Lab Alumni
Belle Pearl Ponce de Leon, Master's Candidate
Naira Marootian, Alumni
Jose Diaz, Master's Candidate
Jacob W. Hinkel-Lipsker, Associate Professor