Audio Feedback for Device-Supported Balance Training: Parameter Mapping and Influencing Factors
Abstract
1. Introduction
2. Materials and Methods
2.1. Subjects
2.2. Experimental Setup
2.2.1. Hardware
2.2.2. Software and Sound Generation
2.3. Procedure (Experimental Protocol)
2.4. Audio Parameters
2.4.1. Experiment 1
2.4.2. Experiment 2
- The Y-axis was signaled with white noise (an audio signal with equal intensity across its frequencies) whose frequency increased with anterior movement and decreased with posterior movement.
- The X-axis was signaled by synthetic modulation of frequency or amplitude (vibrato and tremolo, similar to the “wavering” model of E1).
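The two mappings above can be sketched as a single position-to-parameter function. This is a minimal illustration, not the authors' implementation: the function name, parameter ranges, and the exponential cutoff scale are assumptions chosen for clarity.

```python
def map_position_to_audio(x, y, x_range=100.0, y_range=100.0,
                          base_cutoff=1000.0, max_cutoff=8000.0):
    """Sketch of the E2 mapping: anterior-posterior (Y) position drives the
    white-noise frequency; mediolateral (X) position drives the depth of a
    vibrato (left) or tremolo (right) effect. All ranges are illustrative."""
    # Y-axis: anterior movement (y > 0) raises the noise frequency,
    # posterior movement (y < 0) lowers it; exponential scaling keeps
    # equal displacements perceptually similar in pitch.
    y_norm = max(-1.0, min(1.0, y / y_range))
    cutoff = base_cutoff * (max_cutoff / base_cutoff) ** y_norm

    # X-axis: distance from center sets the modulation depth; the side
    # selects frequency modulation (vibrato) vs. amplitude modulation
    # (tremolo), so the target area itself is unmodulated.
    x_norm = max(-1.0, min(1.0, x / x_range))
    mod_type = "vibrato" if x_norm < 0 else "tremolo"
    mod_depth = abs(x_norm)
    return {"cutoff_hz": cutoff, "mod_type": mod_type, "mod_depth": mod_depth}
```

In a real system these values would be sent each control cycle to the synthesis engine (e.g., as OSC messages) rather than returned as a dictionary.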
2.5. Outcome Measures
2.6. Data Analysis
3. Results
3.1. Experiment 1
3.1.1. Movement Data
3.1.2. Subjective Evaluation
3.2. Experiment 2
3.2.1. Movement Data
3.2.2. Subjective Evaluation
3.2.3. Correlations
4. Discussion
- In E1, wavering and percussion were the most successful parameters; overall, the percussive sound model was the most effective and intuitive in both the quantitative and the VAS data.
- In E2, TD was shorter and movement accuracy was superior with the synthetic model, which was also rated as more helpful.
- With AVFB, TD was shorter and both movement and postural accuracy were superior, with similar values regardless of the underlying sound model and the experiment.
- Irrespective of the feedback, participants in E2 moved more on the X-axis (mediolateral movement), and movement accuracy was superior on the Y-axis, with and without visual support.
- Higher musicality was associated with better TD and TS results, but only without VFB.
- Increased age was associated with longer TD in all conditions.
4.1. Implications for Acoustic Parameter Mapping
4.2. AFB and VFB
4.3. AFB and Musicality
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Model | Left of the Target Area | Target Area | Right of the Target Area
---|---|---|---
Pitch | Position note (piano) moving down in pitch | Position and reference notes (piano) at C3, repeated at 140 bpm | Position note (piano) moving up in pitch
Wavering | Vibrato (rapid, slight pitch variations) | Continuous pure tone | Tremolo (wavering effect from rapid reiteration of the tone)
Percussion | Accelerating low drum (floor tom) | Tambourine sounds (no drum sounds) | Accelerating high drum (snare drum)
Timbre | More percussive timbre (marimba) | Repeating piano note | Less percussive timbre (guitar)
Stereo | Sound panned to the right ear (lower volume in the left ear) | Percussion pattern centered (equal volume in both ears) | Sound panned to the left ear (lower volume in the right ear)
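The “Stereo” row above pans the sound toward the ear opposite the deviation, steering the user back to center. A common way to realize such panning is an equal-power crossfade; the sketch below is an illustration under that assumption, with a hypothetical function name and position range, not the paper's implementation.

```python
import math

def stereo_pan_gains(x, x_range=100.0):
    """Equal-power stereo panning sketch for the 'Stereo' model: deviation
    to the left of the target (x < 0) pans the sound to the right ear, and
    vice versa. The position range is illustrative."""
    # Map normalized position to a pan angle in [0, pi/2]:
    # center -> pi/4 (equal gains), full left deviation -> pi/2
    # (sound entirely in the right ear).
    x_norm = max(-1.0, min(1.0, x / x_range))
    theta = (1.0 - x_norm) * math.pi / 4.0
    # cos/sin gains keep total power constant across pan positions.
    left = math.cos(theta)
    right = math.sin(theta)
    return left, right
```

Equal-power (rather than linear) gains avoid the perceived loudness dip at the center that a linear crossfade would produce.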
Experiment | Variable | Sound Model | Audio Alone | Audio-Visual
---|---|---|---|---
E1 | TD (sec.) | Pitch | 7.98 (± 1.24) | 5.14 (± 0.72)
E1 | TD (sec.) | Wavering | 6.73 (± 0.54) | 5.13 (± 0.70)
E1 | TD (sec.) | Percussion | 6.57 (± 0.64) | 4.89 (± 0.60)
E1 | TD (sec.) | Timbre | 8.86 (± 4.04) | 4.96 (± 0.53)
E1 | TD (sec.) | Stereo | 8.69 (± 3.61) | 4.94 (± 0.48)
E1 | TS | Pitch | 6.23 (± 0.81) | 5.08 (± 1.17)
E1 | TS | Wavering | 5.87 (± 0.70) | 4.86 (± 1.09)
E1 | TS | Percussion | 5.81 (± 0.35) | 5.30 (± 1.08)
E1 | TS | Timbre | 6.58 (± 1.02) | 5.21 (± 1.00)
E1 | TS | Stereo | 8.97 (± 1.76) | 4.95 (± 0.81)
E2 | TD (sec.) | Synthetic | 13.72 (± 5.81) | 5.72 (± 0.48)
E2 | TD (sec.) | Musical | 19.21 (± 12.60) | 5.76 (± 0.75)
E2 | TS | Synthetic | 9.73 (± 2.04) | 6.11 (± 1.19)
E2 | TS | Musical | 9.21 (± 2.65) | 5.96 (± 1.40)
E2 | ADD | Synthetic | 306.36 (± 152.98) | 77.65 (± 29.35)
E2 | ADD | Musical | 463.26 (± 473.87) | 82.51 (± 31.82)
Experiment | Question | Sound Model | VAS Score
---|---|---|---
E1 | Pleasant | Pitch | 5.80 (± 2.40)
E1 | Pleasant | Wavering | 3.40 (± 2.53)
E1 | Pleasant | Percussion | 7.38 (± 1.47)
E1 | Pleasant | Timbre | 5.87 (± 2.67)
E1 | Pleasant | Stereo | 7.99 (± 2.10)
E1 | Helpful | Pitch | 7.57 (± 1.32)
E1 | Helpful | Wavering | 7.33 (± 1.84)
E1 | Helpful | Percussion | 7.41 (± 2.36)
E1 | Helpful | Timbre | 5.50 (± 2.76)
E1 | Helpful | Stereo | 7.17 (± 2.23)
E2 | Pleasant | Synthetic | 7.23 (± 2.35)
E2 | Pleasant | Musical | 6.51 (± 2.66)
E2 | Helpful | Synthetic | 8.12 (± 1.13)
E2 | Helpful | Musical | 6.66 (± 2.52)
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Fuchs, D.; Knauer, M.; Egger, M.; Friedrich, P. Audio Feedback for Device-Supported Balance Training: Parameter Mapping and Influencing Factors. Acoustics 2020, 2, 650-665. https://doi.org/10.3390/acoustics2030034