As people age, their risk of falling increases, and falls are a significant cause of injury and death. Dr Jeannette R. Mahoney at the Albert Einstein College of Medicine found that reaction-time-based multisensory integration processes (the ability to process multiple sensory inputs simultaneously) are associated with mobility measures and can be affected by declining cognition. Based on these results, Dr Mahoney developed a remote app-based test called CatchU™, which aims to make multisensory integration testing more accessible and help older adults maintain their health and independence.
According to the Centers for Disease Control and Prevention (CDC), an adult over the age of 65 falls every second in the US. The CDC further reports that one in four Americans aged 65 or older experiences a fall each year, and around 32,000 die as a result. In fact, falls are the leading cause of injury and injury-related death in this age group. Emergency departments treat around three million adults over 65 for fall injuries every year, and one in five falls leads to serious injuries such as broken bones. The CDC states that the annual medical cost of falls in the US exceeds $50 billion.
Increased fall risk is a common and dangerous consequence of normal aging, and research into the factors behind it and the prediction of falls is critical to safeguarding our seniors. Dr Jeannette R. Mahoney, a neuroscientist at the Albert Einstein College of Medicine in the Bronx, New York, has researched this topic for over 15 years. She has co-authored various peer-reviewed publications on multisensory integration and its association with cognitive and motor performance, with many studies focusing on the impact of visual-somatosensory integration – the simultaneous processing of visual and touch-based stimuli – on gait, balance, and risk of falls. Based on knowledge from these investigations, Dr Mahoney has developed CatchU™, a reaction-time app-based test that aims not only to assess fall risk in older people but also to promote fall-prevention awareness through tailored counselling and therapeutic services.
Multisensory integration and gait
Gait – or the way a person walks – is complex, requiring coordination of bones, joints, muscles, and brain networks. For successful mobility, it is vital that sensory information be efficiently integrated to control movement. Previous research identified pace, rhythm, and variability as independent factors of gait (Verghese et al., 2007). Pace involves spatial parameters such as gait speed, stride length, and the amount of time spent in double support (with both feet on the ground). Rhythm involves temporal parameters, such as steps per minute (known as cadence), stance time, and swing time. Finally, variability involves inconsistencies in both spatial and temporal parameters.
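To illustrate how these three factors map onto walkway measurements, the sketch below derives pace, rhythm, and variability parameters from a handful of hypothetical per-stride readings. All names and numbers here are invented for illustration; this is not the study's data or analysis code.

```python
# Illustrative sketch: computing the gait parameters described above
# from hypothetical per-stride walkway measurements.
from statistics import mean, stdev

# Hypothetical per-stride data from a pressure-sensor walkway:
# (stride_length_cm, stride_time_s, double_support_fraction)
strides = [
    (120.1, 1.10, 0.31),
    (118.7, 1.08, 0.32),
    (121.3, 1.12, 0.30),
    (119.5, 1.09, 0.31),
]

lengths = [s[0] for s in strides]
times = [s[1] for s in strides]

# Pace: spatial parameters such as gait speed and stride length.
gait_speed_cm_s = sum(lengths) / sum(times)
mean_stride_length = mean(lengths)

# Rhythm: temporal parameters such as cadence (steps per minute);
# one stride corresponds to two steps.
cadence_steps_min = 2 * len(times) / (sum(times) / 60)

# Variability: inconsistency in the parameters, e.g. the standard
# deviation of stride length across the walk.
stride_length_sd = stdev(lengths)
```

In practice, walkway software reports many more parameters, but the grouping into spatial, temporal, and variability measures follows the same logic.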
In a paper published in 2018, Dr Mahoney and her research mentor, Dr Joe Verghese, hypothesised that visual-somatosensory integration would be associated with spatial aspects of gait but not temporal ones, as there is overlap in the neural circuits involved in goal-directed locomotion and sensory integration. The study included 333 participants with an average age of 76.53. Gait was evaluated by asking participants to walk at their normal speed down a 28-foot walkway embedded with pressure sensors. Data from the first and last four feet were excluded, to account for acceleration at the start and deceleration at the end. The participants' visual-somatosensory integration was also tested.
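A minimal sketch of that trimming step is shown below. The walkway length and the four-foot exclusion zones come from the study description; the footfall positions are hypothetical, invented for illustration.

```python
# Illustrative sketch: excluding footfalls from the first and last four
# feet of the 28-foot walkway, so that acceleration at the start and
# deceleration at the end do not bias the gait measurements.
FOOT_CM = 30.48
WALKWAY_CM = 28 * FOOT_CM   # 28-foot walkway
TRIM_CM = 4 * FOOT_CM       # four-foot exclusion zone at each end

# Hypothetical footfall positions (cm from the start of the walkway).
footfalls = [15.0, 75.0, 140.0, 210.0, 280.0, 350.0, 420.0, 490.0,
             560.0, 630.0, 700.0, 770.0, 840.0]

# Keep only steady-state footfalls in the middle 20 feet.
steady_state = [x for x in footfalls
                if TRIM_CM <= x <= WALKWAY_CM - TRIM_CM]
```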
The results of the visual-somatosensory integration test indicated that out of the 333 participants, 95 were superior integrators, 87 were good, 96 were poor, and 55 were deficient. Good and superior integrators outperformed poor and deficient integrators on several measures: they had faster gait velocity (103.55 cm/s vs 95.93 cm/s), longer strides (119.84 cm vs 113.25 cm), less time spent in double support (31% vs 33%), and lower stride length variability (standard deviations of 3.46 vs 4.03). The researchers found that visual-somatosensory integration was associated with pace and stride length variability, but not rhythm, confirming their initial hypothesis. They suggest that temporal aspects of gait are more automatic processes, controlled mostly by brainstem and spinal networks that are less influenced by sensory inputs.
Multisensory integration and falls
In a study published in 2019, the researchers assessed the link between visual-somatosensory integration and falls. In a sample of 289 older adults with an average age of 76.67, visual-somatosensory integration, static balance, and falls were measured. Static balance was assessed using a unipedal stance time test, which measures how long the participant is able to balance on one leg. Scoring lower on this test can predict falls and is associated with neuropathy. Yearly lab visits and telephone check-ups every two to three months were used to determine whether the participants had experienced a fall.
Out of the 289 participants, 90 were superior integrators, 76 were good, 79 were poor, and 44 were deficient. Those who were poor or deficient integrators were generally older and had more medical issues. They also had shorter unipedal stance times: 13.49 seconds and 12.57 seconds for poor and deficient integrators, respectively, compared with 16.43 and 16.83 seconds for superior and good integrators. During the follow-up period, which lasted an average of 24 months, 52% of participants reported a fall. These participants tended to be older than those who did not fall, with an average age of 77.89 vs 75.35. Participants who fell also had a shorter unipedal stance time, 13.54 seconds compared to 16.89 seconds for those who did not fall. Worse visual-somatosensory integration was associated with increased fall risk; participants with better visual-somatosensory integration were 76% less likely to experience a fall.
Cognitive impairment, multisensory integration, and mobility
With the knowledge that visual-somatosensory integration deficits were linked to worse mobility outcomes, Dr Mahoney and her collaborators investigated how cognitive impairment could affect these two factors in a study published in 2020. They compared adults with mild cognitive impairment (MCI) and dementia to those without. Participants completed the unipedal stance test, quantitative gait assessment on a 28-foot walkway, and a visual-somatosensory integration reaction time test.
“Worse visual-somatosensory integration increased fall risk, and participants with more efficient multisensory integration were 76% less likely to experience a fall.”
The results indicated that the participants with MCI and dementia had less effective visual-somatosensory integration than those without cognitive impairments. There was no observable direct effect of cognitive impairment on the unipedal stance test; however, there was a significant indirect effect via reduced visual-somatosensory integration. Cognitive impairment significantly impaired gait, both directly and indirectly via lowered visual-somatosensory integration. Visual-somatosensory integration was best in those with normal cognitive function – the researchers suggested this was because the neural networks necessary for sensory, motor, and cognitive functioning are impaired in people with dementia and MCI. The results indicated that cognitive impairment is associated with decreased visual-somatosensory integration, which in turn adversely affects mobility measures like balance and gait.
Multisensory integration testing and CatchU™
In all of the aforementioned studies, visual-somatosensory integration was tested in the lab using the experimental apparatus depicted in Figure 1. The participant is seated, with their eyes fixed on a bullseye directly in front of them. In each hand, they hold stimulators that vibrate and emit light from LEDs. When stimulation is sensed (vibration, light, or both), the participant responds as fast as they can by pressing a foot pedal. White noise delivered via headphones blocks out noise from the vibration motors, and the interval between stimuli varies from one to three seconds to prevent anticipation. The whole reaction time experiment lasts approximately seven minutes, with three blocks of 45 stimuli separated by 20-second breaks.
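The structure of that session can be sketched as code. This is an assumption-laden illustration, not the lab's actual software: the function name and the uniform sampling of intervals are invented, while the block counts, stimulus types, and one-to-three-second interval come from the description above.

```python
# Illustrative sketch: generating a trial schedule like the one described
# above -- three blocks of 45 stimuli, each stimulus visual, somatosensory,
# or both, with a random 1-3 s interval between stimuli to prevent
# anticipation. (Hypothetical helper, not the study's code.)
import random

def build_schedule(blocks=3, stimuli_per_block=45, seed=0):
    rng = random.Random(seed)  # seeded only to make this sketch reproducible
    schedule = []
    for block in range(blocks):
        for _ in range(stimuli_per_block):
            stimulus = rng.choice(["visual", "somatosensory", "both"])
            isi_s = rng.uniform(1.0, 3.0)  # inter-stimulus interval, seconds
            schedule.append((block, stimulus, isi_s))
    return schedule

trials = build_schedule()  # 135 stimuli across the ~7-minute session
```

Comparing reaction times to the "both" trials against the unisensory trials is what lets the researchers quantify how efficiently a participant integrates the two senses.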
However, this equipment has some downsides, chiefly its cost and immobility. Many patients with mobility limitations or severe medical conditions can find it difficult to get to research facilities or join studies that are often time consuming. Dr Mahoney’s own grandmother, Jean Sisinni, participated in several longitudinal research studies at Einstein before becoming too frail to continue. The idea of developing a mobile multisensory app that could predict and prevent falls grew out of this, as well as from queries from Dr Mahoney’s colleague Dr Claudene George, a geriatrician who kept asking, “Jeannette, how do I get this reaction time test in my clinic?” Ultimately, the concept was developed to help older adults like Grandma Sisinni, who passed away on January 24, 2021.
CatchU™ is a ten-minute app-based test that can be conducted remotely on an iPhone, with results transmitted electronically to the patient’s healthcare provider. Dr Mahoney believes that offering CatchU™ reaction-time-based multisensory assessments will aid in the identification of fall risk and help older adults maintain their independence through counselling and preventative services. An exclusive license to the intellectual property for CatchU™ was acquired from Albert Einstein College of Medicine in July 2021.
- Verghese, J., Wang, C., Lipton, R.B., et al. (2007). Quantitative gait dysfunction and risk of cognitive decline and dementia. Journal of Neurology, Neurosurgery & Psychiatry, 78(9), 929-935. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1995159/
- Mahoney, J. R., Dumas, K., & Holtzer, R. (2015). Visual–Somatosensory Integration is Linked to Physical Activity Level in Older Adults. Multisensory Research, 28(1-2), 11-29. Available at: https://doi.org/10.1163/22134808-00002470
- Mahoney, J. & Verghese, J. (2018). Visual-Somatosensory Integration and Quantitative Gait Performance in Aging. Frontiers in Aging Neuroscience, 10. Available at: https://www.frontiersin.org/articles/10.3389/fnagi.2018.00377/full
- Mahoney, J., Cotton, K. & Verghese, J. (2019). Multisensory Integration Predicts Balance and Falls in Older Adults. The Journals of Gerontology: Series A, 74(9), 1429-1435. Available at: https://pubmed.ncbi.nlm.nih.gov/30357320/
- Mahoney, J. & Verghese, J. (2020). Does Cognitive Impairment Influence Visual-Somatosensory Integration and Mobility in Older Adults? The Journals of Gerontology: Series A, 75(3), 581-588. Available at: https://academic.oup.com/biomedgerontology/article-abstract/75/3/581/5486074?redirectedFrom=fulltext
- Centers for Disease Control and Prevention (2021). Older Adult Fall Prevention. Available at: https://www.cdc.gov/falls/
Dr Mahoney studies visual-somatosensory integration in aging and its link to cognitive and motor outcomes.
- Joe Verghese, MBBS
- Claudene J. George, MD
- Resnick Gerontology Center at Albert Einstein College of Medicine
- Harold and Muriel Block Institute for Clinical and Translational Research at Einstein and Montefiore
Dr Jeannette R. Mahoney is a neuroscientist with 15 years of multisensory research experience. She received her BA from Stony Brook University (2002) and PhD from Ferkauf Graduate School of Psychology (2008). Dr Mahoney’s research objectives include optimising integration of multisensory inputs in an effort to reduce falls, improve mobility, and help older adults maintain independence.
Albert Einstein College of Medicine
1225 Morris Park Avenue
Van Etten Building
Bronx, New York 10461
T: +1 718 430 3809
F: +1 718 430 3829
LinkedIn: Jeannette R Mahoney