Researchers from Dartmouth-Hitchcock and the University of Massachusetts Boston have received a four-year, $1.2 million grant from the National Institute on Aging to use voice-assistant devices such as Amazon’s Alexa to detect early signs of memory problems. The team will use the funding to develop a system that applies recent advances in machine learning and deep learning to detect changes in speech patterns and determine whether a person is at risk of developing dementia or Alzheimer’s disease.
Xiaohui Liang, assistant professor of computer science at the University of Massachusetts Boston, who leads the team, said, “We are tackling a significant and complicated data-science question whether the collection of long-term speech patterns of individuals at home will enable us to develop new speech-analysis methods for early detection of this challenging disease.” Liang continued, “Our team envisions that the changes in the speech patterns of individuals using the voice assistant systems may be sensitive to their decline in memory and function over time,” the Concord Monitor reported.
Better Family Planning
The hope is that these AI-driven detection systems could help detect the onset of dementia or Alzheimer’s disease earlier, giving the entire family the opportunity to plan interventions earlier in the process, says John Batsis, a research team member and associate professor of medicine at the Geisel School of Medicine at Dartmouth.
Such a system would be able to detect changes in an individual’s speech patterns, intonation, and lexicon, Batsis said. Many challenges lie ahead, from dealing with a myriad of languages to handling scenarios where multiple people are in one room. Such pragmatic issues must be addressed head on. Would this system be sold commercially, or would it be paid for in some other way? How would health data be secured and privacy laws complied with?
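The project’s actual features and models have not been published, but the general idea of tracking a speech measure over time can be sketched. The example below uses one hypothetical, deliberately simple lexical feature (type-token ratio, a standard measure of vocabulary diversity) and a made-up threshold; it stands in for the far richer acoustic and linguistic features a real system would use.

```python
# Illustrative sketch only: tracking one simple lexical feature
# (type-token ratio) across transcripts from a voice assistant.
# The feature choice and the 15% threshold are hypothetical.
import re

def type_token_ratio(transcript: str) -> float:
    """Lexical diversity: unique words / total words (0 if empty)."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return len(set(words)) / len(words) if words else 0.0

def declining_trend(ratios: list, drop: float = 0.15) -> bool:
    """Flag if lexical diversity fell by more than `drop` (relative)
    between the first and last recordings."""
    if len(ratios) < 2 or ratios[0] == 0:
        return False
    return (ratios[0] - ratios[-1]) / ratios[0] > drop

# Hypothetical monthly transcripts from a voice assistant
monthly = [
    "please remind me to water the plants and call my sister tomorrow",
    "remind me to water the plants and call my sister",
    "remind me remind me water water the the plants",
]
ratios = [type_token_ratio(t) for t in monthly]
print(declining_trend(ratios))  # prints True: diversity dropped sharply
```

A deployed system would of course rely on validated clinical markers and far more data, but the sketch shows why long-term, in-home collection matters: the signal is a *change* relative to the person’s own baseline, not a single measurement.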
Experts Chime In
Other experts not involved with the initiative have endorsed it. For example, Alicia Nobles, an assistant professor in the Department of Medicine at the University of California, San Diego and co-founder of the Center for Data-Driven Health at the Qualcomm Institute, noted in an email that detecting impairments early may be “crucial” to helping patients and their caregivers manage their care. And Sarah Lenz Lock, senior vice president for policy at AARP and executive director of the Global Council on Brain Health, said the research looked “promising.”