Alexa could diagnose Alzheimer’s and other brain conditions — should it?

Digital personal assistants could be equipped to diagnose cognitive issues using speech, though the ethics are debatable.


It’s an increasingly common experience: You wander into the kitchen, quietly muttering under your breath, when you hear a disembodied feminine voice say, “I’m sorry, I didn’t quite catch that.”

We can all agree that Alexa’s tendency to eavesdrop is, at times, a little creepy. But is it possible to harness that ability to improve our health? That’s the question that researcher David Simon and his coauthors sought to answer in a recent paper published by Cell Press.

Simon, a legal ethicist at Harvard University, and his team imagined a hypothetical near-future scenario in which Alexa came equipped with the power to diagnose cognitive conditions like Alzheimer's disease and other dementias simply by analyzing an older person’s speech patterns.

In doing so, they tried to envision all of the benefits such technology could offer — and all the ethical problems it could create.

“These technologies are starting to come into existence already,” Simon tells Inverse. “It’s a rich area for further research.”

Paging Dr. Alexa

When diagnosing conditions like Alzheimer’s, time is of the essence. The condition’s symptoms, which include memory loss, confusion, and personality changes, often come about gradually, making early diagnosis an incredibly difficult task.

“[Technologies like this are] coming faster than the law is equipped to address.”

Most people with Alzheimer’s live for four to eight years after diagnosis, but some patients whose symptoms are caught early can live up to 20 years. The ability to catch symptoms before they escalate could give the person and their family time to “address the problem when they still have the capability to do so,” Simon says.

Cognitive conditions can be difficult for doctors to catch early due to subtle warning signs, so digital personal assistants may offer a solution.


But cognitive conditions can be particularly tricky for doctors to diagnose the old-fashioned way. A typical checkup lasts less than an hour. During that time, a physician could easily overlook some of the more subtle symptoms of early Alzheimer’s. Likewise, family, friends, and even the person themselves might miss early warning signs, especially if they aren’t constantly in contact with one another.

Personal digital assistants, on the other hand, are always around, and they are listening. With the right software, these devices could analyze many years’ worth of a person’s speech data and detect the red flags of cognitive decline, such as forgetting words or losing fluency, before anyone else. This data could also help a doctor confirm a patient’s diagnosis.
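As a rough illustration of the idea, a system might track a simple lexical-diversity score across years of transcripts and flag a sustained drop. This is a toy sketch, not any vendor's actual method; the function names and the threshold below are hypothetical.

```python
# Hypothetical sketch: flagging a long-term drop in lexical diversity.
# All names and thresholds here are illustrative assumptions, not a
# real diagnostic criterion.

def type_token_ratio(transcript: str) -> float:
    """Fraction of distinct words in a transcript (a crude fluency proxy)."""
    words = transcript.lower().split()
    return len(set(words)) / len(words) if words else 0.0

def flag_decline(yearly_ratios: list[float], drop_threshold: float = 0.15) -> bool:
    """Flag if lexical diversity fell by more than drop_threshold
    between the earliest and most recent year of speech data."""
    if len(yearly_ratios) < 2:
        return False
    return (yearly_ratios[0] - yearly_ratios[-1]) > drop_threshold

# Toy example: diversity scores drifting downward over five years
history = [0.62, 0.60, 0.55, 0.48, 0.41]
print(flag_decline(history))  # True: a drop of 0.21 exceeds the 0.15 threshold
```

A real system would need far richer signals (pause patterns, word-finding errors, syntactic complexity) and clinical validation; this only shows the general shape of trend-based flagging.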

So, does this mean that doctors should hand over the prescription pad to Alexa? Not so fast.

The Dicey Ethics of Digital Eavesdropping

There are some obvious issues with taking medical advice from your digital personal assistant. The first is the potential for misdiagnosis.

“Innovators like to innovate. They’re driven to make a product that is marketable.”

AI certainly isn’t perfect (except for DALL-E mini, of course). A program designed to track cognitive decline in older adults might get muddled if it encounters speech from a teen or young adult, and a program trained on neurotypical speech patterns may have trouble accurately assessing a neurodivergent person.

Employing digital personal assistants to detect cognitive conditions could come with several downsides, including misdiagnosis and divulging sensitive information to doctors.


Then, there’s the question of privacy and consent. A device listening in on your home will probably pick up some juicy gossip alongside useful diagnostic data. Patients might not want sensitive information about their dating life, for example, to be presented as evidence of cognitive decline to their doctor.

And the slope gets even more slippery around progressive cognitive diseases. A person with Alzheimer’s may not realize that they’re being monitored by an omniscient AI — even if they were the one to suggest using the program in the first place.

But issues like these sometimes don’t even enter the conversation on the development side.

“Innovators like to innovate,” says Simon.

“They’re driven to make a product that is marketable.”

It’s up to ethicists, policymakers, and medical regulators to grapple with these potential drawbacks. And Simon thinks that they need to start grappling ASAP.

On the horizon...

The idea of Amazon (or a similar company) using its digital assistant as a medical tool is not far-fetched. These devices are already equipped to listen for voice commands and tuned to take in ambient data from around the house; calibrating them for diagnostic purposes could be as simple as a software update. And getting approval for such technology wouldn’t be too hard under current United States guidelines.

Tech companies like Amazon have already infiltrated the healthcare industry, so it may not be long before digital assistants act as doctors.


From a regulatory perspective, it’s much easier to get FDA approval for so-called “medical devices,” a category that includes everything from glucose meters to Fitbits, than for drugs.

What’s more, Amazon’s recent purchase of One Medical positions it to make a serious push into the healthcare industry. The company already acquired the online pharmacy PillPack in 2018 and made its first foray into healthcare with its Haven pilot program (which folded last year).

Because of this, Simon believes that now is the time to start thinking through the implications of letting Alexa diagnose your grandma.

“Technologies like this are coming. And I think they’re coming faster than the law is equipped to address in a complete way,” he says.
