Opinion: Too many older Americans are getting tested for Alzheimer’s

An 80-something patient came in recently for an annual visit, worried that memory lapses might be symptoms of Alzheimer’s disease. This patient, like several others in my practice, has taken cognitive tests annually for more than a decade.
With each passing year, I see and hear the patient’s spiral of worry: Am I getting more forgetful? Perhaps it feels like “Where are my keys?” and “Where did I leave my wallet?” have become common refrains. These are simple memory lapses, an experience most people have throughout their lives, but they can be troubling as we get older. Are they signs of Alzheimer’s? Or signs of that other dreadful A-word … aging?
Year after year, for 10 years, in fact, this patient’s cognitive tests had come back normal. Until this year, when a blood test was positive for biomarkers for Alzheimer’s — triggering a further sense of panic. But here’s the problem: Science can’t yet tell us whether a positive test means the patient has an early stage of the disease. The only new data point was that this patient, so long dreading this day, had positive biomarkers showing increased risk for the disease. This person may not develop full-blown Alzheimer’s for five years or 20 years or ever. So did running that test have any benefit?
There is a sense of urgency in the medical community to classify a positive biomarker test as “Stage One Alzheimer’s Disease.” This is part of a larger desire to appear aggressive in fighting the disease, which for so long had no tests and no course of treatment.
I get that, and I understand why so many older patients fear Alzheimer’s, but I disagree with doctors whose response is to test early and often — and to diagnose Stage One Alzheimer’s based solely on biomarkers.
That categorization is potentially dangerous to patients, an unnecessary source of middle-of-the-night stress that can be deeply damaging to the very quality of life this person seeks to retain.
Such worries have become more widespread as biomarker blood tests, marketed by companies like Quest and Labcorp, have become commercially available and patients can pay for them out of pocket. Previously, they were used only by clinicians in studies.
There’s an understandable impulse behind the medical community’s push to test lots of people. The more the disease is diagnosed, the more individuals will be identified for future treatments, and there is no doubt we need to improve recognition of early dementia, especially in medically underserved populations.
At the same time, pharmaceutical companies need more clinical trials — and also more patients — to develop new drugs to add to the array that already exists, such as Leqembi. In an aging nation with 7 million people already diagnosed with Alzheimer’s, we need these drugs, badly, but there’s no denying that profit lurks in the background as a motive here as well.
Although there are good reasons to test lots of people and gather a larger pool of patients who might have early stage Alzheimer’s, I think the cost to specific individuals can be just too great.
Candidates for clinical trials should not be identified based on positive blood tests from a clinical setting, but rather through carefully conducted research studies that have proper counseling and disclosure protocols.
More than 40% of individuals over 80 who take a biomarker test for Alzheimer’s will test positive. And it’s natural that many of them then will fixate on the worst-case scenarios and live their lives with a sense of worry and dread.
Patients who’ve tested positive come to me and bluntly state, “I have Alzheimer’s.” I see the scared look on their faces. Reaching this conclusion based on the biomarkers is like diagnosing cancer without doing a biopsy. The danger is real: Half of the patients who actually do have Alzheimer’s experience some sort of depression, and so do quite a few who think they have it — or worry they are sure to get it.
A positive blood biomarker test could also lead to significant mental and financial harm, prompting unnecessary and expensive follow-up procedures like an MRI, a PET scan or a spinal fluid test. Some people don’t like to hear this, but in many cases simple lifestyle changes like exercise, more rest and healthier eating could improve cognitive function. This is the recommendation I make to all my patients, regardless of their Alzheimer’s risk.
For now, the biomarker test should be taken only if you are truly showing signs of Alzheimer’s disease. Here are some ways to tell: If a loved one is experiencing noticeable changes from their baseline mental abilities — such as losing track of time and place or struggling to find words — that can be a sign. Personality changes and new mood symptoms are other warning signs. Still, these symptoms should be distinguished from the slow and steady decline we will all experience.
If you do have symptoms, then yes, it could be time for biomarker tests, but only after thorough cognitive testing and a complete review of your medical history. That medical history could show other conditions such as sleep apnea that can influence memory. Lifestyle changes or treatments of those other conditions could resolve the troubling symptoms.
My stance on widespread testing will be different when doctors have more tools available to predict and treat Alzheimer’s and can offer constructive advice after a blood test comes back positive.
When someone has a cancerous tumor, surgeons remove it as soon as possible and start a treatment plan. When someone tests positive for the gene that indicates risk for a certain type of cancer, we monitor carefully.
A positive Alzheimer’s biomarker test, on the other hand, leads to no changes in clinical management when there are no cognitive symptoms. Prevention trials and improved predictive biomarkers for Alzheimer’s disease will change that, but we’re not there yet.
In the meantime, doctors’ oath to “do no harm” should mean we avoid overdiagnosing Alzheimer’s and fueling needless anxiety.
Keith Vossel is a professor of neurology at UCLA.
