Copyright ©2014 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

DAVID GREENE, HOST:

This is MORNING EDITION from NPR News. I'm David Greene.

RENEE MONTAGNE, HOST:

And I'm Renee Montagne.

Today in Your Health, we'll look at how online images influence teenage behavior. But first, researchers have developed an experimental blood test for people in their 70s that can predict which of them will soon develop Alzheimer's disease. If the test pans out, seniors would have an easy way to assess their risk.

But NPR's Jon Hamilton reports that they would also face a difficult decision about how much they want to know.

JON HAMILTON, BYLINE: It's already possible to detect signs of Alzheimer's long before symptoms appear. But it's not easy. Howard Federoff, a professor of neurology at Georgetown University, says one option is a spinal tap.

DR. HOWARD FEDEROFF: Not only is it painful but you could have headache and you could have bleeding.

HAMILTON: Or you could have a brain scan.

FEDEROFF: But they're very expensive.

HAMILTON: So Federoff and a team of researchers set out to find something better. They took blood from more than 500 people age 70 and older. Then, he says, they looked to see who developed Alzheimer's in the next five years.

FEDEROFF: We asked, in individuals who went from being cognitively normal to now being cognitively impaired, what was different in their blood relative to those who were cognitively normal and remained cognitively normal.

HAMILTON: Federoff says the team sifted through more than 4,000 potential indicators to discover just a handful that were really accurate.

FEDEROFF: We discovered that 10 blood lipids, fats, predicted whether someone would go on to develop cognitive impairment, or Alzheimer's.

HAMILTON: If levels of those blood fats were low, Federoff says, there was a better than 90 percent chance that a person would have cognitive problems within two or three years.

FEDEROFF: As tests go, it's a pretty good one.

HAMILTON: Federoff says the results, which appear in the journal Nature Medicine, are preliminary. The approach still needs to be tried in people of different ages and different racial groups. And the test is still a long way from mainstream use. Even so, he says...

FEDEROFF: It lays open initially the question, OK, so now I know what can I do? And as a clinical community, I think we need to be very responsible so that we can deploy the right professionals to help families understand the implications of a positive test that could be life-altering.

HAMILTON: The problem is that there still isn't a good treatment for Alzheimer's. But, of course, treatment is only one reason to get tested.

Jason Karlawish from the University of Pennsylvania says some people are already being tested for a gene that increases the risk of Alzheimer's.

JASON KARLAWISH: Knowing what their next five, 10, 15-year risk of developing cognitive impairment is, is very relevant to making plans around retirement and where they live. So there is certainly a role for knowing that information for some people in a way that I think makes sense around issues of life planning.

HAMILTON: On the other hand, Karlawish says, people who have the Alzheimer's gene and know it tend to rate their own memories worse than people who have the gene but don't know it. Knowing also seems to hurt people's performance on memory tests.

Karlawish says the biggest concern about Alzheimer's testing, though, probably has to do with stigma and identity.

KARLAWISH: How will other people interact with you if they learn that you have this information? And how will you think about your own brain and your sort of sense of self? I mean, among all of our organs, our brain is the sort of seat of our autonomy, and to be told that your brain is at risk of developing cognitive problems is potentially transforming information for an individual.

HAMILTON: Karlawish says it may be easier for people to learn they are at risk if we all begin to view Alzheimer's in a different way.

KARLAWISH: Right now in America when we talk about Alzheimer's disease, it's on the basis of criteria that were developed in 1984. And those criteria - simply put - kind of divide Alzheimer's disease into a category. You either have Alzheimer's disease dementia or you're normal, you don't have it. And I think subsequent research has shown that that model is inadequate.

HAMILTON: Karlawish says a better model might be more like heart disease. It starts with biological changes years before symptoms appear. And there is no bright line separating healthy people from those in early stages of the disease.

Jon Hamilton, NPR News.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
