Why do some people better resist the cognitive declines that typically come with aging? Researchers have long suspected that genetic factors are involved. Now, a Nature paper published online January 18 provides some of the first evidence in support of this idea. Researchers led by Ian Deary at the University of Edinburgh, U.K., and Peter Visscher at the University of Queensland, Brisbane, Australia, correlated lifetime cognitive change with variations in genetic markers in a large cohort of unrelated people, and estimated that at least one-quarter of individual variation in age-related cognitive decline can be explained by genetic factors. What those factors are is not yet clear. The data leave room for a large environmental effect on brain aging as well. The findings may encourage research into the specific genetic and environmental factors that influence brain aging, Deary suggested, adding, “Since cognitive change is so central to dementia, we think the results are relevant to that, too.”

Another recent paper, published online January 5 in the British Medical Journal, took a different approach to looking at lifetime cognitive change. Researchers led by Archana Singh-Manoux at the Institut National de la Santé et de la Recherche Médicale (INSERM), Villejuif, France, analyzed data from a large longitudinal cohort and found evidence for a small amount of cognitive decline in people as young as 45, with the rate of decline increasing later in life. This belies the common assumption that cognitive decline begins in old age, and has implications for aging studies, the authors suggest.

To examine the genetic contribution to cognitive stability, Deary and colleagues combined data from three Scottish cohorts—the Aberdeen Birth Cohort of 1936, and the Lothian Birth Cohorts of 1921 and 1936. All participants had taken a standard I.Q. test, the Moray House Test, at age 11, and also took several cognitive tests in old age (at ages 65, 70, or 79). Almost 2,000 participants provided blood samples, which the researchers genotyped for more than half a million common single nucleotide polymorphisms (SNPs). By comparing the degree of cognitive change with genetic similarity among pairs of participants using population-based genetic analyses, the researchers estimated that one-fourth of the variation in cognitive stability was due to genetics. This figure may be low, the authors note, as it only includes genes linked to common SNPs. The remaining variation may be due to environmental influences, implying that these factors also contribute considerably to cognitive stability, Deary wrote to ARF. The cohort data do not include any measure of intelligence in young adulthood, but the authors note that by age 11, about half of the individual variation in cognitive ability can be chalked up to genetics. This figure remains similar in adulthood (see Deary et al., 2009), suggesting that the genetic contribution to cognition does not change much during adolescence. The corollary to this, the authors suggest, is that most of the genetic variation they found contributes to age-related cognitive decline.
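The pairwise approach described above can be illustrated with a toy simulation. The sketch below is a simplified Haseman-Elston-style regression on simulated genotypes, regressing pairwise phenotype similarity on pairwise genetic similarity among "unrelated" individuals; it is not the authors' actual analysis pipeline, and every number in it (sample size, SNP count, true heritability) is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, h2 = 800, 4000, 0.5   # individuals, SNPs, simulated "true" heritability

# simulate unrelated individuals: allele frequencies, then 0/1/2 genotypes
p = rng.uniform(0.1, 0.9, m)
G = rng.binomial(2, p, size=(n, m)).astype(float)
Z = (G - 2 * p) / np.sqrt(2 * p * (1 - p))   # standardized genotypes

# phenotype: a tiny effect at every SNP plus environmental noise
b = rng.normal(0, np.sqrt(h2 / m), m)
y = Z @ b + rng.normal(0, np.sqrt(1 - h2), n)
y = (y - y.mean()) / y.std()

# genetic relationship matrix: genetic similarity for every pair
A = Z @ Z.T / m

# regress pairwise phenotype products on pairwise genetic similarity;
# the slope estimates the SNP-based heritability
iu = np.triu_indices(n, k=1)
slope = np.polyfit(A[iu], np.outer(y, y)[iu], 1)[0]
print(f"estimated SNP heritability: {slope:.2f} (simulated truth: {h2})")
```

Even in this toy setting, the slope is a noisy estimate: among unrelated people the off-diagonal genetic similarities are tiny, which is why large samples are needed and why the real study's estimates carry wide error bars.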

Despite some reshuffling, the rank order of intelligence among most participants remained similar across the lifespan. Deary and colleagues estimated a genetic correlation of 0.62 between cognitive ability in childhood and in old age. A correlation of 1 would indicate that exactly the same set of genes affected cognition throughout life, and that any differences between intelligence in childhood and old age were solely due to environment; a correlation of 0 would indicate that a completely different set of genes regulates intelligence at either end of the life spectrum. The finding of 0.62 implies that many of the same genes determine intelligence across the lifespan, but that there are also some genes that may only affect intelligence in old age. However, the standard errors on this estimate were large due to low statistical power in the sample, making this figure preliminary. “We are keen for this to be replicated to see if others find the same result,” Deary wrote to ARF.

“This study makes an important contribution,” said Chandra Reynolds at the University of California, Riverside. She noted that the inclusion of both childhood and old age I.Q. data makes this study unique, and also praised the authors’ innovative approach of using a multilevel pairwise comparison, reminiscent of behavioral genetic techniques employed in twin studies, to analyze unrelated individuals. Reynolds pointed out that the data “nicely converge with what has been observed in twin studies of aging,” and support previous findings that genes play an important role in determining intelligence as people age.

Perminder Sachdev at the University of New South Wales, Randwick, Australia, wrote to ARF that the study “confirms what we have suspected—that cognitive aging is partly genetically determined—but we did not have the data previously to demonstrate this.” He added, “There is a long road ahead to determine the exact genes involved, and the genetic and environmental mechanisms at play.” (See full comment below.) ApoE, the main genetic risk factor for sporadic Alzheimer’s disease, is one possible gene candidate (see, e.g., ARF related news story). For his part, Deary told ARF he is collaborating with a team led by Neil Pendleton at the University of Manchester, U.K., to look in larger populations for genes that influence cognitive stability. Their consortium is called Cognitive Ageing Genetics in England and Scotland (CAGES); Deary also participates in the international consortium Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE). In addition, Deary plans to examine MRI brain scans to look for genes that affect brain structure.

As described in the second paper, Singh-Manoux and colleagues analyzed 10 years of cognitive testing data from the Whitehall II cohort, which consists of more than 10,000 British civil servants. The tests measured verbal and mathematical reasoning, verbal memory, verbal fluency, and vocabulary. The authors found measurable declines over the decade in every cognitive area except vocabulary, with the greatest drops occurring in reasoning skills. When they stratified participants by age, they saw about a 4 percent decline in reasoning test scores in the youngest age group (45-49 years), increasing to an almost 10 percent drop in the oldest group (65-70 years). Interestingly, a cross-sectional analysis of the same data greatly overestimated age-related cognitive declines in women, likely due to cohort differences such as fewer educational opportunities for women from earlier generations.

In an accompanying editorial, Francine Grodstein at Brigham and Women’s Hospital, Boston, wrote, “Previous data show that modest differences in cognitive performance in earlier life predict larger differences in risk of dementia in later life…Singh-Manoux and colleagues have set a new benchmark for future research and, eventually, clinical practice. That is, efforts to prevent dementia may need to start in adults as young as 45 years.”

Trey Hedden at Massachusetts General Hospital, Boston, noted that, because of the large sample size, the study had immense power to detect small effects. He pointed out that for all cognitive skills except reasoning, the age-related decline over 10 years amounted to less than one additional test item missed. “Such small changes may not be detectable within a person in the clinic,” he wrote to ARF, but added, “This study does provide an important indication of how much cognitive change over a decade might be considered typical during middle adulthood, so that individuals exhibiting higher levels of change over time could be targeted for potential interventions.” (See full comment below.)—Madolyn Bowman Rogers


  1. This study by Singh-Manoux and colleagues is an important addition to the characterization of cognitive change during middle age and older adulthood. It stands out because of the large sample size (7,390 subjects) and the length of time over which the subjects were followed (10 years).

    Rarely do we have an opportunity to examine cognitive change over a decade across so many individuals. It is important to point out that these results confirm what much of the literature on cognitive aging has already shown. In the 2004 review that they cite, our statement that there was little evidence for age-related decline before the age of 60 specifically referred to the known results from the few longitudinal studies existing at that time. We went on to say that many cross-sectional studies did find age-related declines across the entire adult lifespan (from age 20 to 90).

    In particular, we noted a lack of knowledge regarding individuals in midlife (from 30 to 60). We pointed out the mismatch between longitudinal and cross-sectional data during midlife, and noted that both types of designs have their own pitfalls; the truth is likely to reside somewhere between them. One cross-sectional study in which I was involved showed that cognitive differences from age 30-40 occurred to nearly the same extent as those from age 60-70 (Park et al., 2002). Many other studies have shown similar results. It is important, but not surprising, that the authors have been able to confirm such results with longitudinal data. By filling in the gap between 45 and 60, this new study helps us see that longitudinal change in the same people can be detected with the power of a large sample, and that perhaps the cross-sectional estimates were not far off the mark.

    One other important point is that, although the cognitive changes detected in this study were statistically significant, this was due to the large sample size having immense power to detect quite small effects. For all categories of cognitive skills except reasoning, the age-related decline in scores over 10 years amounted to about one item or less. For example, on the memory test, the average 45- to 49-year-old was able to remember 0.58 of an item less when tested again 10 years later. Such small changes may not be detectable within a person in the clinic. For example, if people coming into a clinic have a “true” memory score of 8 out of 20, they might score a 9 when tested on Monday, but a 7 on Friday, even though the test could still be considered highly reliable. If those people come in again in 10 years and score a 7, it would be almost impossible to determine if that is because they’re being tested on a Friday, or because their “true” score has declined to 7 and the current score is reflecting that change.

    This study can detect these small changes because it averages over thousands of people, but knowing that such small changes occur will have limited usefulness in practice. This may also be why previous longitudinal studies with smaller samples failed to detect cognitive change at younger ages. This study does provide an important indication of how much cognitive change over a decade might be considered typical during middle adulthood, so that individuals exhibiting higher levels of change over time could be targeted for potential interventions.


    Park et al. Models of visuospatial and verbal memory across the adult life span. Psychol Aging. 2002 Jun;17(2):299-320. PubMed.

    View all comments by Trey Hedden
  2. This is an interesting study that exploits the unique data from Scottish birth cohorts that have been followed up for 54-68 years.

    There is both stability as well as change in cognitive ability over one's lifespan, and the question being asked is whether this is due to genetic or environmental factors. In the past, this question has generally been asked using related individuals who share genetic information (twins or family members), but, of course, such studies are limited in the lifespan covered. The authors have used a new statistical technique to ask this question in cohorts of unrelated individuals, using genomewide SNPs as genetic markers.

    The results show that: 1) the stability of cognitive ability from childhood to old age has a genetic contribution, accounting for slightly less than 40 percent of the variance; and 2) the change in cognition with aging also has a genetic contribution, but that accounts for 24 percent of the variance. Since SNPs only measure one aspect of relatedness, this may well be an underestimate of the genetic contribution. There is also significant attrition in the sample, mainly due to premature death, which is likely to have affected the estimates. Another limitation is that the Moray House Test is an old test, and its validity when applied to older people is uncertain. While the participants were given a number of cognitive tests, only the Moray House Test was available for the childhood assessments.

    However, it is an important study showing that the "normative" change that occurs in cognition with aging is partly affected by genetic factors. It also suggests that environmental factors possibly play a very important role, although the study does not provide insights into what those environmental factors might be.

    Examination of the ApoE4 allele, a known risk factor for Alzheimer's disease (AD), did not suggest that this gene is likely to have influenced the results.

    The study suggests that it might be fruitful to hunt for genes that affect aging-related cognitive change. I do not think it provides insights into the genetic causes of AD, or whether cognitive aging and AD are distinct processes or related ones.

    In short, it confirms what we have suspected, that cognitive aging is partly genetically determined, but we did not have the data previously to demonstrate this. It provides the data to some extent, but there is a long road ahead to determine the exact genes involved and the genetic and environmental mechanisms at play.

    View all comments by Perminder Sachdev



News Citations

  1. Early ApoE4 Memory Effects, But Do You Really Want to Know?

Paper Citations

  1. Deary et al. Genetic foundations of human intelligence. Hum Genet. 2009 Jul;126(1):215-32. PubMed.

External Citations

  1. Cohorts for Heart and Aging Research in Genomic Epidemiology

Further Reading

Primary Papers

  1. Genetic contributions to stability and change in intelligence from childhood to old age. Nature. 2012 Jan 18. PubMed.
  2. Timing of onset of cognitive decline: results from Whitehall II prospective cohort study. BMJ. 2011;344:d7622. PubMed.
  3. How early can cognitive decline be detected? BMJ. 2011;344:d7652. PubMed.