Why do some people resist the cognitive decline that typically comes with aging better than others? Researchers have long suspected that genetic factors are involved. Now, a Nature paper published online January 18 provides some of the first evidence in support of this idea. Researchers led by Ian Deary at the University of Edinburgh, U.K., and Peter Visscher at the University of Queensland, Brisbane, Australia, correlated lifetime cognitive change with variations in genetic markers in a large cohort of unrelated people, and estimated that at least one-quarter of individual variation in age-related cognitive decline can be explained by genetic factors. What those factors are is not yet clear. The data leave room for a large environmental effect on brain aging as well. The findings may encourage research into the specific genetic and environmental factors that influence brain aging, Deary suggested, adding, “Since cognitive change is so central to dementia, we think the results are relevant to that, too.”
Another recent paper, published online January 5 in the British Medical Journal, took a different approach to looking at lifetime cognitive change. Researchers led by Archana Singh-Manoux at the Institut National de la Santé et de la Recherche Médicale (INSERM), Villejuif, France, analyzed data from a large longitudinal cohort and found evidence for a small amount of cognitive decline in people as young as 45, with the rate of decline increasing later in life. This belies the common assumption that cognitive decline begins in old age, and has implications for aging studies, the authors suggest.
To examine the genetic contribution to cognitive stability, Deary and colleagues combined data from three Scottish cohorts—the Aberdeen Birth Cohort of 1936, and the Lothian Birth Cohorts of 1921 and 1936. All participants had taken a standard I.Q. test, the Moray House Test, at age 11, and also took several cognitive tests in old age (at ages 65, 70, or 79). Almost 2,000 participants provided blood samples, which the researchers genotyped for more than half a million common single nucleotide polymorphisms (SNPs). By comparing the degree of cognitive change with genetic similarity among pairs of participants using population-based genetic analyses, the researchers estimated that one-fourth of the variation in cognitive stability was due to genetics. This figure may be low, the authors note, as it only includes genes linked to common SNPs. The remaining variation may be due to environmental influences, implying that these factors also contribute considerably to cognitive stability, Deary wrote to ARF. The cohort data do not include any measure of intelligence in young adulthood, but the authors note that by age 11, about half of the individual variation in cognitive ability can be chalked up to genetics. This figure remains similar in adulthood (see Deary et al., 2009), suggesting that the genetic contribution to cognition does not change much during adolescence. The corollary to this, the authors suggest, is that most of the genetic variation they found contributes to age-related cognitive decline.
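The logic of such population-based analyses—relating phenotypic similarity to genome-wide genetic similarity among nominally unrelated people—can be illustrated with a toy simulation. The sketch below uses a simplified Haseman-Elston-style regression on simulated genotypes and phenotypes (all numbers invented); the study itself used more sophisticated restricted maximum likelihood methods, so this is a conceptual illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, h2_true = 1000, 1000, 0.5  # people, SNPs, simulated heritability

# Simulated genotypes: allele counts at common SNPs, standardized per SNP
freqs = rng.uniform(0.1, 0.9, m)
geno = rng.binomial(2, freqs, (n, m)).astype(float)
X = (geno - geno.mean(0)) / geno.std(0)

# Phenotype = additive genetic signal from the SNPs + environmental noise
beta = rng.normal(0, np.sqrt(h2_true / m), m)
y = X @ beta + rng.normal(0, np.sqrt(1 - h2_true), n)
z = (y - y.mean()) / y.std()

# Genetic relationship matrix: genome-wide similarity between each pair
A = X @ X.T / m

# Haseman-Elston-style regression: for every pair of individuals, regress
# phenotypic similarity (product of standardized scores) on genetic
# similarity; the slope estimates SNP-based heritability
iu = np.triu_indices(n, k=1)
slope = np.polyfit(A[iu], z[iu[0]] * z[iu[1]], 1)[0]
print(round(slope, 2))  # should land near the simulated value of 0.5
```

Because real cohorts of unrelated people have very small differences in genetic similarity, such estimates carry large standard errors unless sample sizes run into the thousands, which is one reason the authors describe their figure as a lower bound.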
Despite some reshuffling, the rank order of intelligence among most participants remained similar across the lifespan. Deary and colleagues estimated a genetic correlation of 0.62 between cognitive scores in childhood and old age. A genetic correlation of 1 would indicate that exactly the same set of genes affected cognition throughout life, and that any differences between intelligence in childhood and old age were solely due to environment; a correlation of 0 would indicate that a completely different set of genes regulates intelligence at either end of the life spectrum. The finding of 0.62 implies that many of the same genes determine intelligence across the lifespan, but that some genes may affect intelligence only in old age. However, the standard errors on this estimate were large due to low statistical power in the sample, making this figure preliminary. “We are keen for this to be replicated to see if others find the same result,” Deary wrote to ARF.
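What a genetic correlation of 0.62 means can be made concrete with a small construction (invented numbers, not the authors' bivariate analysis): build one set of per-gene effects on childhood cognition, then make the old-age effects a mixture of those same effects plus independent, age-specific ones, tuned so the two sets correlate at 0.62.

```python
import numpy as np

rng = np.random.default_rng(1)
m, rg = 5000, 0.62  # number of simulated genetic effects, target correlation

# Hypothetical per-gene effects on childhood intelligence
b_child = rng.normal(0, 1, m)

# Old-age effects: a component shared with childhood (weight rg) plus
# independent age-specific effects (weight sqrt(1 - rg^2)), so both sets
# have unit variance and correlate at rg
b_old = rg * b_child + np.sqrt(1 - rg**2) * rng.normal(0, 1, m)

observed = np.corrcoef(b_child, b_old)[0, 1]
print(round(observed, 2))  # close to 0.62: mostly shared, partly age-specific
```

In this picture, the shared component corresponds to genes that influence intelligence throughout life, while the independent component corresponds to genes whose influence emerges only with aging.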
“This study makes an important contribution,” said Chandra Reynolds at the University of California, Riverside. She noted that the inclusion of both childhood and old age I.Q. data makes this study unique, and also praised the authors’ innovative approach of using a multilevel pairwise comparison, reminiscent of behavioral genetic techniques employed in twin studies, to analyze unrelated individuals. Reynolds pointed out that the data “nicely converge with what has been observed in twin studies of aging,” and support previous findings that genes play an important role in determining intelligence as people age.
Perminder Sachdev at the University of New South Wales, Randwick, Australia, wrote to ARF that the study “confirms what we have suspected—that cognitive aging is partly genetically determined—but we did not have the data previously to demonstrate this.” He added, “There is a long road ahead to determine the exact genes involved, and the genetic and environmental mechanisms at play.” (See full comment below.) ApoE, the main genetic risk factor for sporadic Alzheimer’s disease, is one candidate gene (see, e.g., ARF related news story). For his part, Deary told ARF he is collaborating with a team led by Neil Pendleton at the University of Manchester, U.K., to look in larger populations for genes that influence cognitive stability. Their consortium is called Cognitive Ageing Genetics in England and Scotland (CAGES); Deary also participates in the international consortium Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE). In addition, Deary plans to examine MRI brain scans to look for genes that affect brain structure.
As described in the second paper, Singh-Manoux and colleagues analyzed 10 years of cognitive testing data from the Whitehall II cohort, which consists of more than 10,000 British civil servants. The tests measured verbal and mathematical reasoning, verbal memory, verbal fluency, and vocabulary. The authors found measurable declines over the decade in every cognitive area except vocabulary, with the greatest drops occurring in reasoning skills. When they stratified participants by age, they saw about a 4 percent decline in reasoning test scores in the youngest age group (45-49 years), increasing to an almost 10 percent drop in the oldest group (65-70 years). Interestingly, a cross-sectional analysis of the same data greatly overestimated age-related cognitive declines in women, likely due to cohort differences such as fewer educational opportunities for women from earlier generations.
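The gap between the cross-sectional and longitudinal estimates can be illustrated with a toy calculation (all numbers invented). Suppose every cohort truly loses one point per decade after age 45, but each earlier-born cohort also peaked one point lower—for instance, because of fewer educational opportunities. Comparing age groups at a single time point then doubles the apparent rate of decline.

```python
import numpy as np

# Invented numbers: true within-person loss of 1 point per decade, plus a
# cohort effect in which each earlier-born cohort's peak score is 1 point
# lower than the next cohort's
true_decline = 1.0                      # points lost per decade within person
peak = {45: 50.0, 55: 49.0, 65: 48.0}   # current age -> that cohort's peak

# Score observed today for each cohort: peak minus decline since age 45
ages = np.array(sorted(peak))
scores = np.array([peak[a] - true_decline * (a - 45) / 10 for a in ages])

# The cross-sectional slope mixes the cohort gap with the true decline
cross_sectional = -np.polyfit(ages, scores, 1)[0] * 10
print(round(cross_sectional, 1))  # 2.0 points per decade: double the truth
print(true_decline)               # 1.0 point per decade actually lost
```

Following the same people over time, as Whitehall II does, separates within-person change from these between-cohort differences.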
In an accompanying editorial, Francine Grodstein at Brigham and Women’s Hospital, Boston, wrote, “Previous data show that modest differences in cognitive performance in earlier life predict larger differences in risk of dementia in later life…Singh-Manoux and colleagues have set a new benchmark for future research and, eventually, clinical practice. That is, efforts to prevent dementia may need to start in adults as young as 45 years.”
Trey Hedden at Massachusetts General Hospital, Boston, noted that, because of the large sample size, the study had immense power to detect small effects. He pointed out that for all cognitive skills except reasoning, the age-related decline over 10 years amounted to less than one additional test item missed. “Such small changes may not be detectable within a person in the clinic,” he wrote to ARF, but added, “This study does provide an important indication of how much cognitive change over a decade might be considered typical during middle adulthood, so that individuals exhibiting higher levels of change over time could be targeted for potential interventions.” (See full comment below.)—Madolyn Bowman Rogers
- Deary IJ, Johnson W, Houlihan LM. Genetic foundations of human intelligence. Hum Genet. 2009 Jul;126(1):215-32. PubMed.
- Does It Fly? Study Uncovers Genetic Influence on Brain Function
- Faulty DNA Repair Gene Leads to Cognitive Problems
- What Primates Can Tell Us About Normal Brain Aging
- Aging Primates—Brain Iron Up, Motor Skills and Plastic Synapses Down
- Studying the Oldest Old—Chances for Longevity Spelled Out in Genes?
- Histone Acetylation: Epigenetic Achilles’ Heel of Memory in Aging
- Mind Your Magnesium—Oral Compound Boosts Cognition in Rats
- Early ApoE4 Memory Effects, But Do You Really Want to Know?
- DC: Ways to Slow Brain Aging: Exercise, Estrogen, and Sleep?
- Deary IJ, Yang J, Davies G, Harris SE, Tenesa A, Liewald D, Luciano M, Lopez LM, Gow AJ, Corley J, Redmond P, Fox HC, Rowe SJ, Haggarty P, McNeill G, Goddard ME, Porteous DJ, Whalley LJ, Starr JM, Visscher PM. Genetic contributions to stability and change in intelligence from childhood to old age. Nature. 2012 Jan 18; PubMed.
- Singh-Manoux A, Kivimaki M, Glymour MM, Elbaz A, Berr C, Ebmeier KP, Ferrie JE, Dugravot A. Timing of onset of cognitive decline: results from Whitehall II prospective cohort study. BMJ. 2011;344:d7622. PubMed.
- Grodstein F. How early can cognitive decline be detected? BMJ. 2011;344:d7652. PubMed.