The term ‘complex trait’ scarcely does justice to the study of cognitive genetics. Combining biological, historical, ethical and definitional complexity, the study of ‘intelligence’ has long confounded its intended definition, ‘to reason and understand’. However, amidst the uncertainty that has plagued this area of research for over a century, a general consensus (amongst geneticists at least) is beginning to emerge. Firstly, ‘general cognitive ability’ (‘general intelligence’), first proposed by Spearman (Spearman 1927), is defined by the positive correlation (average 0.3) that exists between varied cognitive tests such as those used to measure vocabulary, processing speed and different facets of memory (Carroll 1993). Secondly, the hotly contested ‘nature versus nurture’ debate can officially be declared a draw, with genetic and environmental factors contributing roughly equally towards observed variation in ability (Bouchard & McGue 2003). As the era of cognitive genetics emerges, it brings with it a host of new questions, such as those pertaining to research quality, potential therapeutic benefits and scope for misuse.
Cognitive impairment in the elderly, caused by either the normal ageing process or dementia, is an increasing problem in developed countries, with enormous social and economic consequences. Research investigating the genetic basis of cognition is a new and rapidly developing field that may aid the development of new treatments for age-related cognitive deficit. Over the past 6 years, a number of quantitative trait loci (QTLs) have been associated with cognitive functioning in humans, including loci within the genes catechol-O-methyltransferase, brain-derived neurotrophic factor, muscle segment homeobox 1, serotonin receptor 2A (HTR2A), cholinergic muscarinic receptor 2, cathepsin D, metabotropic glutamate receptor and, most recently, the class II human leukocyte antigens. Unfortunately, inconsistency within the literature, a hallmark of almost all association studies investigating complex diseases and traits, is casting doubt as to which genes are truly associated with cognition and which are the result of Type 1 error. This review will highlight implicated intelligence QTLs, examine the probable reasons for the current discrepancies between reports and discuss the potential advantages that may be procured from the study of cognitive genetics.
Intelligence genes may predict dementia onset
The development of cognitive abilities such as recallable memory, spatial ability and communication skills was an evolutionary advantage that has made the human species one of the most successful and adaptive on the planet. During the last century, a decrease in mortality, largely caused by improvements in healthcare and nutrition, coupled with a reduced birth rate, has resulted in an increase in the proportion and number of older individuals within developed countries (Kirkwood & Austad 2000). Consequently, there has been a rapid increase in the prevalence of cognitive impairment caused by a combination of dementia and the normal ageing process. Severe cognitive impairment currently affects 1% of the population aged between 65 and 69 years and 18% of people over 90 years (Brayne et al. 2001), and accounts for 38% of all disabilities (Melzer et al. 1999). These figures are increasing, and currently no effective answers on how to combat the problem exist.
However, an increasing number of studies report that level of education and occupation correlates with the onset of dementia (Glatt et al. 1996; Katzman 1993; Sanchez et al. 2002; Scarmeas & Stern 2004; Stern et al. 1994). This has been explained by the concept of ‘brain reserve capacity’ (BRC). Individuals who attain a high level of education are thought to have a larger BRC. Given the same level of insult to the brain, such as that caused by trauma, inflammation or hypertension, individuals with a lower BRC will be more likely to show earlier symptoms of cognitive impairment (Mori et al. 1997). Although the exact nature of BRC is unknown, a combination of variables, including efficiency of neural connections, brain volume and synaptic density, may all play a role (Satz 1993). However, variation in education level is itself partly a consequence of BRC: high levels of education can be achieved if the individual's BRC is large enough, although factors such as an unfavourable socio-economic environment may reduce a person's chance of receiving the level of education that would maximize their BRC. A better predictor of dementia onset has therefore been shown to be a direct measurement of intelligence (Schmand et al. 1997). A greater understanding of the role that genes play in regulating cognitive ability may therefore help in the development of new and more efficient treatments designed to combat cognitive impairment. In addition, the identification of susceptibility genes may allow preventative treatment of high-risk individuals.
Cognitive ability and its decline
Brain-imaging techniques and cognitive testing have provided evidence that the normal ageing process affects some brain areas earlier and more severely than others (Rabbitt & Lowe 2000). The regions most affected include the prefrontal areas, which have a role in executive functions, and several regions involved in memory and learning, including the temporal cortex, hippocampus and limbic system (Eustache et al. 1995; Raz et al. 1998). There is also large interindividual variability in the rate of cognitive decline, which does not appear to reflect cognitive ability, socio-economic status or gender (Rabbitt et al. 2003). It is therefore reasonable to assume that not only the level of cognitive ability but also its trajectory over time will influence the onset of cognitive deficit.
An interesting question that several longitudinal investigations have explored is whether interindividual differences in decline are heritable. One such study investigated the rate of change over a 5-year period using 764 twin pairs (Reynolds et al. 2002). The results indicated that variation in the rate of decline was almost exclusively due to environmental factors. Tests of fluid intelligence (the ability to solve novel problems) showed no genetic contribution, whilst tests of processing speed produced heritability estimates of 0.01 in individuals under 65 years of age and 0.02 in those over 65 years. Memory tests were weakly heritable in those over 65 years (0.18) but showed no genetic contribution in those under 65 years. Another study, of almost 1000 Danish twins aged over 70 years with a follow up of up to 8 years, reported similar results for the rate of IQ decline (heritability, 0.06) (McGue & Christensen 2002). Contradicting these findings was a study reporting the decline in fluid and crystallized intelligence to be highly heritable (0.7 and 0.3, respectively) (McArdle et al. 1998). This was a smaller study, comprising 134 New York twins, but it had a longer follow-up period of up to 16 years.
The reasons for such conflicting results may include differences in cognitive tests, length of follow up, statistical models and non-standardization of age groups. However, given that fluid intelligence declines at approximately 6% per decade in those aged over 50 years (Rabbitt et al. 2004), it is unlikely that a follow up of 5 or 8 years (3–5% decline) would be long enough to estimate true heritability. More importantly, practice effects have been shown to mask the true rate of decline and therefore require adjustment in the statistical model (Rabbitt et al. 2004). Of the studies mentioned above, only the New York twin study adjusted for practice effects. Unfortunately, such contrasting results and the lack of replication of the New York twin study mean that the genetic contribution towards cognitive decline remains unknown. To date, no genes have been definitively associated with decline in non-demented individuals; although the ɛ4 allele of apolipoprotein E (APOEɛ4) has been implicated (Feskens et al. 1994; Jonker et al. 1998; Small et al. 2000), these findings have been challenged (Mayeux et al. 2001; Pendleton et al. 2002; Smith et al. 1998). In addition, genes associated with decline in dementia, such as the angiotensin-converting enzyme gene, have shown no association in non-demented people (Yip et al. 2002).
In contrast to the study of cognitive decline, extensive studies have shown that the level of ability has a strong genetic component (heritability, 0.5) (Bouchard & McGue 2003; Devlin et al. 1997). The relative ease of cross-sectional study design and this confirmed heritability have meant that the majority of researchers have opted to investigate the level of cognitive ability. Since 1998, over 25 publications have reported associations with more than a dozen QTLs that fall into three broad categories: development, synaptic transmission and immune regulation.
Genes involved in development
The human adult brain has an estimated one hundred billion neurons and between five and ten times that number of neuroglial cells. Both the proliferation and the differentiation of neurons during foetal development are under tight genetic regulation, with the heritability of brain volume, one of the strongest predictors of intelligence, estimated at 0.85 (Posthuma et al. 2002). It is therefore not surprising that several developmental genes have been associated with intelligence.
The vertebrate muscle segment homeobox 1 (MSX1) gene encodes a protein that functions as a transcriptional repressor during embryogenesis and promotes cellular proliferation in the developing central nervous system (Bendall & Abate-Shen 2000). MSX1 expression is inversely correlated with cellular differentiation (Bendall et al. 1999), possibly through its ability to up-regulate cyclin D1 expression (Hu et al. 2001; Skapek et al. 1995). A polymorphic microsatellite in the 3′ untranslated region of the MSX1 gene has been associated with intelligence in a cohort of 101 high-IQ children (mean IQ 136) and 101 controls (mean IQ 100) (Fisher et al. 1999). Further research by the same group has recently identified an intronic insertion within the gene that was observed at a significantly higher frequency in high-IQ children (IQ > 160) compared with controls (IQ 100) (unpublished). Mutations within the MSX1 gene have also been associated with Wolf–Hirschhorn syndrome (characterized by mental retardation, heart defects and facial clefting) (Ivens et al. 1990) and familial tooth agenesis (Lidral & Reising 2002).
Two metabolic genes that have been implicated in abnormal brain development and loss of cognitive ability are cystathionine β-synthase (CBS) and succinate-semialdehyde dehydrogenase (SSADH). Both genes were associated with intelligence using the same cohorts described for MSX1. Cystathionine β-synthase metabolizes the non-essential amino acid homocysteine, which at high levels causes neurotoxicity (Lipton et al. 1997), neural tube defects (Rosenquist et al. 1996) and disruption of gene methylation (Selhub & Miller 1992). A 68 base-pair insertion (CBS844ins68) located at the intron 7/exon 8 boundary (Sebastio et al. 1995) was found at a significantly lower frequency in the high-IQ children (Barbaux et al. 2000). The functional significance of this polymorphism is unknown, as mRNA containing the mutation is normally spliced (Sperandeo et al. 1996). Succinate-semialdehyde dehydrogenase is involved in the catabolism of γ-aminobutyric acid (GABA), which inhibits synaptic transmission and is neurotoxic at high concentrations (Blasi et al. 2002). The gene contains a functional non-synonymous exon 3 polymorphism (C>T, His538Tyr) that reduces enzyme activity (Cash et al. 1979). The presence of the T allele has been associated with reduced IQ in both case-control and family-based cohorts (Plomin et al. 2004).
Genes involved in synaptic transmission
Another group of QTLs implicated in cognitive ability are those involved in synaptic transmission. Neurotransmitter and related genes are fundamental to both working memory and long-term memory (Anagnostaras et al. 2003; Aura & Riekkinen 1999; Ellis & Nathan 2001) and have been the most extensively studied in cognitive genetics. Particularly well investigated is a non-synonymous functional polymorphism (G>A, Val108/158Met) in the catechol-O-methyltransferase (COMT) gene, in which the valine allele confers a fourfold increase in enzyme activity (Lotta et al. 1995). It is hypothesized that the common Val allele may impair executive function by increasing the rate of dopamine breakdown in the prefrontal cortex (Egan et al. 2001). Several groups have reported significant associations between this polymorphism and a variety of cognitive tests that measure executive function (Bilder et al. 2002; Diamond et al. 2004; Egan et al. 2001; Goldberg et al. 2003; Joober et al. 2002; Malhotra et al. 2002; Nolan et al. 2004; Rosa et al. 2004). Others have found no association (Plomin et al. 1995; Tsai et al. 2004a; Tsai et al. 2004b).
The importance of dopamine in cognitive functioning has prompted several groups to examine the role of the dopamine receptor D2 (DRD2). Two independent studies have reported an association between the DRD2 TaqI A1 allele and both long-term verbal memory and IQ (Bartres-Faz et al. 2002; Tsai et al. 2002). Although several groups have challenged these findings (Ball et al. 1998; Moises et al. 2001; Petrill et al. 1997), it is worth noting that the Bartres-Faz study used a cohort of cognitively impaired older individuals with a mean age of over 65 years, whereas the reports that found no association used either healthy adults under the age of 40 years (Moises et al. 2001) or children (Ball et al. 1998; Petrill et al. 1997). Given that there is both an age-related decrease in dopaminergic neurotransmission (Barili et al. 1998) and a reduction in DRD2 density associated with the TaqI A1 allele (Jonsson et al. 1999), it is likely that an older cohort will be more sensitive to its effects.
A promoter polymorphism in the adrenergic alpha 2A receptor gene (ADRA2A) has been associated with the total Brown ADD score and its memory and irritability subscores, as well as with the Buss-Durkee Hostility Inventory, accounting for between 1.8 and 8.3% of the variance (Comings et al. 2000). Another report by the same group described an association between the 1890A>T 3′UTR polymorphism in the cholinergic muscarinic receptor 2 (CHRM2) gene and IQ (Comings et al. 2003). This gene is of particular interest owing to its role in long-term potentiation (LTP) (Calabresi et al. 1998), a process believed to be fundamental to memory and learning. However, the largest effect reported so far is between a polymorphism (His452Tyr) in the serotonin receptor 2A (HTR2A) gene and memory (DeQuervain et al. 2003). Using two independent cohorts, this group reported that the presence of the tyrosine variant reduced episodic memory ability by 21%.
Three independent groups have found that a valine to methionine substitution in the 5′ region of the brain-derived neurotrophic factor (BDNF) gene influences episodic memory (Egan et al. 2003; Rybakowski et al. 2003; Tsai et al. 2004). The methionine allele impairs BDNF secretion (Egan et al. 2003), which is thought to be responsible for the reduced memory performance. Brain-derived neurotrophic factor also plays a role in both early phase LTP (increase in intracellular calcium and protein kinase activation) and late phase LTP (synaptic structural and functional changes) (Poo 2001; Xu et al. 2000). The same group that reported the initial BDNF finding has also recently reported an association between the metabotropic glutamate receptor 3 gene (GRM3) and both verbal list learning and verbal fluency, which involve the temporal and prefrontal lobes (Egan et al. 2004). The role of the associated haplotype remains unknown, as none of the investigated SNPs alters the coding sequence or splice sites.
Genes involved in the immune response
Immune responses within the healthy brain are kept to a minimum. Yet certain disease states, such as Alzheimer's disease (AD) and vascular dementia, as well as old age, can result in chronic activation of the immune system. In particular, there is a two- to fourfold increase in circulating cytokines and an increase in the number of activated microglia, which gain phagocytic ability and express human leukocyte antigens (HLA) (Dickson et al. 1991; Krabbe et al. 2004; Neumann 2001). A number of reports suggest that inflammatory mechanisms contribute towards cognitive dysfunction in non-demented individuals (Schmidt et al. 2002; Weaver et al. 2002; Yaffe et al. 2003). In support of these findings, one of the latest genes to be associated with cognition is HLA-DRB1 (Shepherd et al. 2004). The Shepherd study reported that HLA-DR1 and DR5 polymorphisms influence cognitive abilities in the elderly. In this edition, our group reports similar findings. In addition, we report a potential epistatic interaction between the cathepsin D (CTSD) gene (exon 2 C>T transition) and both the HLA-DR2 allele and APOE, supporting several independent findings (Menzer et al. 2001; Ntais et al. 2004; Papassotiropoulos et al. 2000; Vergelli et al. 1997). We have previously shown that the CTSD polymorphism is associated with fluid intelligence (Payton et al. 2003). In our current article, we extend this finding to show that it also influences other abilities, such as processing speed and memory. Cathepsin D is involved in a number of biological mechanisms, including antigen processing, protein turnover and apoptosis, although what appears to be important is that adequate amounts of CTSD reach the lysosomes. The CTSD T allele results in impaired transport of CTSD from the trans-Golgi network to the vesicles (Touitou et al. 1994).
In our cohort of volunteers, the presence of the T allele was associated with reduced cognitive abilities, suggesting that either insufficient enzyme reaching the lysosomes or an excess of secreted enzyme disrupts cognitive functioning.
It is also worth noting that CTSD is transported by the insulin-like growth factor 2 receptor (IGF2R), which was the first gene to be associated with intelligence (Chorney et al. 1998). Despite a subsequent failure to replicate the original finding (Hill et al. 1999), a role for IGF2R cannot be ruled out. The initial association was made using a microsatellite marker located within the 3′ untranslated region of the gene, but no other polymorphisms within the gene were investigated. Recent screening of the IGF2R gene using Transgenomic WAVE™ technology by our group has identified an exonic polymorphism that, in homozygous volunteers, was significantly associated with cognitive ability (unpublished data). Unfortunately, the frequency of the mutant allele is low (11%), and even though we used a cohort of 770 volunteers, the number of homozygous individuals was still too small to draw any firm conclusions. Validation of these results is therefore currently underway using a larger cohort.
Inconsistency within the literature
To date, with the exception of BDNF (investigated by three studies), there have been failures to replicate the COMT (Plomin et al. 1995; Tsai et al. 2004), DRD2 (Ball et al. 1998; Moises et al. 2001; Petrill et al. 1997), IGF2R (Hill et al. 2002) and APOE (Pendleton et al. 2002) associations. Indeed, an inability to replicate results using independent cohorts is a problem encountered in all association studies investigating the genetic basis of complex diseases and traits. An example is the search for genes that predispose to AD. In contrast to the majority of behavioural traits, AD has defined diagnostic and histological features that should make the search for causative genes easier. Nevertheless, of the 50 or more genes associated thus far, only APOE has been consistently implicated (Bertram & Tanzi 2004). Given the complexity of the human genome, these inconsistencies should not be surprising. There are over 30 000 genes and more than a million common variants (Sachidanandam et al. 2001) that will undoubtedly interact. Compounded by a limited understanding of the biological basis of cognition, the issue of multiple testing and the apparent insistence of many research groups that a few hundred samples are adequately powered, discriminating genuine intelligence genes from false-positive results is an increasing problem that is resulting in confusion and inefficient research. While Type 1 and Type 2 errors may be inevitable, there are several methodological considerations that should be employed by all research groups prior to publication.
Reported effect sizes in cognitive genetics are typically small, with larger studies reporting genetic contributions of approximately 1% (Comings et al. 2003; Payton et al. 2003; Plomin et al. 2004). It has therefore been estimated that a minimum of 1000 unselected volunteers would be required to break this 1% QTL barrier (depending upon allele frequency and exact odds ratio) (Cohen 1988). In spite of this estimate, current studies have used cohorts of between 26 and 1026 individuals, with a mean sample size of approximately 300. One such study, investigating COMT, used a cohort of 39 individuals and four different tests (Diamond et al. 2004). Having found an association with only one test, and failing to correct for multiple testing, the authors claimed that they had reached a level of specificity never before attempted. Insufficiently powered studies of this kind are likely to be the single largest contributor to the current inconsistencies.
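The 1000-volunteer figure can be approximated with the standard Fisher z-transform power formula for detecting a correlation, noting that a QTL explaining 1% of trait variance corresponds to r ≈ 0.1. The sketch below is an illustration only; Cohen's tables account for further details such as allele frequency and exact odds ratio:

```python
import math
from statistics import NormalDist

def n_for_correlation(r, alpha=0.05, power=0.80):
    """Approximate sample size needed to detect a correlation r with a
    two-sided test, using Fisher's z-transform (standard power formula)."""
    q = NormalDist().inv_cdf
    z_alpha = q(1 - alpha / 2)     # e.g. 1.96 for alpha = 0.05
    z_beta = q(power)              # e.g. 0.84 for 80% power
    z_r = math.atanh(r)            # Fisher z-transform of r
    return math.ceil(((z_alpha + z_beta) / z_r) ** 2 + 3)

# A QTL explaining 1% of variance (r = 0.1) needs on the order of a
# thousand unselected volunteers:
print(n_for_correlation(0.1, power=0.80))  # 783
print(n_for_correlation(0.1, power=0.90))  # 1047
```

Raising the required power from 80% to 90% pushes the estimate past the 1000-volunteer mark, consistent with the figure quoted above.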
Epistatic interactions, such as those reported between CTSD and HLA/APOE in this edition, can reduce power even further. It is therefore advisable that cognitive genetics groups focus their efforts more on power and less on throughput. Currently, our group is taking several approaches to address this issue. Firstly, we have expanded our cohort to over 2000 volunteers. Secondly, we are collaborating with the Social, Genetic and Developmental Psychiatry (SGDP) Centre in London and consequently have access to over 10 000 samples. Finally, we are establishing the collection of DNA from an initial 1000 high-IQ individuals. Selecting individuals from the extreme end of the IQ distribution can increase power considerably: for example, 200 people with an IQ of 145 provide power equivalent to that of 100 000 unselected subjects (Plomin et al. 2001).
Investigating more than a single polymorphism
Polymorphisms in close proximity are often preferentially inherited together in blocks called haplotypes. Because such polymorphisms are often in strong linkage disequilibrium, it is only necessary to genotype those [usually single-nucleotide polymorphisms (SNPs)] that represent the majority of haplotype diversity [haplotype-tagging SNPs (htSNPs)]. Only when all htSNPs within a gene have been genotyped in a sufficiently powered cohort and shown no association can the gene be claimed to have no role in cognitive abilities.
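As a toy illustration of the htSNP idea, the sketch below greedily picks tagging SNPs from a hypothetical matrix of pairwise r² (linkage disequilibrium) values: at each step it selects the SNP that covers the most as-yet-untagged SNPs. Real tagging methods operate on inferred haplotypes, so this is a simplification:

```python
def greedy_tag_snps(r2, threshold=0.8):
    """Greedy htSNP selection: repeatedly pick the SNP that tags
    (r^2 >= threshold) the largest number of uncovered SNPs.
    r2 is a symmetric matrix of pairwise LD values."""
    n = len(r2)
    uncovered = set(range(n))
    tags = []
    while uncovered:
        best = max(range(n),
                   key=lambda i: sum(1 for j in uncovered
                                     if r2[i][j] >= threshold))
        tags.append(best)
        uncovered -= {j for j in uncovered if r2[best][j] >= threshold}
    return tags

# Hypothetical LD block: SNPs 0-2 in strong mutual LD, SNP 3 independent,
# so two tags capture all four SNPs.
r2 = [[1.00, 0.90, 0.85, 0.10],
      [0.90, 1.00, 0.95, 0.10],
      [0.85, 0.95, 1.00, 0.20],
      [0.10, 0.10, 0.20, 1.00]]
print(greedy_tag_snps(r2))  # [0, 3]
```

Genotyping only the returned tags still distinguishes the major haplotypes, which is why htSNP panels are far smaller than the full set of known variants.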
To date, only one cognitive genetic study has examined more than a single polymorphism (Egan et al. 2004). One implication of this is that a negative result from a single polymorphism may dissuade others from examining the gene in sufficient detail. It is also possible that a gene may contain more than one functional polymorphism. An example is the serotonin transporter (5-HTT) gene, which has been associated with a number of psychiatric and behavioural traits including neuroticism (Lesch et al. 1996), anxiety-related traits (Sen et al. 2004), suicide (Lin & Tsai 2004), bipolar mood disorder (Ospina-Duque et al. 2000), attention-deficit hyperactivity disorder (Langley et al. 2003), impulsive aggressive behaviour (Retz et al. 2004) and psychoticism (Mata et al. 2004). This gene contains two functional polymorphisms (a 44 base-pair insertion/deletion within the promoter and a 16 or 17 base-pair intron 2 variable number tandem repeat), both of which influence gene expression. It has recently been demonstrated that analysing them in combination is a more powerful approach than analysing each polymorphism separately (Hranilovic et al. 2004). This work showed that classifying an individual as a high or low expresser, as is often done on the basis of a single locus, cannot be done accurately unless both loci are considered together.
Advances in technology are allowing research groups to screen large numbers of genes. By chance alone, when using a P value threshold of 0.05, one in 20 results will appear significant. Subdividing cohorts, for example by age, gender or presence or absence of disease, increases the number of tests and further compounds the problem of false positives. Replication prior to publication using a single independent cohort would reduce false-positive associations by a factor of 20. While a number of groups are using two or even three independent sample sets, the majority are still relying on results derived from a single cohort. Once again, collaboration would be an easy and cost-effective solution to this problem. An argument against this may be that different groups have used different testing methods, which makes comparisons difficult. In this case, an unrotated first principal component score could be used as a measure of general intelligence. Given that genetic correlations between different cognitive tests are approximately 0.8 (Petrill 1997), this approach would have limited impact on power. The use of family-based cohorts and the transmission/disequilibrium test would also counteract potential problems caused by population stratification (Schulze & McMahon 2002).
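The unrotated first principal component approach can be sketched with simulated data: four hypothetical test scores that each load on a shared latent factor g, from which the component score is then extracted and compared back against g:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 people: each of four test scores = 0.6*g + noise, so
# every test correlates ~0.6 with the latent factor (hypothetical data).
n = 500
g = rng.standard_normal(n)
scores = np.column_stack([0.6 * g + 0.8 * rng.standard_normal(n)
                          for _ in range(4)])

# Unrotated first principal component of the standardized scores:
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(z, rowvar=False))
pc1 = z @ eigvecs[:, -1]            # eigh sorts eigenvalues ascending

# The component score recovers the latent g well (correlation ~0.8)
print(round(abs(np.corrcoef(pc1, g)[0, 1]), 2))
```

Because PC1 pools the shared variance across tests, it provides a comparable general-ability score even when groups have administered different batteries.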
Utilizing new technology
Choosing candidate genes on the basis of biological function, or because they have been associated with conditions characterized by cognitive deficit such as Alzheimer's disease (Bossy-Wetzel et al. 2004) or X-linked mental retardation (Gecz & Mulley 2000), is a popular approach among researchers. However, there is currently no strong evidence that genes involved in neurodegenerative disorders or the mainly monogenic mental retardation disorders influence cognitive abilities in non-demented or non-retarded individuals. In addition, detecting QTLs of small effect using conventional means can be expensive and time consuming. A novel approach has therefore been taken by the SGDP that combines DNA pooling with microarray technology (Butcher et al. 2004). Selecting people from the extremes of the distribution, pooling their DNA and comparing allele frequencies with pooled controls is an accurate, cost-efficient and DNA-conserving method that has been employed in over 250 studies (Norton et al. 2004; Sham et al. 2002). Three cognitive genetics studies have used DNA pooling to screen chromosome 4 (147 markers) (Fisher et al. 1999), chromosome 22 (66 markers) (Hill et al. 1999) and the whole genome (1842 markers) (Plomin et al. 2001) but have thus far been unsuccessful in identifying any QTLs. However, given that there are more than a million common variants in the human genome, it would be fortunate if any of the two thousand markers used thus far were in sufficiently strong linkage disequilibrium with a genuine QTL for it to be detected. Microarray technology such as the Affymetrix GeneChips® can genotype up to 100 000 SNPs using a one-primer assay and will therefore increase the chance of detecting intelligence genes. This method has a reproducibility of 99.9%, an accuracy of 99.5% and a call rate in excess of 95% (Matsuzaki et al. 2004). Even though the 100K chips cost $1000 each, a combination of microarray technology and DNA pooling would be very cost effective.
For example, using five case and five control samples for replication purposes, the authors estimated that the approximate cost of such a venture would be only $10 000. Initial work has already been completed by the SGDP using pooled DNA from 2500 children stratified by IQ and genotyped on the Affymetrix 10K GeneChip® (Butcher et al. 2005). This work has produced some encouraging results and is currently being followed up using the 100K GeneChip®.
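In its simplest form, comparing allele frequencies between a high-scoring pool and a control pool reduces to a two-proportion test on chromosome counts. The sketch below uses hypothetical frequencies and deliberately ignores the measurement error introduced by estimating frequencies from a pool, which real pooling analyses must also model (e.g. Sham et al. 2002):

```python
from math import sqrt
from statistics import NormalDist

def pool_freq_test(f_case, f_ctrl, n_case, n_ctrl):
    """Two-proportion z-test on allele frequencies estimated from two
    DNA pools. Each individual contributes two chromosomes; pooling
    measurement error is ignored in this simplified sketch."""
    m_case, m_ctrl = 2 * n_case, 2 * n_ctrl        # chromosome counts
    p = (f_case * m_case + f_ctrl * m_ctrl) / (m_case + m_ctrl)
    se = sqrt(p * (1 - p) * (1 / m_case + 1 / m_ctrl))
    z = (f_case - f_ctrl) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided P value

# Hypothetical marker: allele frequency 30% in 200 high-IQ cases
# vs 25% in 200 controls
print(round(pool_freq_test(0.30, 0.25, 200, 200), 3))
```

A 5% frequency difference in pools of this size does not reach nominal significance, which illustrates why pooling studies favour large samples drawn from the extremes of the distribution.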
High-throughput screening techniques, adequate power and sufficient replication should contribute considerably to the number of reliable associations. Functional techniques may also be applied to add biological relevance to any positive result. Functional work performed for the initial reports of both BDNF and GRM3 (Egan et al. 2003, 2004) provided additional and compelling evidence that both genes contribute towards cognitive ability. Using a combination of written cognitive tests, a proton magnetic resonance spectroscopic imaging (MRSI)-based test of synaptic abundance and neuronal integrity, a test of hippocampal activation using functional magnetic resonance imaging (fMRI) and several in vitro techniques, the group showed that the BDNF polymorphism was not only associated with episodic memory but also affected hippocampal activation, intracellular trafficking and activity-dependent secretion. The use of written cognitive tests, fMRI/MRSI and post-mortem mRNA studies showed that specific haplotypes in the GRM3 gene were associated with reduced performance on cognitive tests that require hippocampal and prefrontal activation, with lower activity in these cortical regions and possibly with altered GRM3 mRNA and protein levels.
Future of cognitive genetics
Recent advances in technology, the employment of functional techniques and online resources such as the International HapMap project (http://www.hapmap.org) and the Sanger Centre's exon resequencing project (http://www.sanger.ac.uk/genetics/exon) mean that the future of cognitive genetics looks very promising. Nevertheless, despite the potential insights to be gained from the study of cognitive genetics, a great deal of controversy still surrounds the area. The nature vs. nurture debate continues, long after twin and adoption studies established that genes regulate cognitive ability and, surprisingly, even now that the first genes have been reported. What may well prove important is not only that genes and the environment contribute roughly equally but also that there is considerable interaction between the two. Recent reports in depression highlight the importance of gene–environment interactions (Lesch 2004), and it would be foolish to discount similar interactions in cognitive genetics.
Certainly, some ethical issues, such as those relating to the selection of embryos with a preferred trait or characteristic (‘designer babies’) and to population differences in intelligence, deserve more debate. However, it should be noted that the majority of variance in intelligence is observed between individuals, not between populations. In addition, because individual genes are likely to contribute only a small amount to the variance in cognitive ability, the massive number of potential combinations (some alleles will enhance cognitive ability and some will reduce it) would make the desired selection of an embryo almost impossible. It is also important to remember that, while a single disease or trait can be associated with multiple genes, the reverse is also true: a single gene can be associated with multiple diseases. For example, both CTSD and IGF2R have been implicated in cancer (Ellis et al. 1998; Leto et al. 2004). Selecting a gene to enhance a trait while unwittingly increasing susceptibility to a disease would prove a pointless exercise. In contrast, a huge benefit could be gained from the potential development of new therapies aimed at ameliorating cognitive deficits, such as those endured by the elderly.
Conferences such as the ‘molecular genetics of higher mental functioning’ held on Heron Island earlier this year are helping to bring together experts from diverse fields including psychiatry, brain imaging, genetics and proteomics with the aim of understanding this complex behavioural trait. Indeed, not understanding the basis of our ability when we have the capacity to do so would be contrary to what intelligence means: to reason and to understand.