GNN - Genome News Network  
Genome: The Autobiography of a Species in 23 Chapters
by Matt Ridley

Reviewed by
Kevin Davies, Ph.D.

Book Reviews



The story of us

The identity of all 100,000 or so human genes will be revealed within the next year or two. Ideal timing then, says Kevin Davies, for Matt Ridley's riveting exposition of DNA and destiny.

Fourth Estate 1999, £18.99

The cover of a recent issue of Time featured a cuddly infant clutching a model of the double helix, under a teasing headline: 'The I.Q. Gene?' The magazine reported excitedly that Joe Tsien, a professor at Princeton University, had engineered a strain of super-intelligent mice simply by inserting a few extra copies of a single gene (which codes for a key protein involved in transmitting chemical signals between neurons in the brain). Employing a battery of psychological tests appropriately adapted to small, furry, four-legged creatures—finding hidden platforms in water tanks, sniffing out unfamiliar objects in cages, and so on—Tsien found that his genetically modified mice could store and recall information far better than their hapless littermates. He accordingly nicknamed his strain of supermice 'Doogie', after the precocious title character of Doogie Howser, M.D., a truly unmemorable American sitcom. Tsien's dramatic results, first published in the prestigious journal Nature, leave no doubt that tinkering with the chemistry of the brain can boost short-term memory. In rodent terms, at least, that's enough to put you at the head of the bell curve.

But for all the plaudits, what possible bearing do these studies have on human intelligence? In the interests of fairness, Time also saw fit to print a commentary by the eminent evolutionary biologist Stephen Jay Gould, sharply criticizing the absurdly naïve notion of a gene controlling I.Q. Thus, the 'Doogie' mice have unwittingly become the latest fodder in the fractious—and somewhat tedious—debate between those who believe that behaviour is genetically hardwired, and the 'not-in-our-genes' crowd, which insists that cognitive development is the product of nurture, not nature.

This fracas has been simmering for years, fueled by a steady stream of gaudy media headlines invoking genes for almost every facet of human development and behaviour. You've seen the headlines: the infidelity gene, the gay gene, the novelty-seeking gene, the aggression gene, not to mention genes for baldness, body odour and bed-wetting. While some of these controversial claims have merit, others founder under closer scrutiny. Earlier this year, for example, Dean Hamer's celebrated 'gay gene' hypothesis suffered a serious blow when a Canadian group failed to replicate his controversial 1993 findings.

Contrary to what one might think, however, not all geneticists are striving to explain human nature in terms of a string of DNA letters. Jane Gitschier, a gene hunter at the University of California, captured the silliness of her colleagues in a cartoon of the human Y chromosome, a decrepit bundle of DNA that contains the gene that triggers male fetal development but precious little else. This caricature of the male chromosome depicted the locations of hypothetical genes associated with classic gender-specific behavioural traits: 'channel flipping', 'inability to express emotion over the phone', 'refusal to ask for directions', 'selective hearing loss', and, of course, 'air guitar' (older scientists tend to prefer 'air violin'). Sad to say, such single genes patently do not exist—although one could argue that all of these habits arise, ultimately, from the Y-borne gene that instigates male development.

Tedious or not, the controversy over the role of genes in shaping human nature is intensifying as the race to decipher the complete sequence of the human genome—3 billion letters of DNA—reaches its conclusion in the next year or so. The identification of all 100,000 human genes will have a profound impact on the practice of medicine, our understanding of the human mind, and our view of humanity in general. Hence the arrival of Matt Ridley's new book, Genome: The Autobiography of a Species in 23 Chapters, is particularly welcome: it offers one of the most insightful and lively accounts of what we are learning, and might find out in the future, about the book of man.

Ridley's approach is very simple yet highly effective: despite its billing as an 'autobiography', Genome is actually a collection of 23 short stories. The central character in each chapter is one of the genes found on the 23 pairs of human chromosomes, upon which Ridley cleverly layers a mixture of historical, scientific, and medical insights to illustrate a dazzling array of fundamental features of human biology.

Ridley's idea for the book was inspired by a discussion with a Harvard University evolutionary biologist, who happened to remark that his 'favourite' chromosome was number 15—the home of a cluster of extraordinary genes whose activity depends on whether they were passed down from the father or the mother. (In fact, many researchers acknowledge having 'pet' chromosomes. For example, MIT's Eric Lander, who trained as a mathematician, chose to sequence chromosome 17 for the Human Genome Project because he considers 17 to be the most interesting of all numbers.) So the story of chromosome 1 begins with a gene potentially implicated in the origin and evolution of life on earth. From here, the author embraces subjects as diverse as disease, personality, evolution, politics, conflict, free will and, not surprisingly, intelligence.

Over the years, some chromosomes have become almost synonymous with certain genes. A perfect example is that of chromosome 4 and Huntington's disease. The mutation that causes this fatal and incurable neurodegenerative disease was traced to chromosome 4 back in 1983. This early success came courtesy of the heroic efforts of Nancy Wexler, a Columbia University researcher who collected hundreds of DNA samples from HD sufferers among the inhabitants of three villages on the shores of Lake Maracaibo, Venezuela. Wexler, whose mother died of HD, marshaled a group of the world's top geneticists in the desperate ten-year quest for the gene. And yet the discovery of the HD gene has produced more questions than answers. HD is a dominantly inherited disease, meaning that individuals have a 50:50 chance of inheriting the gene if one of their parents had the disease. Is it worth taking the simple genetic test to find out whether you have inherited a fatal disease gene, or should you live with the burden of uncertainty? Could a positive diagnosis at an early age (decades before symptoms materialize) lead to discrimination in getting a job or insurance? Wexler sums up these awful dilemmas with the words of Tiresias to Oedipus: "It is but sorrow to be wise when wisdom profits not."

However, as Ridley stresses on more than one occasion, 'genes are not there to cause disease.' By drawing on evolutionary psychology, molecular anthropology, and something called 'psychoneuroimmunology', he paints a panoramic view of genes and their influence on personality, memory and language development, as well as delving into human pre-history and eugenics. The writing is crisp, amusing, and frequently provocative. Ridley argues uncompromisingly that the increase in asthma is attributable to today's ultra-hygienic society, which minimizes exposure to mycobacteria and leaves the immune system in a precarious, hyperactive state. Later, while discussing stress, Ridley spotlights studies that purport to demonstrate that the incidence of cardiovascular disease is inversely proportional to job status—proof, asserts Ridley, that 'your heart is at the mercy of your pay grade'. This is an intriguing idea, but it is contradicted elsewhere in the book when Ridley correctly points out the importance of genes in influencing heart disease. Indeed, hereditary forms of heart disease such as familial hypercholesterolaemia and Tangier disease show unequivocally the importance of the body's handling of cholesterol for good health.

Ridley's bold 'one-gene-one-chromosome' gambit only starts to unravel at the end of the book. Down syndrome, the most common genetic form of mental retardation, is usually caused by an extra copy of chromosome 21, which Ridley discusses as part of an excellent chapter on eugenics. In fact, researchers now think that the symptoms of Down syndrome are chiefly attributable to an extra dose of just a few genes. Studies on one of these genes, called 'minibrain' (named after studies of the corresponding gene in fruit flies), show promise for developing new forms of treatment for Down syndrome in years to come.

In the final chapter on chromosome 22, Ridley conjures up a hypothetical gene with which to tackle the thorny subject of inheritance and free will. Ironically, scientists at the Sanger Centre in Cambridge (with colleagues in Japan and the United States) have just celebrated the complete sequencing (ignoring a few gaps left for technical reasons) of chromosome 22—a major landmark in the brief history of the Human Genome Project. The cataloguing of almost 1,000 genes on this chromosome should enable researchers to identify genes for cancer, schizophrenia and many other disorders in the next year or so.

Within six months, researchers hope to sketch out the sequence of most of the 3 billion letters in the human genome—what one Nobel laureate termed biology's 'Holy Grail'. The current director of the Human Genome Project, Francis Collins, says the work is as important as, if not more important than, the moon landings and the Manhattan Project. But despite leading an international alliance of researchers, supported largely by the NIH and the Wellcome Trust, Collins is desperately fending off a dramatic late bid to pirate the sequence by Craig Venter, a brash, rich, smart American DNA sequencer. Armed with some 300 state-of-the-art DNA sequencing machines and an $80-million Compaq supercomputer to assemble millions of DNA sequences, Venter's new company, Celera Genomics, predicts it can complete the sequence within a year or two, snapping up the patent rights to thousands of genes in the process.

No matter who wins this race, the ultimate decoding of human DNA paves the way for monumental advances in medical research and a golden opportunity to chronicle man's history and origins. As Ridley points out, the human genome 'is a record of our history written in the code for a working machine.' This is the true autobiography of our species—a 4-billion-year-old Fortran code that is almost ready for publication.

While the identification of 100,000 genes will be a major landmark in human history, equally important will be the description of the millions of variations in DNA that exist between any two individuals. It is these seemingly trivial variations that control the activity of our genes, sculpting each person's unique physical appearance and mental development. Within perhaps 10 to 20 years, doctors could be handing patients a CD-ROM containing their complete DNA sequence, along with a detailed profile of present and future health risks and customized medicines. The drug industry is staking hundreds of millions of pounds on genome research to boost drug discovery and to match the best drugs to a patient's genetic make-up.

As biomedical research hurtles into the post-genomic era, there will be many more audacious claims for new genes underpinning different elements of human behaviour. Interestingly, Ridley dismisses concerns in some quarters that the discovery of personality genes could one day usher in a brave new world of human genetic engineering, not on ethical grounds necessarily but because there are simply too many genes involved. Arguments that human ingenuity will be thwarted by failings of technology are seldom convincing, however, and one should not underestimate the will of the people. No less an authority than James Watson once said, "If we could honestly promise young couples that we knew how to give them offspring with superior character, why should we assume they would decline? If scientists find ways to greatly improve human capabilities, there will be no stopping the public from happily seizing them."
