Some thirty years ago, when the Bush administration launched the "Decade of the Brain" in 1990, neurocentrism took hold across America, Europe, and Japan, and it held on well into the aughts. Neurocentrism is the belief that the brain is the seat of the mind, that the two are in some sense the same entity, and that one can therefore understand mental and psychic life by understanding the brain, often dubbed the most complex object in the universe, with its estimated eighty-six billion neurons and hundred trillion or so synaptic connections.

Building on discussions about the brain already underway in the 1980s among high-level American science agencies, councils, and associations, the government awarded generous funding for research in neuroscience, psychology, and neurology. The program aimed in large part to address the staggering cost of neurodegenerative diseases, which was predicted (correctly, as it turned out) to increase massively over the following decades, and to study the etiology and effects of neurological disorders and injuries. An underlying assumption was that unraveling the brain's functioning would constitute one of the ultimate achievements of humankind.

A similar hope was pinned on genetics: the Human Genome Project, also launched in 1990, posited an analogous equivalence between genomes and selves. If one came to grips with the biology, in short, one would finally understand the nature of life, identity, and consciousness. There was an essence one could seize; science would yield ultimate truths. In those years a reductionism of mind and life to their constituent parts prevailed, galvanized by the optimistic ethos of the time. Popular books about the brain, and about genetics too, flourished. To be sure, there existed corners of resistance to this reductionism.