Some thirty years ago, with the launch in 1990 by the Bush administration of the “Decade of the Brain,” neurocentrism took hold in the Western world — America, Japan, and Europe. It held on well into the aughts. Neurocentrism is the belief that the brain is the seat of the mind, that they are in some sense the same entity, and that therefore one can understand mental and psychic life by understanding the brain, which is often dubbed the most complex object in the universe, with its estimated eighty-six billion neurons and hundred trillion or so synaptic connections. As a consequence of discussions about the brain already underway in the 1980s between upper-level American science agencies, councils, and associations, the government awarded generous funding for research in neuroscience, psychology, and neurology. It aimed in large part to address the staggering cost of neurodegenerative diseases, which was (correctly, as it turned out) predicted to increase massively over the next decades, as well as to study the aetiology and the effects of neurological disorders and accidents. An underlying assumption of the program was that it would constitute one of the ultimate achievements of humankind to unravel the brain’s functioning. A similar hope was pinned on genetics, with the Human Genome Project launched in 1990, with a similar equivalence posited between genomes and selves. If one came to grips with the biology, in short, one would finally understand the nature of life, identity, and consciousness. There was an essence that one could seize. Science would yield ultimate truths. In those years a reductionism of mind and life to their constituent parts prevailed; it was galvanized and encouraged by the optimistic ethos of the time. Popular books about the brain, and also about genetics, flourished. 
To be sure, there existed corners of resistance to reductionism, in the name of phenomenological complexity, with philosophers of mind exploring the nature of consciousness — for instance, a Journal of Consciousness Studies was founded in 1994, which provided a forum for collaborations between philosophical speculation and empirical data, and interdisciplinary conferences on the topic began to take off then. But this resistance took place within rarefied academic spheres. Neurocentrism was easier for non-specialists to comprehend. Meanwhile the mind sciences grew and multiplied, separately from the biological neurosciences, insofar as the term “mind” designates not a physical entity but the abilities that allow organisms to function in and interact with the world. A “Decade of the Mind” was announced in 2007. The cognitive sciences yielded modular models of the mind, represented for a while as subdivided into mechanisms supposedly developed during the Pleistocene. Importantly, these cognitive sciences were a formidable and fertile response to the behaviorism that had preceded them, insofar as they supposed, in contrast to behaviorist assumptions, that there was indeed such a thing as a mind that could be studied. The association of the cognitive sciences with neuroscience then gave birth to cognitive neuroscience, which made use of imaging technologies to explore mental functions. The appearance in 1991 of functional magnetic resonance imaging (fMRI) — a technology that allows one to observe the brain in action — was a historic revolution whose impact on the imagination was not unlike that of the moon landing. It seemed to announce a bold, bright future when one could finally peer into places that had never before been visible. The first steps into this future were taken, however, with a measure of presentist and materialist hubris, and often at the cost of philosophically informed subtlety. 
Now, three decades later, research and funding continue, and rightly so — but the mood, the priorities, and the assumptions have radically changed. And so it is time to take stock of where the mind sciences are today, and what place these sciences now hold in the collective imagination, especially in light of the bewilderingly rapid evolution of computer science and, most recently, of artificial intelligence — an expression whose assumptions also need parsing. At its height, neurocentrism in its excitement generated countless claims about the cerebral location of mental functions, delivered in the media as so many revelations about the “place” of the deepest aspects of human experience — cognition, language, volition, or