Like all vertebrates, you have a blind spot in the retina of each eye. This region, called a scotoma (from the Greek word for darkness), has no light-sensitive cells, and therefore light arriving at that spot has no path to the visual areas of your brain.
Paradoxically, you can "see" your own blind spot. Try it by looking with one eye at the plus sign in the middle of the rectangle on the opposite page. Cover the other eye with one hand and hold the page at arm's length in front of you, then slowly bring the rectangle closer while maintaining your focus on the plus sign. When the page is about six inches away, the black disc on the same side as your open eye will disappear. It will reappear as you bring the page closer still. The moment of disappearance tells you when light from that disc is falling on the blind spot of your open eye. Here's a bonus: If you shift the gaze of your open eye to the still-visible disc on the other side, the plus sign will disappear!
You may have noticed something strange in the location of the vanished disc. When it disappeared, it left no blank spot — no hole in the grid background. You saw an unbroken grid. Your brain did something quite remarkable — it filled in the blind spot with something that made reasonable sense — a continuation of the same grid that was visible everywhere else in the rectangle.
A much more dramatic form of blindness than the one you just experienced occurs in a pathological condition called blindsight, which involves damage to the brain's visual cortex. Patients with this damage show the striking behavior of accurately reaching for and grasping an object placed in front of them, even while having no conscious visual experience of the object. If you place a hammer before the patient and ask, "Do you see something in front of you?" the patient will answer, "No, I don't." But ask the patient to reach for and grasp the hammer and the patient who just said it was invisible will do so successfully! This seemingly bizarre phenomenon happens because the condition of blindsight leaves intact subcortical retina-to-brain pathways that can guide visual behavior, even in the absence of consciously seeing the hammer.
Rather than an effect of visual perception, this book focuses on another type of blindspot, one that contains a large set of biases and keeps them hidden. This hidden-bias blindspot shares a feature with the blind spot that you just experienced via the image of the grid and discs — we can be unaware of hidden biases in the same way we are unaware of the retinal scotoma in each of our eyes. This blindspot also shares a feature with the dramatic and pathological phenomenon of blindsight. Just as patients who can't "see" a hammer can still act as if they do, hidden biases are capable of guiding our behavior without our being aware of their role.
What are the hidden biases of this book's title? They are — for lack of a better term — bits of knowledge about social groups. These bits of knowledge are stored in our brains because we encounter them so frequently in our cultural environments. Once lodged in our minds, hidden biases can influence our behavior toward members of particular social groups, but we remain oblivious to their influence. In talking with others about hidden biases, we have discovered that most people find it unbelievable that their behavior can be guided by mental content of which they are unaware.
In this book we aim to make clear why many scientists, ourselves very much included, now recognize hidden-bias blindspots as fully believable because of the sheer weight of scientific evidence that demands this conclusion. But convincing readers of this is no simple challenge. How can we show the existence of something in our own minds of which we remain completely unaware?
Some years ago, we presented people with a test that could reveal possible hidden bias — a test of their relative preference for two American cultural icons: Oprah Winfrey and Martha Stewart. A perfect and humorous example of just how unbelievable people find the idea that their behavior might be guided by information that lies in their blindspot arrived via this email: "Dear Harvard People: There is no way that I prefer Martha Stewart over Oprah Winfrey. Please fix your tests. Sincerely, Frank."
We know what Frank means. Frank does know, in the everyday sense of knowing, that his fondness for Oprah exceeds that for Martha. And, as his message indicates, Frank finds it simply unbelievable that his mind could additionally possess a preference about which he has no conscious knowledge. Therefore, it's the test that needs to be fixed!
The self-administered test that Frank found to be so flawed is the Implicit Association Test, which we as well as many others have been studying since 1995. Just as the rectangle with the black discs allows us to see the otherwise hidden retinal blind spot, the Implicit Association Test has enabled us to reveal to ourselves the contents of hidden-bias blindspots. And where the demonstration of the retinal blind spot allows us to know that the visual blind spot exists but not much more, the Implicit Association Test (IAT) lets us look into the hidden-bias blindspot and discover what it contains.
The two of us met in Columbus, Ohio, in 1980 when Mahzarin arrived from India as a PhD student to work with Tony at Ohio State University. The decade of the 1980s brought significant changes to our branch of psychology. Psychology was on the verge of what can now — thirty years later — be recognized as a revolution triggered by new methods that could reveal potent mental content and processes that were inaccessible to introspection. The two of us sought to learn whether these methods could be sufficiently developed to reveal and explain these unseen influences on social behavior. Looking back to that period, we can see how fortunate we were to be swept into the vortex of this revolution.
The still-growing surge of research on unconscious mental function has already dramatically changed how human behavior is understood. A quarter century ago, most psychologists believed that human behavior was primarily guided by conscious thoughts and feelings. Nowadays the majority will readily agree that much of human judgment and behavior is produced with little conscious thought. A quarter century ago, the word "unconscious" — having fallen out of favor in scientific psychology earlier in the twentieth century — was barely to be found in the scientific journals that we read and in which we published our research. Nowadays, the term "unconscious cognition" appears frequently, although it was surpassed in the 1990s by the related term "implicit cognition." A quarter century ago, psychologists' methods for understanding the mind relied mostly on asking people to report their mental states and intentions. Nowadays, research methods are much more diverse, including many that do not rely at all on research participants' reports on the contents of their minds or the causes of their behavior.
Readers who are fond of endnotes will discover our reliance on the scientists of the past eighty years, in whose footsteps we readily follow. Two of these predecessors stand out as giants with shoulders broad enough to accommodate many later researchers, us among them. Gunnar Myrdal led the multi-year collaborative effort that produced An American Dilemma in 1944, which converged with other forces to put race discrimination in the United States on the national agenda, where it remains. Gordon Allport, writing The Nature of Prejudice in 1954, gave the scientific study of prejudice a foundation and organization that continues, in the twenty-first century, to inspire new scientific work.
Like the late United States senator Daniel Patrick Moynihan, we believe that people have a right to their own opinion but not a right to their own facts. This is easier said than done, because what constitutes a fact is often unclear and even contentious. Political satirist Stephen Colbert coined the term truthiness, defined as the tendency to accept as true propositions that one wishes to be true, ignoring the usual verification standards for facts.
In poking fun at truthiness — by presuming to favor it over genuine facts — Colbert, the pseudo-conservative, quipped, "I don't trust books. They're all fact, no heart." To avoid indulging in truthiness of our own, we have chosen to stick closely to evidence, especially experiments whose conclusions reflect widely shared consensus among experts. In other words, we have opted, consistently and consciously, for more fact and less heart.
Like other scientists, we do not have the luxury of believing that what appears true and valid now will always appear so. Inevitably, future knowledge will exceed and replace present understanding. But if we have done our job modestly well, it may take a few decades for that to happen to the conclusions reached in this book, among them the idea that hidden-bias blindspots are so widespread that many good people have them.
It is with some trepidation that we refer to "good people" in this book's subtitle. We have no special competence (let alone the moral authority) to judge who is good and who is not. By "good people" we refer to those, ourselves included, who intend well and who strive to align their behavior with their intentions. Our highest aim for this book is to explain the science sufficiently so that these good people will be better able to achieve that alignment.
From BLINDSPOT by Mahzarin R. Banaji and Anthony G. Greenwald. Copyright 2013 by Mahzarin Banaji and Anthony Greenwald. Excerpted by permission of Delacorte Press, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.