"He has his hand in his waistband," George Zimmerman told the 911 dispatcher in Sanford, Fla., shortly before he shot and killed Trayvon Martin. "He has something in his hand.” Presumably Zimmerman thought that "something" was a gun. But Martin was unarmed.
It hardly exculpates Zimmerman to suggest that he thought Martin might be armed, because he was a neighborhood watch volunteer, not a cop, and he followed Martin even after the dispatcher specifically instructed him not to. But it's interesting to consider his case in light of a 2006 study by B. Keith Payne, a psychologist at the University of North Carolina, which the Russell Sage Foundation tweeted earlier today. The paper's subject is the relationship between "weapon bias" and race.
In the study, subjects were asked to distinguish between images of harmless hand tools and images of guns, both projected onto a screen. Immediately before each image appeared, a lightning-quick (more or less subliminal) image of a white face or a black face flashed on the screen. The subjects were told to ignore the faces and focus on identifying the objects. The test was administered at two speeds. In one version, subjects proceeded at their own pace. In the other version, subjects had to respond within half a second.
As you would expect, responses in the self-paced version were highly accurate. But respondents nonetheless correctly identified guns more quickly after a black face was flashed than after a white face. In addition, to the very small extent there were false positives for guns (about 5 percent of responses), these were likelier to occur after a black face was flashed.
Responses in the split-second version were fairly inaccurate, with about one-third of the responses false positives. But there were more false positives after a black face was flashed (nearly 40 percent) than after a white face was flashed (about 30 percent). "When a black face precedes a gun," Payne explained, "stereotypes and intent are in concert." But "when a black face precedes a harmless object, stereotypes and intent are in opposition."

Payne wrote that after his findings were published in an earlier paper, he received two letters objecting to them. One letter, from a retired police officer, complained that Payne was saying cops are likely to be biased in split-second decisions (as in the case of Amadou Diallo, which inspired the study). The other complained that Payne was excusing racial bias among cops by suggesting it occurred without conscious intent. Payne observed that the two complaints, while contradictory, were both hard to dismiss.
Zimmerman was consciously suspicious of Martin well before Martin put his hand in (or, possibly, didn't put his hand in) his waistband, and in all likelihood what made Martin look suspicious to Zimmerman was the fact that he was black. Zimmerman apparently muttered something derogatory about Martin, and while accounts differ as to what that was (it's very hard to make out on the tape), the comment was plural, suggesting he was mentally placing Martin into some larger undesirable group, probably a racial one. So in Zimmerman's case, whatever prejudice existed likely wasn't all that subconscious. But wouldn't a consciously bigoted person be even more apt than an ordinary person to "see" a gun that didn't exist in the hand of an African American?
What Payne's paper mainly tells me is that Florida's "stand your ground" law, which allows a person to claim self-defense even when that person isn't at home and has a plausible opportunity to escape, is catastrophically bad public policy. If people are more likely to imagine guns in the hands of black people than in the hands of white people, then the result will be a disproportionate number of deaths of innocent black people. And if people are pretty likely to imagine guns that aren't there even in the hands of white people, then the result will be too many deaths of innocent people of all races. Even Malcolm Gladwell, whose book Blink celebrated the efficacy of snap judgments, surely believes that opportunities to make split-second decisions about whether one's life is threatened should be sharply circumscribed, because getting it right most of the time (i.e., 60 to 70 percent of the time) is nowhere close to accurate enough.