Discovering the Limitations of Using Intuition
In one empirical demonstration of how difficult it can be to understand even our own behavior, Nisbett and Wilson (1977) had college students read a passage describing a woman who was applying for a job as a counselor in a crisis intervention center. Unknown to the students, the descriptions of the interview were varied so that different students read different information about what occurred during the interview. Some students read that the woman had superb academic credentials, whereas others did not learn this information.
For some students the woman was described as having spilled a cup of coffee over the interviewer’s desk during the interview, whereas for others no such event was mentioned. After reading the information, the students first judged the woman they had read about in terms of her suitability for the job on rating scales such as how much they liked her and how intelligent they thought she was. They also indicated how they thought each of the behaviors they had read about (for instance, being highly intelligent or spilling coffee over everything) influenced their judgments.
On the basis of these data, the researchers were able to determine how the woman’s behaviors actually influenced the students’ judgments of her. They found, for instance, that being described as having excellent academic credentials increased ratings of intelligence and that spilling coffee on the interviewer’s desk actually increased how much the students liked her. But when the actual effects of the behaviors on the judgments were compared with the students’ reports about how the behaviors had influenced their judgments, the researchers found that the students were not always correct. Although the students were aware that information about strong academic credentials increased their judgments of intelligence, they had no idea that the applicant’s having spilled coffee made them like her more.
Still another way that intuition may lead us astray is that, once we learn about the outcome of a given event (for instance, when we read about the results of a research project), we frequently believe that we would have been able to predict the outcome ahead of time. For instance, if half of a class of students is told that research concerning interpersonal attraction has demonstrated that “opposites attract” and the other half is told that research has demonstrated that “birds of a feather flock together,” both sets of students will frequently report believing that they would have predicted this outcome before they read about it. The problem is that reading a description of a research finding leads us to think of the many cases we know of that support it, and thus makes it seem believable. The tendency to think that we could have predicted something that we probably could not have predicted is called the hindsight bias.
In sum, although intuition is useful for getting ideas, and although our intuitions are sometimes correct, they are not infallible. People’s theories about how they make judgments do not always correspond well to how they actually make decisions. And people believe that they would have predicted events that they in fact could not have, making research findings seem like they are just common sense. This does not mean that intuition is unimportant; scientists frequently rely on their intuition to help them solve problems. But because they realize that this intuition is frequently unreliable, they always back up their intuition empirically. Behavioral scientists believe that, just as research into the nature of electrons and protons guided the development of the transistor, so behavioral research can help us understand the behavior of people in their everyday lives. And these scientists believe that collecting data will allow them to discover the determinants of behavior and use this knowledge productively.