{"id":60,"date":"2016-08-09T17:04:14","date_gmt":"2016-08-09T17:04:14","guid":{"rendered":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/?post_type=chapter&#038;p=60"},"modified":"2016-08-09T17:04:14","modified_gmt":"2016-08-09T17:04:14","slug":"how-we-use-our-expectations","status":"publish","type":"chapter","link":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/chapter\/how-we-use-our-expectations\/","title":{"raw":"How We Use Our Expectations","rendered":"How We Use Our Expectations"},"content":{"raw":"<div class=\"bcc-box bcc-highlight\">\n<h3>Learning Objectives<\/h3>\n<ol><li>Provide examples of how salience and accessibility influence information processing.<\/li>\n\t<li>Review, differentiate, and give examples of some important cognitive heuristics that influence social judgment.<\/li>\n\t<li>Summarize and give examples of the importance of social cognition in everyday life.<\/li>\n<\/ol><\/div>\nOnce we have developed a set of schemas and attitudes, we naturally use that information to help us evaluate and respond to others. Our expectations help us to think about, size up, and make sense of individuals, groups of people, and the relationships among people. If we have learned, for example, that someone is friendly and interested in us, we are likely to approach them; if we have learned that they are threatening or unlikable, we will be more likely to withdraw. And if we believe that a person has committed a crime, we may process new information in a manner that helps convince us that our judgment was correct. In this section, we will consider how we use our stored knowledge to come to accurate (and sometimes inaccurate) conclusions about our social worlds.\n<h2>Automatic versus Controlled Cognition<\/h2>\nA good part of both cognition and social cognition is spontaneous or automatic. 
<strong>Automatic cognition<\/strong> refers to<em> thinking that occurs out of our awareness, quickly, and without taking much effort<\/em> (Ferguson &amp; Bargh, 2003; Ferguson, Hassin, &amp; Bargh, 2008).\u00a0The things that we do most frequently tend to become more automatic each time we do them, until they reach a level where they don\u2019t really require us to think about them very much. Most of us can ride a bike and operate a television remote control in an automatic way. Even though it took some work to do these things when we were first learning them, it just doesn\u2019t take much effort anymore. And because we spend a lot of time making judgments about others, many of these judgments,\u00a0which are strongly influenced by our schemas, are made quickly and automatically (Willis &amp; Todorov, 2006).\n\nBecause automatic thinking occurs outside of our conscious awareness, we frequently have no idea that it is occurring and influencing our judgments or behaviors. You might remember a time when you returned home, unlocked the door, and 30 seconds later couldn\u2019t remember where you had put your keys! You know that you must have used the keys to get in, and you know you must have put them somewhere, but you simply don\u2019t remember a thing about it. Because many of our everyday judgments and behaviors are performed automatically, we may not always be aware that they are occurring or influencing us.\n\nIt is of course a good thing that many things operate automatically because it would be extremely difficult to have to think about them all the time. If you couldn\u2019t drive a car automatically, you wouldn\u2019t be able to talk to the other people riding with you or listen to the radio at the same time\u2014you\u2019d have to be putting most of your attention into driving. On the other hand, relying on our snap judgments about Bianca\u2014that she\u2019s likely to be expressive, for instance\u2014can be erroneous. 
Sometimes we need to\u2014and should\u2014go beyond automatic cognition and consider people more carefully. <em>When we deliberately size up and think about something, such as another person,\u00a0<\/em>we call it <strong>controlled cognition<\/strong>.\u00a0Although you might think that controlled cognition would be more common and that automatic thinking would be less likely, that is not always the case. The problem is that thinking takes effort and time, and we often don\u2019t have much of either to spare.\n\nIn the following Research Focus, we consider an example of automatic cognition in a study that uses a common social cognitive procedure known as <strong>priming<\/strong>,\u00a0<em>a technique in which information is temporarily brought into memory through exposure to situational events, which\u00a0can then influence judgments entirely out of awareness.<\/em>\n<div class=\"textbox shaded\">\n<h3>Research Focus<\/h3>\nBehavioral Effects of Priming\n\nIn one demonstration of how automatic cognition can influence our behaviors without us being aware of them, John Bargh and his colleagues (Bargh, Chen, &amp; Burrows, 1996)\u00a0conducted two studies, each with the same procedure. In the experiments, they showed college students sets of five scrambled words. The students were to unscramble the five words in each set to make a sentence. For half of the research participants, the words were related to the stereotype of elderly people. These participants saw words such as \u201cin Florida retired live people\u201d and \u201cbingo man the forgetful plays.\u201d\n\nThe other half of the research participants also made sentences but did so using words that had nothing to do with the elderly stereotype. 
The purpose of this task was to prime (activate) the schema of elderly people in memory for some of the participants but not for others.\n\nThe experimenters then assessed whether the priming of elderly stereotypes would have any effect on the students\u2019 behavior\u2014and indeed it did. When each research participant had gathered all his or her belongings, thinking that the experiment was over, the experimenter thanked him or her for participating and gave directions to the closest elevator. Then, without the participant knowing it, the experimenters recorded the amount of time that the participant spent walking from the doorway of the experimental room toward the elevator. As you can see in Figure 2.8, \"Automatic Priming and Behavior,\" the same results were found in both experiments\u2014the participants who had made sentences using words related to the elderly stereotype took on the behaviors of the elderly\u2014they walked significantly more slowly (in fact, about 12% more slowly across the two studies) as they left the experimental room.\n\n[caption id=\"attachment_2659\" align=\"alignnone\" width=\"400\"]<a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-8.png\"><img src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165212\/Figure-2-8.png\" alt=\"Automatic priming and behaviour\" class=\"wp-image-2659\" height=\"147\" width=\"400\"\/><\/a> Figure 2.8 Automatic Priming and Behavior. 
In two separate experiments, Bargh, Chen, and Burrows (1996) found that students who had been exposed to words related to the elderly stereotype walked more slowly than those who had been exposed to more neutral words.[\/caption]\n\n\u00a0\n\nTo determine if these priming effects occurred out of the conscious awareness of the participants, Bargh and his colleagues asked a third group of students to complete the priming task and then to indicate whether they thought the words they had used to make the sentences had any relationship to each other or could possibly have influenced their behavior in any way. These students had no awareness of the possibility that the words might have been related to the elderly or could have influenced their behavior.\n\nThe point of these experiments, and many others like them, is clear\u2014it is quite possible that our judgments and behaviors are influenced by our social situations, and this influence may be entirely outside of our conscious awareness. To return again to Bianca, it is even possible that we notice her nationality and that our beliefs about Italians influence our responses to her, even though we have no idea that they are doing so and really believe that they have not.\n\n<\/div>\n\u00a0\n<h2>Salience and Accessibility Determine Which Expectations We Use<\/h2>\nWe each have a large number of schemas that we might bring to bear on any type of judgment we might make. When thinking about Bianca, for instance, we might focus on her nationality, her gender, her physical attractiveness, her intelligence, or any of many other possible features. And we will react to Bianca differently depending on which schemas we use. 
Schema activation is determined both by the salience<em>\u00a0<\/em>of the\u00a0characteristics of the person we are judging and by the current activation or cognitive accessibility<em>\u00a0<\/em>of the schema.\n<h2>Salience<\/h2>\n[caption id=\"attachment_922\" align=\"alignnone\" width=\"350\"]<a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/05\/people.png\"><img class=\"wp-image-922\" alt=\"people\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165229\/people-1024x885.png\" height=\"303\" width=\"350\"\/><\/a> Figure 2.9 Which of these people are more salient and therefore more likely to attract your attention?<br\/> Source: Man with a moustache (<a href=\"http:\/\/commons.wikimedia.org\/wiki\/File:Man_with_a_moustache,_Chambal,_India.jpg\">http:\/\/commons.wikimedia.org\/wiki\/File:Man_with_a_moustache,_Chambal,_India.jpg<\/a>) by yann used under CC BY-SA 3.0 (<a href=\"http:\/\/creativecommons.org\/licenses\/by-sa\/3.0\/deed.en\">http:\/\/creativecommons.org\/licenses\/by-sa\/3.0\/deed.en<\/a>). Jill Jackson (<a href=\"https:\/\/www.flickr.com\/photos\/kriskesiak\/6493819855\/\">https:\/\/www.flickr.com\/photos\/kriskesiak\/6493819855\/<\/a>) by Kris Kesiak used under CC BY-NC 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/<\/a>). Amelia earhart (<a href=\"http:\/\/en.wikipedia.org\/wiki\/File:Amelia_earhart.jpeg\">http:\/\/en.wikipedia.org\/wiki\/File:Amelia_earhart.jpeg<\/a>) in Public Domain (<a href=\"http:\/\/en.wikipedia.org\/wiki\/Public_domain\">http:\/\/en.wikipedia.org\/wiki\/Public_domain<\/a>). 
Ralph Lauren Photoshoot (<a href=\"https:\/\/www.flickr.com\/photos\/brandoncwarren\/2964734674\/\">https:\/\/www.flickr.com\/photos\/brandoncwarren\/2964734674\/<\/a>) by Brandon Warren used under CC BY-NC 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/<\/a>). Wild Hair (<a href=\"http:\/\/en.wikipedia.org\/wiki\/File:Wild_hair.jpg\">http:\/\/en.wikipedia.org\/wiki\/File:Wild_hair.jpg<\/a>) by peter klashorst used under CC BY 2.0 (<a href=\"http:\/\/creativecommons.org\/licenses\/by\/2.0\/deed.en\">http:\/\/creativecommons.org\/licenses\/by\/2.0\/deed.en<\/a>)[\/caption]\n\nOne determinant of which schemas are likely to be used in social judgment is the extent to which we attend to particular features of the person or situation that we are responding to. We are more likely to judge people on the basis of salient characteristics, those that attract our attention when we see someone with them. For example, things that are unusual, negative, colorful, bright, and moving are more salient and thus more likely to be attended to than are things that do not have these characteristics (McArthur &amp; Post, 1977; Taylor &amp; Fiske, 1978).\n\nWe are more likely to initially judge people on the basis of their sex, race, age, and physical attractiveness, rather than on, say, their religious orientation or their political beliefs, in part because these features are so salient when we see them (Brewer, 1988).\u00a0Another thing that makes something particularly salient is its infrequency or unusualness. If Bianca is from Italy and very few other people in our community are, that characteristic is something that we notice, it is salient, and we are therefore likely to attend to it. 
That she is also a woman is, at least in this context, less salient.\n<div>\n\nThe salience of the stimuli in our social worlds may sometimes lead us to make judgments on the basis of information that is actually less informative than is other less salient information. Imagine, for instance, that you wanted to buy a new smartphone for yourself. You\u2019ve been trying to decide whether to get the iPhone or a rival product. You went online and checked out the reviews, and you found that although the phones differed on many dimensions, including price, battery life, and so forth, the rival product was nevertheless rated significantly higher by the owners than was the iPhone. As a result, you decide to go and purchase one the next day. That night, however, you go to a party, and a friend of yours shows you her iPhone. You check it out, and it seems really great. You tell her that you were thinking of buying a rival product, and she tells you that you are crazy. She says she knows someone who had one and had a lot of\u00a0<span style=\"line-height: 1.5em\">problems\u2014it didn\u2019t download music properly, the battery died right after the warranty was up, and so forth, and that she would never buy one. Would you still buy it, or would you switch your plans? \u00a0<\/span>\n\n<\/div>\n<span style=\"line-height: 1.5em\">If you think about this question logically, the information that you just got from your friend isn\u2019t really all that important; you now know the opinions of one more person, but that can\u2019t really change the overall consumer ratings of the two phones very much. On the other hand, the information your friend gives you and the chance to use her iPhone are highly salient. The information is right there in front of you, in your hand, whereas the statistical information from reviews<\/span><span style=\"line-height: 1.5em\">\u00a0is only in the form of a table that you saw on your computer. 
The outcome in cases such as this is that people frequently ignore the less salient, but more important, information, such as <\/span><em>the likelihood that events occur across a large population<\/em><em style=\"line-height: 1.5em\">,\u00a0<\/em><span style=\"line-height: 1.5em\">known as <strong>base rates<em>,\u00a0<\/em><\/strong>in favor of the actually less important, but nevertheless more salient, information<em>.<\/em><\/span>\n\nAnother case in which we ignore base-rate information occurs when we use the <strong>representativeness heuristic, <\/strong>which occurs <em>when we base our judgments on information that seems to represent, or match, what we expect will happen, while ignoring more informative base-rate information.<\/em> Consider, for instance, the following puzzle. Let\u2019s say that you went to a hospital this week, and you checked the records of the babies that were born on that day (<a href=\"#Table2-2\">Table 2.2, \"Using the Representativeness Heuristic\"<\/a>). 
Which pattern of births do you think that you are most likely to find?\n<a id=\"Table2-2\"\/>\nTable 2.2 Using the Representativeness Heuristic\n<table><thead><tr><th>\n<h2>List A<\/h2>\n<\/th>\n<th\/>\n<th>\n<h2>List B<\/h2>\n<\/th>\n<\/tr><\/thead><tbody><tr><td>6:31 a.m.<\/td>\n<td>Girl<\/td>\n<td>6:31 a.m.<\/td>\n<td>Boy<\/td>\n<td\/>\n<\/tr><tr><td>8:15 a.m.<\/td>\n<td>Girl<\/td>\n<td>8:15 a.m.<\/td>\n<td>Girl<\/td>\n<td\/>\n<\/tr><tr><td>9:42 a.m.<\/td>\n<td>Girl<\/td>\n<td>9:42 a.m.<\/td>\n<td>Boy<\/td>\n<td\/>\n<\/tr><tr><td>1:13 p.m.<\/td>\n<td>Girl<\/td>\n<td>1:13 p.m.<\/td>\n<td>Girl<\/td>\n<td\/>\n<\/tr><tr><td>3:39 p.m.<\/td>\n<td>Boy<\/td>\n<td>3:39 p.m.<\/td>\n<td>Girl<\/td>\n<td\/>\n<\/tr><tr><td>5:12 p.m.<\/td>\n<td>Boy<\/td>\n<td>5:12 p.m.<\/td>\n<td>Boy<\/td>\n<td\/>\n<\/tr><tr><td>7:42 p.m.<\/td>\n<td>Boy<\/td>\n<td>7:42 p.m.<\/td>\n<td>Girl<\/td>\n<td\/>\n<\/tr><tr><td>11:44 p.m.<\/td>\n<td>Boy<\/td>\n<td>11:44 p.m.<\/td>\n<td>Boy<\/td>\n<td\/>\n<\/tr><\/tbody><\/table>\nMost people think that List B is more likely, probably because it\u00a0looks more random and thus matches (is \u201crepresentative of\u201d) our ideas about randomness. But statisticians know that any pattern of four girls and four boys is equally likely and thus that List B is no more likely than List A. The problem is that we have an image of what randomness should be, which doesn\u2019t always match what is rationally the case. Similarly, people who see a coin that comes up heads five times in a row will frequently predict (and perhaps even bet!) that tails will be next\u2014it just seems like it has to be. 
But mathematically, this erroneous expectation (known as the gambler\u2019s fallacy) is simply not true: the base-rate likelihood of any single coin flip being tails is only 50%, regardless of how many times it has come up heads in the past.\n\nTo take one more example, consider the following information:\n\nI have a friend who is analytical, argumentative, and involved in community activism. Which of the following is she? (Choose one.)\n\n\u2014A lawyer\n\n\u2014A salesperson\n\nCan you see how you might be led, potentially incorrectly, into thinking that my friend is a lawyer? Why? The description (\u201canalytical, argumentative, and involved in community activism\u201d) just seems more representative or stereotypical of our expectations about lawyers than of salespeople. But the base rates tell us something completely different, which should make us wary of that conclusion. Simply put, the number of salespeople greatly outweighs the number of lawyers in society, and thus statistically it is far more likely that she is a salesperson. Nevertheless, the representativeness heuristic will often cause us to overlook such important information. One unfortunate consequence of this is that it can contribute to the maintenance of stereotypes. If someone you meet seems, superficially at least, to represent the stereotypical characteristics of a social group, you may incorrectly classify that person\u00a0as a member of that group, even when it is highly likely that he or she is\u00a0not.\n<h2>Cognitive Accessibility<\/h2>\nAlthough the characteristics that we use to think about objects or people are determined in part by their salience, individual differences in the person who is doing the judging are also important. People vary in the type of schemas that they tend to use when judging others and when thinking about themselves. One way to consider this is in terms of the <strong>cognitive accessibility<\/strong> of the schema. 
Cognitive accessibility refers to<em> the extent to which a schema is activated in memory and thus likely to be used in information processing.<\/em> Simply put, the schemas we typically use are often those that are most accessible to us.\n\nYou probably know people who are football nuts (or maybe tennis or some other sport nuts). All they can talk about is football. For them, we would say that football is a highly accessible construct. Because they love football, it is important to their self-concept; they set many of their goals in terms of the sport, and they tend to think about things and people in terms of it (\u201cIf he plays or watches football, he must be okay!\u201d). Other people have highly accessible schemas about eating healthy food, exercising, environmental issues, or really good coffee, for instance. In short, when a schema is accessible, we are likely to use it to make judgments of ourselves and others.\n\nAlthough accessibility can be considered a person variable (a given idea is more highly accessible for some people than for others), accessibility can also be influenced by situational factors. When we have recently or frequently thought about a given topic, that topic becomes more accessible and is likely to influence our judgments. This is in fact a potential explanation for the results of the priming study you read about earlier\u2014people walked more slowly because the concept of the elderly had been primed and thus was currently highly accessible for them.\n\nBecause we rely so heavily on our schemas and attitudes, and particularly on those that are salient and accessible, we can sometimes be overly influenced by them. Imagine, for instance, that I asked you to close your eyes and determine whether there are more words in the English language that begin with the letter <em>R<\/em> or that have the letter <em>R<\/em> as the third letter. You would probably try to solve this problem by thinking of words that have each of the characteristics. 
It turns out that most people think there are more words that begin with <em>R<\/em>, even though there are in fact more words that have <em>R<\/em> as the third letter.\n\nYou can see that this error can occur as a result of cognitive accessibility. To answer the question, we naturally try to think of all the words that we know that begin with <em>R<\/em> and that have <em>R<\/em> in the third position. The problem is that when we do that, it is much easier to retrieve the former than the latter, because we store words by their first, not by their third, letter. We may also think that our friends are nice people because we see them primarily when they are around us (their friends). And the traffic might seem worse in our own neighborhood than we think it is in other places, in part because nearby traffic jams are more accessible for us than are traffic jams that occur somewhere else. And do you think it is more likely that you will be killed in a plane crash or in a car crash? Many people fear the former, even though the latter is much more likely: statistically, your chances of being killed in an aircraft accident are far lower than your chances of being killed in an automobile accident. In this case, the problem is that plane crashes, which are highly salient, are more easily retrieved from our memory than are car crashes, which often receive far less media coverage.\n\n<em>The tendency to make judgments of the frequency of an event, or the likelihood that an event will occur, on the basis of the ease with which the event can be retrieved from memory <\/em>is known as the <strong>availability heuristic<\/strong> (Schwarz &amp; Vaughn, 2002; Tversky &amp; Kahneman, 1973).\u00a0The idea is that things that are highly accessible (in this case, the term <em>availability<\/em> is used) come to mind easily and thus may overly influence our judgments. Thus, despite the clear facts, it may be easier to think of plane crashes than of car crashes because the former are more accessible. 
If so, the availability heuristic can lead to errors in judgments.\n\nFor example, as people tend to overestimate the risk of rare but dramatic events, including plane crashes and terrorist attacks, their responses to these estimations may not always be proportionate to the true risks. For instance, it has been widely documented that fewer\u00a0people chose to use air travel in the aftermath of the\u00a0September 11, 2001 (9\/11), terrorist attacks on the World Trade Center, particularly in the United States. Correspondingly, many individuals chose other methods of travel, often electing to drive rather than fly to their destination. Statistics across all regions of the world confirm that driving is far more dangerous than flying, and this prompted the cognitive psychologist Gerd Gigerenzer to estimate how many extra deaths the increased road traffic following 9\/11 might have caused. He arrived at an estimate of around an additional 1,500 road deaths in the United States alone in the year following those terrorist attacks, which was six times the number of people killed on the airplanes on September 11, 2001 (Gigerenzer, 2006).\n\nAnother way that the cognitive accessibility of constructs can influence information processing is through their effects on <strong>processing fluency<\/strong>. Processing fluency refers to<em> the ease with which we can process information in our environments.<\/em> When stimuli are highly accessible, they can be quickly attended to and processed, and they therefore have a large influence on our perceptions. 
This influence is due, in part, to the fact that we often react positively to information that we can process quickly, and we use this positive response as a basis of judgment (Reber, Winkielman, &amp; Schwarz, 1998; Winkielman &amp; Cacioppo, 2001).\n\nIn one study demonstrating this effect, Norbert Schwarz and his colleagues (Schwarz et al., 1991)\u00a0asked one set of college students to list six\u00a0occasions when they had acted either assertively or unassertively, and asked another set of college students to list 12 such examples. Schwarz determined that for most students, it was pretty easy to list six examples but pretty hard to list 12.\n\nThe researchers then asked the participants to indicate how assertive or unassertive they actually were. You can see from <a href=\"#Figure2-10\">Figure 2.10, \"Processing Fluency,\"<\/a> that the ease of processing influenced judgments. The participants who had an easy time listing examples of their behavior (because they only had to list six instances) judged that they did in fact have the characteristics they were asked about (either assertive or unassertive), in comparison with the participants who had a harder time doing the task (because they had to list 12 instances). Other research has found similar effects\u2014people report that they ride their bicycles more often after they have been asked to recall only a few rather than many instances of doing so (Aarts &amp; Dijksterhuis, 1999),\u00a0and they hold an attitude with more confidence after being asked to generate few rather than many arguments that support it (Haddock, Rothman, Reber, &amp; Schwarz, 1999). 
Sometimes less really is more!\n<a id=\"Figure2-10\"\/>\n\n[caption id=\"attachment_2661\" align=\"alignnone\" width=\"350\"]<a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-10.png\"><img src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165233\/Figure-2-10.png\" alt=\"Processing Fluency\" class=\"wp-image-2661\" height=\"140\" width=\"350\"\/><\/a> Figure 2.10 Processing Fluency. When it was relatively easy to complete the questionnaire (only six examples were required), the student participants rated that they had more of the trait than when the task was more difficult (12 answers were required). Data are from Schwarz et al. (1991).[\/caption]\n\n\u00a0\n\nEchoing the findings mentioned earlier in relation to schemas,\u00a0we are likely to use this type of quick and \u201cintuitive\u201d processing, based on our feelings about how easy it is to complete a task, when we don\u2019t have much time or energy for more in-depth processing, such as when we are under time pressure, tired, or unwilling to process the stimulus in sufficient detail. Of course, it is very adaptive to respond to stimuli quickly (Sloman, 2002; Stanovich &amp; West, 2002; Winkielman, Schwarz, &amp; Nowak, 2002),\u00a0and it is not impossible that in at least some cases, we are better off making decisions based on our initial responses than on a more thoughtful cognitive analysis (Loewenstein, Weber, Hsee, &amp; Welch, 2001).\u00a0For instance, Dijksterhuis, Bos, Nordgren, and van Baaren (2006)\u00a0found that when participants were given tasks requiring decisions that were very difficult to make on the basis of a cognitive analysis of the problem, they made better decisions when they didn\u2019t try to analyze the details carefully but simply relied on their intuitions.\n\nIn sum, people are influenced not only by the information they get but also by how they get it. 
We are more highly influenced by things that are salient and accessible and thus easily attended to, remembered, and processed. On the other hand, information that is harder to access from memory, is less likely to be attended to, or takes more effort to consider is less likely to be used in our judgments, even if this information is statistically more informative.\n<h2>The False Consensus Bias Makes Us Think That Others Are More Like Us Than They Really Are<\/h2>\nThe tendency to base our judgments on the accessibility of social constructs can lead to still other errors in judgment.\u00a0One such error is known as the <strong>false consensus bias<\/strong>,<em>\u00a0the tendency to overestimate the extent to which other people hold similar views to our own<\/em>. As our own beliefs are highly accessible to us, we tend to rely on them too heavily when asked to predict those of others. For instance, if you are in favor of abortion rights and opposed to capital punishment, then you are likely to think that most other people share these beliefs (Ross, Greene, &amp; House, 1977).\u00a0In one demonstration of the false consensus bias, Joachim Krueger and his colleagues (Krueger &amp; Clement, 1994)\u00a0gave their research participants, who were college students, a personality test. Then they asked the same participants to estimate the percentage of other students in their school who would have answered the questions the same way that they did. The students who agreed with the items often thought that others would agree with them too, whereas the students who disagreed typically believed that others would also disagree. 
A bias closely related to the false consensus effect is the\u00a0<strong>projection bias<\/strong><em>, <\/em>which is\u00a0<em>the tendency to assume that others share our cognitive and affective states<\/em>\u00a0(Hsee, Hastie, &amp; Chen, 2008).\n\nIn regard to our chapter case study, the false consensus effect has also been implicated in the potential causes of the 2008 financial collapse. Considering investor behavior within its social context, an important part of sound decision making is the ability to predict other investors' intentions and behaviors, as this helps to foresee potential market trends. In this context, Egan, Merkle, and Weber (in press) outline how the false consensus effect can lead investors to overestimate the extent to which other investors share their judgments about the likely trends, which can in turn lead them to make inaccurate predictions of their behavior, with dire economic consequences.\n\nAlthough it is commonly observed, the false consensus bias does not occur on all dimensions. Specifically, the false consensus bias is not usually observed on judgments of positive personal traits that we value highly. People (falsely, of course) report that they have better personalities (e.g., a better sense of humor), that they engage in better behaviors (e.g., they are more likely to wear seatbelts), and that they have brighter futures than almost everyone else (Chambers, 2008).\u00a0These results suggest that although in most cases we assume that we are similar to others, in cases of valued personal characteristics the goals of self-concern lead us to see ourselves more positively than we see the average person. 
There are some important cultural differences here, though, with members of collectivist cultures typically showing less of this type of self-enhancing bias than those from individualistic cultures (Heine, Lehman, Markus, &amp; Kitayama, 1999).\n<h2>Perceptions of What \u201cMight Have Been\u201d Lead to Counterfactual Thinking<\/h2>\nIn addition to influencing our judgments about ourselves and others, the salience and accessibility of information can have an important effect on our own emotions and self-esteem. Our emotional reactions to events are often colored not only by what did happen but also by what <em>might have<\/em> happened. If we can easily imagine an outcome that is better than what actually happened, then we may experience sadness and disappointment; on the other hand, if we can easily imagine that a result might have been worse than what actually happened, we may be more likely to experience happiness and satisfaction. <em>The tendency to think about events according to what might have been <\/em>is known as <strong>counterfactual thinking<\/strong> (Roese, 1997).\n\nImagine, for instance, that you were participating in an important contest, and you won the silver medal. How would you feel? Certainly you would be happy that you won, but wouldn\u2019t you probably also be thinking a lot about what might have happened if you had been just a little bit better\u2014you might have won the gold medal! On the other hand, how might you feel if you won the bronze medal (third place)? If you were thinking about the counterfactual (the \u201cwhat might have been\u201d), perhaps the idea of not getting any medal at all would have been highly accessible and so you\u2019d be happy that you got the medal you did get.\n\nMedvec, Madey, and Gilovich (1995)\u00a0investigated exactly this idea by videotaping the responses of athletes who won medals in the 1992 summer Olympic Games. 
They videotaped the athletes both as they learned that they had won a silver or a bronze medal and again as they were awarded the medal. Then they showed these videos, without any sound, to people who did not know which medal each athlete had won. The raters indicated how they thought the athlete was feeling, on a range from \u201cagony\u201d to \u201cecstasy.\u201d The results showed that the bronze medalists did indeed seem to be, on average, happier than were the silver medalists. Then, in a follow-up study, raters watched interviews with many of these same athletes as they talked about their performance. What the athletes said was just what we would expect on the basis of counterfactual thinking. The silver medalists often talked about their disappointment in having finished second rather than first, whereas the bronze medalists tended to focus on how happy they were to have finished third rather than fourth.\n\n[caption id=\"attachment_2662\" align=\"alignnone\" width=\"400\"]<a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-11.png\"><img src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165238\/Figure-2-11.png\" alt=\"Olympic Medalists\" class=\"wp-image-2662\" height=\"265\" width=\"400\"\/><\/a> Figure 2.11 Does the bronze medalist look happier to you than the silver medalist? 
Medvec, Madey, and Gilovich (1995) found that, on average, bronze medalists were happier than silver medalists.<br\/> Source: Tina Maze Andrea Fischbacher and Lindsey Vonn by Duncan Rawlinson (<a href=\"https:\/\/www.flickr.com\/photos\/44124400268@N01\/4374497787\">https:\/\/www.flickr.com\/photos\/44124400268@N01\/4374497787<\/a>) used under CC BY-NC 2.0 license (<a href=\"https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/<\/a>)[\/caption]\n\nCounterfactual thinking seems to be part of the human condition and has even been studied in numerous other social settings, including juries. For example, people who were asked to award monetary damages to others who had been injured in an accident offered substantially more in compensation when the accident had almost been avoided than when it seemed more inevitable (Miller, Turnbull, &amp; McFarland, 1988).\n\nAgain, the moral of the story regarding the importance of cognitive accessibility is clear\u2014in the case of counterfactual thinking, the accessibility of the potential alternative outcome can lead to some seemingly paradoxical effects.\n<h2>Anchoring and Adjustment Lead Us to Accept Ideas That We Should Revise<\/h2>\nIn some cases, we may be aware of the danger of acting on our expectations and attempt to adjust for them. Perhaps you have been in a situation where you are beginning a course with a new professor and you know that a good friend of yours does not like him. You may be thinking that you want to go beyond your negative expectation and prevent this knowledge from biasing your judgment. However,<em> the accessibility of the initial information frequently prevents this adjustment from occurring\u2014leading us to weight initial information\u00a0too heavily and thereby insufficiently move our judgment away from it. 
<\/em>This is called the problem of <strong>anchoring and adjustment<\/strong>.\n\nTversky and Kahneman (1974)\u00a0asked some of the student participants in one of their studies of anchoring and adjustment to solve this multiplication problem quickly and without using a calculator:\n\n1 \u00d7 2 \u00d7 3 \u00d7 4 \u00d7 5 \u00d7 6 \u00d7 7 \u00d7 8\n\nThey asked other participants to solve this problem:\n\n8 \u00d7 7 \u00d7 6 \u00d7 5 \u00d7 4 \u00d7 3 \u00d7 2 \u00d7 1\n\nThey found that students who saw the first problem gave an estimated answer of about 512, whereas the students who saw the second problem estimated about 2,250. Tversky and Kahneman argued that the students couldn\u2019t solve the whole problem in their heads, so they did the first few multiplications and then used the outcome of this preliminary calculation as their starting point, or anchor. Then the participants used their starting estimate to find an answer that sounded plausible. In both cases, the estimates were too low relative to the true value of the product (which is 40,320)\u2014but the first set of guesses was even lower because it started from a lower anchor.\n\nInterestingly, the tendency to anchor on initial information seems to be sufficiently strong that in some cases, people will do so even when the anchor is clearly irrelevant to the task at hand. For example, Ariely, Loewenstein, and Prelec (2003)\u00a0asked students to bid on items in an auction after having noted the last two digits of their social security numbers. They then asked the students to generate and write down a hypothetical price for each of the auction items, based on these numbers. If the last two digits were 11, then the bottle of wine, for example, was priced at $11. If the two numbers were 88, the textbook was $88. After they wrote down this initial, arbitrary price, they then had to bid for the item. People with high numbers bid up to 346% more than those with low ones! 
Ariely, reflecting further on these findings, concluded that the \u201cSocial security numbers were the anchor in this experiment only because we requested them. We could have just as well asked for the current temperature or the manufacturer\u2019s suggested retail price. Any question, in fact, would have created the anchor. Does that seem rational? Of course not\u201d (2008, p. 26). A rather startling conclusion from the effect of arbitrary, irrelevant anchors on our judgments is that we will often grab hold of any available information to guide our judgments, regardless of whether it is actually germane to the issue.\n\nOf course, savvy marketers have long used the anchoring phenomenon to help them. You might not be surprised to hear that people buy more of a product when it is listed as four for $1.00 than when it is listed as $0.25 each (leading people to anchor on the four and perhaps adjust only a bit away).\u00a0And it is no accident that a car salesperson always starts negotiating with a high price and then works down. The salesperson is trying to get the consumer anchored on the high price, with the hope that it will have a big influence on the final sale price.\n<h2>Overconfidence<\/h2>\nStill another potential judgmental bias, and one that has powerful and often negative effects on our judgments, is the <strong>overconfidence bias<\/strong>, <em>a\u00a0tendency to have more confidence in our own skills, abilities, and judgments than is objectively warranted<\/em>. We often have little awareness of our own limitations, leading us to act as if we are more certain about things than we should be, particularly on tasks that are difficult. Adams and Adams (1960)\u00a0found that for words that were difficult to spell, people were correct in spelling them only about 80% of the time, even though they indicated that they were \u201c100% certain\u201d that they were correct. 
David Dunning and his colleagues (Dunning, Griffin, Milojkovic, &amp; Ross, 1990)\u00a0asked college students to predict how another student would react in various situations. Some participants made predictions about a fellow student whom they had just met and interviewed, and others made predictions about their roommates. In both cases, participants reported their confidence in each prediction, and accuracy was determined by the responses of the target persons themselves. The results were clear: regardless of whether they judged a stranger or a roommate, the students consistently overestimated the accuracy of their own predictions (<a href=\"#Figure2-12\">Figure <\/a><span style=\"text-decoration: underline\">2.12<\/span>).\n<a id=\"Figure2-12\"\/>\n\n[caption id=\"attachment_2663\" align=\"alignnone\" width=\"400\"]<a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-12.png\"><img src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165240\/Figure-2-12.png\" alt=\"Overconfidence\" class=\"wp-image-2663\" height=\"159\" width=\"400\"\/><\/a> Figure \u00a02.12 Dunning and colleagues\u00a0(1990) found that, regardless of whether they were judging strangers or their roommates, students were overconfident. The percentage confidence that they assigned to their own predictions was significantly higher than the actual percentage of their predictions that were correct.[\/caption]\n\nMaking matters even worse, Kruger and Dunning (1999)\u00a0found that people who scored low rather than high on tests of spelling, logic, grammar, and humor appreciation were also most likely to show overconfidence by overestimating how well they would do. 
Apparently, poor performers are doubly cursed\u2014they are not only unable to accurately assess their own skills but are also the most unaware that they can\u2019t do so (Dunning, Johnson, Ehrlinger, &amp; Kruger, 2003).\n\nThe tendency to be overconfident in our judgments can have some very negative effects. When eyewitnesses testify in courtrooms regarding their memories of a crime, they often are completely sure that they are identifying the right person. But their confidence doesn\u2019t correlate much with their actual accuracy. This is, in part, why so many people have been wrongfully convicted on the basis of inaccurate eyewitness testimony given by overconfident witnesses (Wells &amp; Olson, 2003). Overconfidence can also spill over into professional judgments, for example, in clinical psychology (Oskamp, 1965) and in market investment and trading (Chen, Kim, Nofsinger, &amp; Rui, 2007). Indeed, with regard to our case study at the start of this chapter, the role of the overconfidence bias in the financial crisis of 2008 and its aftermath has been well documented (Abbes, 2012).\n\nThis overconfidence also often seems to apply to social judgments about the future in general. A pervasive\u00a0<strong>optimistic bias<\/strong>, <em>a tendency to believe that positive outcomes are more likely to happen than negative ones, particularly in relation to ourselves versus others<\/em>, has been noted in members of many cultures (Sharot, 2011). Importantly, this optimism is often unwarranted. Most people, for example, underestimate their risk of experiencing negative events like divorce and illness, and overestimate the likelihood of positive ones, including gaining a promotion at work or living to a ripe old age (Schwarzer, 1994). There is, however, some evidence of diversity in optimism across different groups. 
People in collectivist cultures tend not to show this bias to the same extent as those living in individualistic ones (Chang, Asakawa, &amp; Sanna, 2001). Moreover, individuals who have clinical depression have been shown to evidence a phenomenon termed\u00a0<strong>depressive realism<\/strong>, whereby their<em> social judgments about the future are less positively skewed and often more accurate than those of people who do not have depression<\/em> (Moore &amp; Fresco, 2012).\n\nThe optimistic bias can also extend into the <strong>planning fallacy<\/strong>,\u00a0defined as <em>a tendency to\u00a0overestimate the amount that we can accomplish over a particular time frame.<\/em>\u00a0This fallacy can also entail the underestimation of the resources and costs involved in completing a task or project, as anyone who has attempted to budget for home renovations can probably attest. Everyday examples of the planning fallacy abound, in everything from the completion of course assignments to the construction of new buildings. On a grander scale, newsworthy items in any country hosting a major sporting event, such as the Olympics or the soccer World Cup, always seem to include spiralling budgets and overrunning timelines as the event approaches.\n\nWhy is the planning fallacy so persistent? Several factors appear to be at work here. Buehler, Griffin, and Peetz (2010) argue that when planning projects, individuals orient to the future and pay too little attention to their past relevant experiences. This can cause them to overlook previous occasions where they experienced difficulties and overruns. They also tend to plan only for the time and resources likely to be needed if things run as planned. That is, they do not spend enough time thinking about all the things that might go wrong\u2014for example, all the unforeseen demands on their time and resources that may occur during the completion of the task. 
Worryingly, the planning fallacy seems to be even stronger for tasks where we are highly motivated and invested in timely completion. It appears that wishful thinking is often at work here (Buehler et al., 2010).\u00a0For some further perspectives on the advantages and disadvantages of the optimism bias, see this engaging TED Talk by Tali Sharot at:\u00a0<a target=\"_blank\" href=\"http:\/\/www.ted.com\/talks\/tali_sharot_the_optimism_bias\">http:\/\/www.ted.com\/talks\/tali_sharot_the_optimism_bias<\/a>\n\nIf these biases related to overconfidence at least sometimes lead us to inaccurate social judgments, a key question is why they are so pervasive. What functions do they serve? One possibility is that they help to enhance people's motivation and self-esteem levels. If we have a positive view of our abilities and judgments, and are confident that we can execute tasks to deadlines, we will be more likely to attempt challenging projects and to put ourselves forward for demanding opportunities. Moreover, there is consistent evidence that a mild degree of optimism can predict a range of positive outcomes, including success and even physical health (Forgeard &amp; Seligman, 2012).\n<h2>The Importance of Cognitive Biases in Everyday Life<\/h2>\nIn our review of some of the many cognitive biases that affect our social judgment, we have seen that the effects on us as individuals range from fairly trivial decisions, for example, which phone to buy (which perhaps\u00a0doesn't seem so trivial at the time), to potentially life-and-death decisions (about methods of travel, for instance).\n\nHowever, when we consider that many of these errors will not only affect us but also everyone around us, then their consequences can really add up. Why would so many people continue to buy lottery tickets or to gamble their money in casinos when the likelihood of their ever winning is so low? 
One possibility, of course, is the representativeness heuristic\u2014people ignore the low base rates of winning and focus their attention on the salient likelihood of winning a huge prize. And the belief in astrology, which all scientific evidence suggests is not accurate, is probably driven in part by the salience of the occasions when the predictions do occur\u2014when a horoscope is correct (which it will of course sometimes be), the correct prediction is highly salient and may allow people to maintain the (overall false) belief as they recollect confirming evidence more readily.\n\nPeople may also take more care to prepare for unlikely events than for more likely ones because the unlikely ones are more salient or accessible. For instance, people may think that they are more likely to die from a terrorist attack or as the result of a homicide than they are from diabetes, stroke, or tuberculosis. But the odds are much greater of dying from the health problems than from terrorism or homicide. Because people don\u2019t accurately calibrate their behaviors to match the true potential risks, the individual and societal costs are quite large (Slovic, 2000).\n\nAs well as influencing our judgments relating to ourselves, salience and accessibility also color how we perceive our social worlds, which may have a big influence on our behavior. For instance, people who watch a lot of violent television shows also tend to view the world as more dangerous in comparison to those who watch less violent TV (Doob &amp; Macdonald, 1979).\u00a0This follows from the idea that our judgments are based on the accessibility of relevant constructs. We also overestimate our contribution to joint projects (Ross &amp; Sicoly, 1979),\u00a0perhaps in part because our own contributions are so obvious and salient, whereas the contributions of others are much less so. And the use of cognitive heuristics can even affect our views about global warming. 
Joireman, Barnes, Truelove, and Duell (2010)\u00a0found that people were more likely to believe in the existence of global warming when they were asked about it on hotter rather than colder days and when they had first been primed with words relating to heat. Thus the principles of salience and accessibility, because they are such an important part of our social judgments, can create a series of biases that can make a difference on a truly global level.\n\nAs we have already seen specifically in relation to overconfidence, research has found that even people who should know better\u2014and who need to know better\u2014are subject to cognitive biases in general. Economists, stock traders, managers, lawyers, and even doctors have been found to make the same kinds of mistakes in their professional activities that people make in their everyday lives (Byrne &amp; McEleney, 2000; Gilovich, Griffin, &amp; Kahneman, 2002; Hilton, 2001).\u00a0And the use of cognitive heuristics is increased when people are under time pressure (Kruglanski &amp; Freund, 1983)\u00a0or when they feel threatened (Kassam, Koslov, &amp; Mendes, 2009),\u00a0exactly the situations that\u00a0often occur when professionals are required to make their decisions.\n<h2>Biased About Our Biases: The Bias Blind Spot<\/h2>\nSo far, we have discussed some of the most important and heavily researched social cognitive biases that affect our appraisals of ourselves in relation to our social worlds and noted some of their key limitations. Recently, some social psychologists have become interested in how aware we are of these biases and the ways in which they can affect our own and others' thinking. The short answer to this is that we often underestimate the extent to which our social cognition is biased, and that we typically (incorrectly) believe that we are less biased than the average person. 
Researchers have named this<em> tendency to believe that our own judgments are less susceptible to the influence of bias than those of others <\/em>the <strong>bias blind spot<\/strong> (Ehrlinger, Gilovich, &amp; Ross, 2005). Interestingly, the level of bias blind spot that people demonstrate is unrelated to the actual amount of bias they show in their social judgments (West, Meserve, &amp; Stanovich, 2012). Moreover, those scoring higher in cognitive ability actually tend to show a larger bias blind spot (West et al., 2012).\n\nSo, if our social cognition appears to be riddled with multiple biases, and we tend to show biases about these biases, what hope is there for us in reaching sound social judgments? Before we arrive at such a pessimistic conclusion, it is important to redress the balance of evidence a little. Perhaps just learning more about these biases, as we have done in this chapter, can help us to recognize when they are likely to be useful to our social judgments, and to take steps to reduce their effects when they hinder our understanding of our social worlds. Although many of the biases discussed tend to persist even in the face of our awareness, learning about them could at the very least be an important first step toward reducing their unhelpful effects on our social cognition. In order to get reliably better at policing our biases, though, we probably need to go further. One of the world's foremost authorities on social cognitive biases, Nobel Laureate Daniel Kahneman, certainly thinks so. He argues that individual awareness of biases is an important precursor to the development of a common vocabulary about them that will then make us better able as communities to discuss their effects on our social judgments (Kahneman, 2011). 
Kahneman also asserts that we may be more likely to recognize and challenge bias in each other's thinking than in our own, an observation that certainly fits with the concept of the bias blind spot. Perhaps, even if we cannot effectively police our thinking on our own, we can help to police one another's.\n\nThese arguments are consistent with some evidence that, although mere awareness is rarely enough to significantly attenuate the effects of bias, it can be helpful when accompanied by systematic cognitive retraining.\u00a0Many social psychologists and other scientists are working to help people make better decisions. One possibility is to provide people with better feedback. Weather forecasters, for instance, are quite accurate in their decisions (at least in the short term), in part because they are able to learn from the clear feedback that they get about the accuracy of their predictions. Other research has found that accessibility biases can be reduced by leading people to consider multiple alternatives rather than focusing only on the most obvious ones, and by encouraging people to think about outcomes exactly opposite to the ones they are expecting (Hirt, Kardes, &amp; Markman, 2004).\u00a0And certain educational experiences can help people to make better decisions. For instance, Lehman, Lempert, and Nisbett (1988)\u00a0found that graduate students in medicine, law, and chemistry, and particularly those in psychology, all showed significant improvement in their ability to reason correctly over the course of their graduate training.\n\nAnother source for some optimism about the accuracy of our social cognition is that these heuristics and biases can, despite their limitations, often lead us to a broadly accurate understanding of the situations we encounter. 
Although we do have limited cognitive abilities, information, and time when making social judgments, that does not mean we cannot and do not make enough sense of our social worlds in order to function effectively in our daily lives. Indeed, some researchers, including Cosmides and Tooby (2000) and Gigerenzer (2004), have argued that these biases and heuristics have been sculpted by evolutionary forces to offer fast and frugal ways of reaching sound judgments about our infinitely complex social worlds enough of the time to have adaptive value. If, for example, you were asked to say which Spanish city had a larger population, Madrid or Valencia, the chances are you would quickly answer that Madrid was bigger, even if you did not know the relevant population figures. Why? Perhaps the availability heuristic and cognitive accessibility had something to do with it\u2014the chances are that most people have just heard more about Madrid in the global media over the years, and they can more readily bring these instances to mind. From there, it is a short leap to the general rule that larger cities tend to get more media coverage. So, although our journeys to our social judgments may not always be pretty, at least we often arrive at the right destination.\n<h3>Social Psychology in the Public Interest<\/h3>\nThe Validity of Eyewitness Testimony\n\nOne social situation in which the accuracy of our person-perception skills is vitally important is the area of eyewitness testimony (Charman &amp; Wells, 2007; Toglia, Read, Ross, &amp; Lindsay, 2007; Wells, Memon, &amp; Penrod, 2006).\u00a0Every year, thousands of individuals are charged with and often convicted of crimes based largely on eyewitness evidence. 
In fact, many people who were convicted prior to the existence of forensic DNA have now been exonerated by DNA tests, and more than 75% of these people were victims of mistaken eyewitness identification (Wells, Memon, &amp; Penrod, 2006; Fisher, 2011).\n\nThe judgments of eyewitnesses are often incorrect, and there is only a small correlation between how accurate and how confident an eyewitness is. Witnesses are frequently overconfident, and a person who claims to be absolutely certain about his or her identification is not much more likely to be accurate than someone who appears much less sure, making it almost impossible to determine whether a particular witness is accurate or not (Wells &amp; Olson, 2003).\n\nTo accurately remember a person or an event at a later time, we must be able to accurately see and store the information in the first place, keep it in memory over time, and then accurately retrieve it later. But the social situation can influence any of these processes, causing errors and biases.\n\nIn terms of initial encoding of the memory, crimes normally occur quickly, often in situations that are accompanied by a lot of stress, distraction, and arousal. Typically, the eyewitness gets only a brief glimpse of the person committing the crime, and this may be under poor lighting conditions and from far away. And the eyewitness may not always focus on the most important aspects of the scene. Weapons are highly salient, and if a weapon is present during the crime, the eyewitness may focus on the weapon, which would draw his or her attention away from the individual committing the crime (Steblay, 1997).\u00a0In one relevant study, Loftus, Loftus, and Messo (1987)\u00a0showed people slides of a customer walking up to a bank teller and pulling out either a pistol or a checkbook. 
By tracking eye movements, the researchers determined that people were more likely to look at the gun than at the checkbook and that this reduced their ability to accurately identify the criminal in a lineup that was given later.\n\nPeople may be particularly inaccurate when they are asked to identify members of a race other than their own (Brigham, Bennett, Meissner, &amp; Mitchell, 2007).\u00a0In one field study, for example, Meissner and Brigham (2001)\u00a0sent European-American, African-American, and Hispanic students into convenience stores in El Paso, Texas. Each of the students made a purchase, and the researchers came in later to ask the clerks to identify photos of the shoppers. Results showed that the clerks demonstrated the own-race bias: they were all more accurate at identifying customers belonging to their own racial or ethnic group, which may be more salient to them, than they were at identifying people from other groups. There seems to be some truth to the adage that \u201cThey all look alike\u201d\u2014at least if an individual is looking at someone who is not of his or her own race.\n\n[caption id=\"attachment_930\" align=\"alignnone\" width=\"400\"]<img class=\"wp-image-930\" alt=\"people\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165243\/people1-1024x256.jpg\" height=\"100\" width=\"400\"\/> Figure 2.13 One source of error in eyewitness testimony is the relative difficulty of accurately identifying people who are not of one\u2019s own race.<br\/> Source: Ladakh, Hemis Shukpachan by Dietmar Temps (<a href=\"https:\/\/www.flickr.com\/photos\/deepblue66\/10607432526\">https:\/\/www.flickr.com\/photos\/deepblue66\/10607432526<\/a>) used under CC BY-NC-SA 2.0 license (<a href=\"https:\/\/creativecommons.org\/licenses\/by-nc-sa\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc-sa\/2.0\/<\/a>). 
Group Portrait by John Ragai (<a href=\"https:\/\/www.flickr.com\/photos\/johnragai\/13167551744\">https:\/\/www.flickr.com\/photos\/johnragai\/13167551744<\/a>) used under CC BY 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by\/2.0\/<\/a>). College students by Adam S (<a href=\"https:\/\/www.flickr.com\/photos\/111963716@N06\/11529206136\">https:\/\/www.flickr.com\/photos\/111963716@N06\/11529206136<\/a>) used under CC BY 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by\/2.0\/<\/a>)[\/caption]\n\nEven if information gets encoded properly, memories may become distorted over time. For one thing, people might discuss what they saw with other people, or they might read information relating to it from other bystanders or in the media. Such postevent information can distort the original memories such that the witnesses are no longer sure what the real information is and what was provided later. The problem is that the new, inaccurate information is highly cognitively accessible, whereas the older information is much less so. The reconstructive memory bias suggests that the memory may shift over time to fit the individual's current beliefs about the crime. 
Even describing a face makes it more difficult to recognize the face later (Dodson, Johnson, &amp; Schooler, 1997).\n\nIn an experiment by Loftus and Palmer (1974),\u00a0participants viewed a film of a traffic accident and then, according to random assignment to experimental conditions, answered one of three questions:\n<ol><li>About how fast were the cars going when they hit each other?<\/li>\n\t<li>About how fast were the cars going when they smashed each other?<\/li>\n\t<li>About how fast were the cars going when they contacted each other?<\/li>\n<\/ol>\nAs you can see in Figure 2.14, \"Reconstructive Memory,\" although all the participants saw the same accident, their estimates of the speed of the cars varied by condition. People who had seen the \u201csmashed\u201d question estimated the highest average speed, and those who had seen the \u201ccontacted\u201d question estimated the lowest.\n\n[caption id=\"attachment_2664\" align=\"alignnone\" width=\"400\"]<a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-14.png\"><img src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165245\/Figure-2-14.png\" alt=\"Reconstructive Memory\" class=\"wp-image-2664\" height=\"110\" width=\"400\"\/><\/a> Figure 2.14 Reconstructive Memory[\/caption]\n\nParticipants viewed a film of a traffic accident and then answered a question about the accident. According to random assignment, the blank was filled by either \u201chit,\u201d \u201csmashed,\u201d or \u201ccontacted\u201d each other. The wording of the question influenced the participants\u2019 memory of the accident. 
Data are from Loftus and Palmer (1974).\n\nThe situation is particularly problematic when the eyewitnesses are children, because research has found that children are more likely to make incorrect identifications than are adults (Pozzulo &amp; Lindsay, 1998)\u00a0and are also subject to the own-race identification bias (Pezdek, Blandon-Gitlin, &amp; Moore, 2003).\u00a0In many cases, when sex abuse charges have been filed against babysitters, teachers, religious officials, and family members, the children are the only source of evidence. The possibility that children are not accurately remembering the events that have occurred to them creates substantial problems for the legal system.\n\nAnother setting in which eyewitnesses may be inaccurate is when they try to identify suspects from mug shots or lineups. A lineup generally includes the suspect and five to seven other innocent people (the fillers), and the eyewitness must pick out the true perpetrator. The problem is that eyewitnesses typically feel pressured to pick a suspect out of the lineup, which increases the likelihood that they will mistakenly pick someone (rather than no one) as the suspect.\n\nResearch has attempted to better understand how people remember and potentially misremember the scenes of and people involved in crimes and to attempt to improve how the legal system makes use of eyewitness testimony. In many states, efforts are being made to better inform judges, juries, and lawyers about how inaccurate eyewitness testimony can be. Guidelines have also been proposed to help ensure that child witnesses are questioned in a nonbiasing way (Poole &amp; Lamb, 1998).\u00a0Steps can also be taken to ensure that lineups yield more accurate eyewitness identifications. 
Lineups are more fair when the fillers resemble the suspect, when the interviewer makes it clear that the suspect might or might not be present (Steblay, Dysart, Fulero, &amp; Lindsay, 2001),\u00a0and when the eyewitness has not been shown the same pictures in a mug-shot book prior to the lineup decision. And several recent studies have found that witnesses who make accurate identifications from a lineup reach their decision faster than do witnesses who make mistaken identifications, suggesting that authorities must take into consideration not only the response but how fast it is given (Dunning &amp; Perretta, 2002).\n\nIn addition to distorting our memories for events that have actually occurred, misinformation may lead us to falsely remember information that never occurred. Loftus and her colleagues asked parents to provide them with descriptions of events that did happen (e.g., moving to a new house) and did not happen (e.g., being lost in a shopping mall) to their children. Then (without telling the children which events were real or made up) the researchers asked the children to imagine both types of events. The children were instructed to \u201cthink really hard\u201d about whether the events had occurred (Ceci, Huffman, Smith, &amp; Loftus, 1994).\u00a0More than half of the children generated stories regarding at least one of the made-up events, and they remained insistent that the events did in fact occur even when told by the researcher that they could not possibly have occurred (Loftus &amp; Pickrell, 1995).\u00a0Even college students are susceptible to manipulations that make events that did not actually occur seem as if they did (Mazzoni, Loftus, &amp; Kirsch, 2001).\n\nThe ease with which memories can be created or implanted is particularly problematic when the events to be recalled have important consequences. 
Therapists often argue that patients may repress memories of traumatic events they experienced as children, such as childhood sexual abuse, and then recover the events years later as the therapist leads them to recall the information\u2014for instance, by using dream interpretation and hypnosis (Brown, Scheflin, &amp; Hammond, 1998).\n\nBut other researchers argue that painful memories such as sexual abuse are usually very well remembered, that few memories are actually repressed, and that even if they are, it is virtually impossible for patients to accurately retrieve them years later (McNally, Bryant, &amp; Ehlers, 2003; Pope, Poliakoff, Parker, Boynes, &amp; Hudson, 2007).\u00a0These researchers have argued that the procedures used by the therapists to \u201cretrieve\u201d the memories are more likely to actually implant false memories, leading the patients to erroneously recall events that did not actually occur. Because hundreds of people have been accused, and even imprisoned, on the basis of claims about \u201crecovered memory\u201d of child sexual abuse, the accuracy of these memories has important societal implications. Many psychologists now believe that most of these claims of recovered memories are due to implanted, rather than real, memories (Loftus &amp; Ketcham, 1994).\n\nTaken together, then, the problems of eyewitness testimony represent another example of how social cognition\u2014including the processes that we use to size up and remember other people\u2014may be influenced, sometimes in a way that creates inaccurate perceptions, by the operation of salience, cognitive accessibility, and other information-processing biases.\n\n\u00a0\n<div class=\"bcc-box bcc-success\">\n<h3>Key Takeaways<\/h3>\n<ul><li>We use our schemas and attitudes to help us judge and respond to others. 
In many cases, this is appropriate, but our expectations can also lead to biases in our judgments of ourselves and others.<\/li>\n\t<li>A good part of our social cognition is spontaneous or automatic, operating without much thought or effort. On the other hand, when we have the time and the motivation to think about things carefully, we may engage in thoughtful, controlled cognition.<\/li>\n\t<li>Which expectations we use to judge others is based on both the situational salience of the things we are judging and the cognitive accessibility of our own schemas and attitudes.<\/li>\n\t<li>Variations in the accessibility of schemas lead to biases such as the availability heuristic, the representativeness heuristic, the false consensus bias, biases caused by counterfactual thinking, and those related to overconfidence.<\/li>\n\t<li>The potential biases that are the result of everyday social cognition can have important consequences, both for us in our everyday lives and for people who make important decisions affecting many other people. Although biases are common, they are not impossible to control, and psychologists and other scientists are working to help people make better decisions.<\/li>\n\t<li>The operation of cognitive biases, including the potential for new information to distort information already in memory, can help explain the tendency for eyewitnesses to be overconfident and frequently inaccurate in their recollections of what occurred at crime scenes.<\/li>\n<\/ul><\/div>\n\u00a0\n<div class=\"bcc-box bcc-info\">\n<h3>Exercises and Critical Thinking<\/h3>\n<ol><li>Give an example of a time when you may have committed one of the cognitive heuristics and biases discussed in this chapter. What factors (e.g., availability,\u00a0salience) caused the error, and what was the outcome of your use of the shortcut or heuristic? What do you see as the general advantages and disadvantages of using this bias in your everyday life? 
Describe one possible strategy you could use to reduce the potentially harmful effects of this bias in your life.<\/li>\n\t<li>Go to the website <a href=\"http:\/\/thehothand.blogspot.com\">http:\/\/thehothand.blogspot.com<\/a>, which analyzes the extent to which people accurately perceive \u201cstreakiness\u201d in sports. Based on the information provided on this site, as well as that in this chapter, in what ways might our sports perceptions be influenced by our expectations and the use of cognitive heuristics and biases?<\/li>\n\t<li>Different cognitive heuristics and biases often operate together to influence our social cognition in particular situations. Describe a situation where you feel that two or more biases were affecting your judgment. How did they interact? What combined effects on your social cognition did they have? Which of the heuristics and biases outlined in this chapter do you think might be particularly likely to happen together in social situations and why?<\/li>\n<\/ol><\/div>\n<div class=\"textbox shaded\">\n<h3>References<\/h3>\nAarts, H., &amp; Dijksterhuis, A. (1999). How often did I do it? Experienced ease of retrieval and frequency estimates of past behavior.\u00a0<i>Acta Psychologica, 103<\/i>(1\u20132), 77\u201389.\n\nAbbes, M. B. (2012). Does overconfidence explain volatility during the global financial crisis?\u00a0<em>Transition Studies Review, 19<\/em>(3), 291\u2013312.\n\nAdams, P. A., &amp; Adams, J. K. (1960). Confidence in the recognition and reproduction of words difficult to spell.\u00a0<i>American Journal of Psychology, 73<\/i>, 544\u2013552.\n\nAriely, D. (2008).\u00a0<i>Predictably irrational: The hidden forces that shape our decisions<\/i>. New York: Harper Perennial.\n\nAriely, D., Loewenstein, G., &amp; Prelec, D. (2003). Coherent arbitrariness: Stable demand curves without stable preferences.\u00a0<em>Quarterly Journal of Economics, 118<\/em>(1), 73\u2013106.\n\nBargh, J. A., Chen, M., &amp; Burrows, L. (1996). 
Automaticity of social behavior: Direct effects of trait construct and stereotype activation on action.\u00a0<i>Journal of Personality and Social Psychology, 71<\/i>(2), 230\u2013244.\n\nBrewer, M. B. (1988). A dual process model of impression formation. In T. K. Srull &amp; R. S. Wyer (Eds.),\u00a0<i>Advances in social cognition<\/i>\u00a0(Vol. 1, pp. 1\u201336). Hillsdale, NJ: Erlbaum.\n\nBrigham, J. C., Bennett, L. B., Meissner, C. A., &amp; Mitchell, T. L. (Eds.). (2007).\u00a0<i>The influence of race on eyewitness memory<\/i>. Mahwah, NJ: Lawrence Erlbaum Associates Publishers.\n\nBrown, D., Scheflin, A. W., &amp; Hammond, D. C. (1998).\u00a0<i>Memory, trauma treatment, and the law<\/i>. New York, NY: Norton.\n\nBuehler, R., Griffin, D., &amp; Peetz, J. (2010). The planning fallacy: Cognitive, motivational, and social origins. In M. P. Zanna &amp; J. M. Olson (Eds.),\u00a0<i>Advances in experimental social psychology, Vol. 43<\/i>\u00a0(pp. 1\u201362). San Diego, CA: Academic Press.\u00a0doi:10.1016\/S0065-2601(10)43001-4\n\nByrne, R. M. J., &amp; McEleney, A. (2000). Counterfactual thinking about actions and failures to act.\u00a0<i>Journal of Experimental Psychology: Learning, Memory, and Cognition, 26<\/i>(5), 1318\u20131331.\n\nCeci, S. J., Huffman, M. L. C., Smith, E., &amp; Loftus, E. F. (1994). Repeatedly thinking about a non-event: Source misattributions among preschoolers.\u00a0<i>Consciousness and Cognition: An International Journal, 3<\/i>(3\u20134), 388\u2013407.\n\nChambers, J. R. (2008). Explaining false uniqueness: Why we are both better and worse than others.\u00a0<i>Social and Personality Psychology Compass, 2<\/i>(2), 878\u2013894.\n\nChang, E. C., Asakawa, K., &amp; Sanna, L. J. (2001). Cultural variations in optimistic and pessimistic bias: Do Easterners really expect the worst and Westerners really expect the best when predicting future life events?\u00a0<i>Journal of Personality and Social Psychology, 81<\/i>(3), 476\u2013491. 
doi:10.1037\/0022-3514.81.3.476\n\nCharman, S. D., &amp; Wells, G. L. (2007). Eyewitness lineups: Is the appearance-changes instruction a good idea?\u00a0<i>Law and Human Behavior, 31<\/i>(1), 3\u201322.\n\nChen, G., Kim, K. A., Nofsinger, J. R., &amp; Rui, O. M. (2007). Trading performance, disposition effect, overconfidence, representativeness bias, and experience of emerging market investors.\u00a0<i>Journal of Behavioral Decision Making<\/i>,\u00a0<i>20<\/i>(4), 425-451. doi:10.1002\/bdm.561\n\nCosmides, L., &amp; Tooby, J. (2000). Evolutionary psychology and the emotions. In M. Lewis &amp; J. M. Haviland-Jones (Eds.),\u00a0<em>Handbook of emotions, 2nd edition\u00a0<\/em>(pp. 91-115). New York, NY: The Guilford Press.\n\nDijksterhuis, A., Bos, M. W., Nordgren, L. F., &amp; van Baaren, R. B. (2006). On making the right choice: The deliberation-without-attention effect.\u00a0<i>Science, 311<\/i>(5763), 1005\u20131007.\n\nDodson, C. S., Johnson, M. K., &amp; Schooler, J. W. (1997). The verbal overshadowing effect: Why descriptions impair face recognition.\u00a0<i>Memory &amp; Cognition, 25<\/i>(2), 129\u2013139.\n\nDoob, A. N., &amp; Macdonald, G. E. (1979). Television viewing and fear of victimization: Is the relationship causal?\u00a0<i>Journal of Personality and Social Psychology, 37<\/i>(2), 170\u2013179.\n\nDunning, D., &amp; Perretta, S. (2002). Automaticity and eyewitness accuracy: A 10- to 12-second rule for distinguishing accurate from inaccurate positive identifications.\u00a0<i>Journal of Applied Psychology, 87<\/i>(5), 951\u2013962.\n\nDunning, D., Griffin, D. W., Milojkovic, J. D., &amp; Ross, L. (1990). The overconfidence effect in social prediction.\u00a0<i>Journal of Personality and Social Psychology, 58<\/i>(4), 568\u2013581.\n\nDunning, D., Johnson, K., Ehrlinger, J., &amp; Kruger, J. (2003). 
Why people fail to recognize their own incompetence.\u00a0<i>Current Directions in Psychological Science, 12<\/i>(3), 83\u201387.\n\nEgan, D., Merkle, C., &amp; Weber, M. (in press). Second-order beliefs and the individual investor.\u00a0<em>Journal of Economic Behavior &amp; Organization.\u00a0<\/em>\n\nEhrlinger, J.,\u00a0Gilovich, T. D., &amp; Ross, L. (2005). Peering into the bias blind spot: People\u2019s assessments of bias in themselves and others.\u00a0<em>Personality and Social Psychology Bulletin, 31,\u00a0<\/em>1\u201313.\n\nFerguson, M. J., &amp; Bargh, J. A. (2003). The constructive nature of automatic evaluation. In J. Musch &amp; K. C. Klauer (Eds.),\u00a0<i>The psychology of evaluation: Affective processes in cognition and emotion<\/i>\u00a0(pp. 169\u2013188). Mahwah, NJ: Lawrence Erlbaum Associates Publishers.\n\nFerguson, M. J., Hassin, R., &amp; Bargh, J. A. (2008). Implicit motivation: Past, present, and future. In J. Y. Shah &amp; W. L. Gardner (Eds.),\u00a0<i>Handbook of motivation science<\/i>\u00a0(pp. 150\u2013166). New York, NY: Guilford Press.\n\nFisher, R. P. (2011). Editor\u2019s introduction: Special issue on psychology and law.\u00a0<i>Current Directions in Psychological Science, 20<\/i>, 4. doi:10.1177\/0963721410397654\n\nForgeard, M. C., &amp; Seligman, M. P. (2012). Seeing the glass half full: A review of the causes and consequences of optimism.\u00a0<i>Pratiques Psychologiques<\/i>,\u00a0<i>18<\/i>(2), 107\u2013120. doi:10.1016\/j.prps.2012.02.002\n\nGigerenzer, G. (2004). Fast and frugal heuristics: The tools of bounded rationality. In D. J. Koehler &amp; N. Harvey (Eds.),\u00a0<em>Blackwell handbook of judgment and decision making\u00a0<\/em>(pp. 62\u201388). Malden, MA: Blackwell Publishing.\n\nGigerenzer, G. (2006). Out of the frying pan and into the fire: Behavioral reactions to terrorist attacks.\u00a0<em>Risk Analysis, 26,\u00a0<\/em>347\u2013351.\n\nGilovich, T., Griffin, D., &amp; Kahneman, D. (Eds.). 
(2002).\u00a0<i>Heuristics and biases: The psychology of intuitive judgment<\/i>. New York, NY: Cambridge University Press.\n\nHaddock, G., Rothman, A. J., Reber, R., &amp; Schwarz, N. (1999). Forming judgments of attitude certainty, intensity, and importance: The role of subjective experiences.\u00a0<i>Personality and Social Psychology Bulletin, 25<\/i>, 771\u2013782.\n\nHeine, S. J., Lehman, D. R., Markus, H. R., &amp; Kitayama, S. (1999). Is there a universal need for positive self-regard?\u00a0<em>Psychological Review, 106<\/em>(4), 766\u2013794. doi:10.1037\/0033-295X.106.4.766\n\nHilton, D. J. (2001). The psychology of financial decision-making: Applications to trading, dealing, and investment analysis.\u00a0<i>Journal of Behavioral Finance, 2<\/i>, 37\u201353. doi:10.1207\/S15327760JPFM0201_4\n\nHirt, E. R., Kardes, F. R., &amp; Markman, K. D. (2004). Activating a mental simulation mind-set through generation of alternatives: Implications for debiasing in related and unrelated domains.\u00a0<i>Journal of Experimental Social Psychology, 40<\/i>(3), 374\u2013383.\n\nHsee, C. K., Hastie, R., &amp; Chen, J. (2008). Hedonomics: Bridging decision research with happiness research.\u00a0<em>Perspectives on Psychological Science, 3<\/em>(3), 224\u2013243. doi:10.1111\/j.1745-6924.2008.00076.x\n\nJoireman, J., Barnes Truelove, H., &amp; Duell, B. (2010). Effect of outdoor temperature, heat primes and anchoring on belief in global warming.\u00a0<i>Journal of Environmental Psychology, 30<\/i>(4), 358\u2013367.\n\nKahneman, D. (2011).\u00a0<i>Thinking, fast and slow<\/i>. New York, NY: Farrar, Straus and Giroux.\n\nKassam, K. S., Koslov, K., &amp; Mendes, W. B. (2009). Decisions under distress: Stress profiles influence anchoring and adjustment.\u00a0<i>Psychological Science, 20<\/i>(11), 1394\u20131399.\n\nKrueger, J., &amp; Clement, R. W. (1994). 
The truly false consensus effect: An ineradicable and egocentric bias in social perception.\u00a0<i>Journal of Personality and Social Psychology, 67<\/i>(4), 596\u2013610.\n\nKruger, J., &amp; Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one\u2019s own incompetence lead to inflated self-assessments.\u00a0<i>Journal of Personality and Social Psychology, 77<\/i>(6), 1121\u20131134.\n\nKruglanski, A. W., &amp; Freund, T. (1983). The freezing and unfreezing of lay inferences: Effects on impressional primacy, ethnic stereotyping, and numerical anchoring.\u00a0<i>Journal of Experimental Social Psychology, 19,<\/i>\u00a0448\u2013468.\n\nLehman, D. R., Lempert, R. O., &amp; Nisbett, R. E. (1988). The effects of graduate training on reasoning: Formal discipline and thinking about everyday-life events.\u00a0<i>American Psychologist, 43<\/i>(6), 431\u2013442.\n\nLoewenstein, G. F., Weber, E. U., Hsee, C. K., &amp; Welch, N. (2001). Risk as feelings.\u00a0<i>Psychological Bulletin, 127<\/i>(2), 267\u2013286.\n\nLoftus, E. F., &amp; Ketcham, K. (1994).\u00a0<i>The myth of repressed memory: False memories and allegations of sexual abuse<\/i>\u00a0(1st ed.). New York, NY: St. Martin\u2019s Press.\n\nLoftus, E. F., &amp; Palmer, J. C. (1974). Reconstruction of automobile destruction: An example of the interaction between language and memory.\u00a0<i>Journal of Verbal Learning &amp; Verbal Behavior, 13<\/i>(5), 585\u2013589.\n\nLoftus, E. F., &amp; Pickrell, J. E. (1995). The formation of false memories.\u00a0<i>Psychiatric Annals, 25<\/i>(12), 720\u2013725.\n\nLoftus, E. F., Loftus, G. R., &amp; Messo, J. (1987). Some facts about \u201cweapon focus.\u201d\u00a0<i>Law and Human Behavior, 11<\/i>(1), 55\u201362.\n\nMazzoni, G. A. L., Loftus, E. F., &amp; Kirsch, I. (2001). 
Changing beliefs about implausible autobiographical events: A little plausibility goes a long way.\u00a0<i>Journal of Experimental Psychology: Applied, 7<\/i>(1), 51\u201359.\n\nMcArthur, L. Z., &amp; Post, D. L. (1977). Figural emphasis and person perception.\u00a0<i>Journal of Experimental Social Psychology, 13<\/i>(6), 520\u2013535.\n\nMcNally, R. J., Bryant, R. A., &amp; Ehlers, A. (2003). Does early psychological intervention promote recovery from posttraumatic stress?\u00a0<i>Psychological Science in the Public Interest, 4<\/i>(2), 45\u201379.\n\nMedvec, V. H., Madey, S. F., &amp; Gilovich, T. (1995). When less is more: Counterfactual thinking and satisfaction among Olympic medalists.\u00a0<i>Journal of Personality and Social Psychology, 69<\/i>(4), 603\u2013610.\n\nMeissner, C. A., &amp; Brigham, J. C. (2001). Thirty years of investigating the own-race bias in memory for faces: A meta-analytic review.\u00a0<i>Psychology, Public Policy, and Law, 7<\/i>(1), 3\u201335.\n\nMiller, D. T., Turnbull, W., &amp; McFarland, C. (1988). Particularistic and universalistic evaluation in the social comparison process.\u00a0<i>Journal of Personality and Social Psychology, 55<\/i>, 908\u2013917.\n\nMoore, M. T., &amp; Fresco, D. M. (2012). Depressive realism: A meta-analytic review.\u00a0<i>Clinical Psychology Review<\/i>,\u00a0<i>32<\/i>(6), 496-509. doi:10.1016\/j.cpr.2012.05.004\n\nOskamp, S. (1965). Overconfidence in case-study judgments.\u00a0<em>Journal of Consulting Psychology, 29(3)<\/em>, 261-265.\n\nPezdek, K., Blandon-Gitlin, I., &amp; Moore, C. (2003). Children\u2019s face recognition memory: More evidence for the cross-race effect.\u00a0<i>Journal of Applied Psychology, 88<\/i>(4), 760\u2013763.\n\nPoole, D. A., &amp; Lamb, M. E. (1998).\u00a0<i>The development of interview protocols<\/i>. Washington, DC: American Psychological Association.\n\nPope, H. G., Jr., Poliakoff, M. B., Parker, M. P., Boynes, M., &amp; Hudson, J. I. (2007). 
Is dissociative amnesia a culture-bound syndrome? Findings from a survey of historical literature.\u00a0<i>Psychological Medicine: A Journal of Research in Psychiatry and the Allied Sciences, 37<\/i>(2), 225\u2013233.\n\nPozzulo, J. D., &amp; Lindsay, R. C. L. (1998). Identification accuracy of children versus adults: A meta-analysis.\u00a0<i>Law and Human Behavior, 22<\/i>(5), 549\u2013570.\n\nReber, R., Winkielman, P., &amp; Schwarz, N. (1998). Effects of perceptual fluency on affective judgments.\u00a0<i>Psychological Science, 9<\/i>(1), 45\u201348.\n\nRoese, N. J. (1997). Counterfactual thinking.\u00a0<i>Psychological Bulletin, 121<\/i>(1), 133\u2013148.\n\nRoss, L., Greene, D., &amp; House, P. (1977). The false consensus effect: An egocentric bias in social perception and attribution processes.\u00a0<i>Journal of Experimental Social Psychology, 13<\/i>(3), 279\u2013301.\n\nRoss, M., &amp; Sicoly, F. (1979). Egocentric biases in availability and attribution.\u00a0<i>Journal of Personality and Social Psychology, 37<\/i>(3), 322\u2013336.\n\nSchwarz, N., &amp; Vaughn, L. A. (Eds.). (2002).\u00a0<i>The availability heuristic revisited: Ease of recall and content of recall as distinct sources of information<\/i>. New York, NY: Cambridge University Press.\n\nSchwarz, N., Bless, H., Strack, F., Klumpp, G., Rittenauer-Schatka, H., &amp; Simons, A. (1991). Ease of retrieval as information: Another look at the availability heuristic.\u00a0<i>Journal of Personality and Social Psychology, 61,<\/i>\u00a0195\u2013202.\n\nSharot, T. (2011).\u00a0<em>The optimism bias: A tour of the irrationally positive brain.\u00a0<\/em>New York: Pantheon Books.\n\nSloman, S. A. (Ed.). (2002).\u00a0<i>Two systems of reasoning<\/i>. New York, NY: Cambridge University Press.\n\nSlovic, P. (Ed.). (2000).\u00a0<i>The perception of risk<\/i>. London, England: Earthscan Publications.\n\nStanovich, K. E., &amp; West, R. F. (Eds.). 
(2002).\u00a0<i>Individual differences in reasoning: Implications for the rationality debate?<\/i>\u00a0New York, NY: Cambridge University Press.\n\nSteblay, N. M. (1997). Social influence in eyewitness recall: A meta-analytic review of lineup instruction effects.\u00a0<i>Law and Human Behavior, 21<\/i>(3), 283\u2013297.\n\nSteblay, N., Dysart, J., Fulero, S., &amp; Lindsay, R. C. L. (2001). Eyewitness accuracy rates in sequential and simultaneous lineup presentations: A meta-analytic comparison.\u00a0<i>Law and Human Behavior, 25<\/i>(5), 459\u2013473.\n\nTaylor, S. E., &amp; Fiske, S. T. (1978). Salience, attention and attribution: Top of the head phenomena.\u00a0<i>Advances in Experimental Social Psychology, 11,<\/i>\u00a0249\u2013288.\n\nToglia, M. P., Read, J. D., Ross, D. F., &amp; Lindsay, R. C. L. (Eds.). (2007).\u00a0<i>The handbook of eyewitness psychology<\/i>\u00a0(Vols. 1 &amp; 2). Mahwah, NJ: Lawrence Erlbaum Associates Publishers.\n\nTversky, A., &amp; Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability.\u00a0<i>Cognitive Psychology, 5<\/i>, 207\u2013232.\n\nTversky, A., &amp; Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases.\u00a0<i>Science, 185<\/i>(4157), 1124\u20131131.\n\nWells, G. L., &amp; Olson, E. A. (2003). Eyewitness testimony.\u00a0<i>Annual Review of Psychology, 54<\/i>, 277\u2013295.\n\nWells, G. L., Memon, A., &amp; Penrod, S. D. (2006). Eyewitness evidence: Improving its probative value.\u00a0<i>Psychological Science in the Public Interest, 7<\/i>(2), 45\u201375.\n\nWest, R. F., Meserve, R. J., &amp; Stanovich, K. E. (2012). Cognitive sophistication does not attenuate the bias blind spot.\u00a0<i>Journal Of Personality and Social Psychology<\/i>,\u00a0<i>103<\/i>(3), 506-519. doi:10.1037\/a0028857\n\nWillis, J., &amp; Todorov, A. (2006). 
First impressions: Making up your mind after a 100-Ms exposure to a face.\u00a0<i>Psychological Science, 17<\/i>(7), 592\u2013598.\n\nWinkielman, P., &amp; Cacioppo, J. T. (2001). Mind at ease puts a smile on the face: Psychophysiological evidence that processing facilitation elicits positive affect.\u00a0<i>Journal of Personality and Social Psychology, 81<\/i>(6), 989\u20131000.\n\nWinkielman, P., Schwarz, N., &amp; Nowak, A. (Eds.). (2002).\u00a0<i>Affect and processing dynamics: Perceptual fluency enhances evaluations<\/i>. Amsterdam, Netherlands: John Benjamins Publishing Company.\n\n<\/div>","rendered":"<div class=\"bcc-box bcc-highlight\">\n<h3>Learning Objectives<\/h3>\n<ol>\n<li>Provide examples of how salience and accessibility influence information processing.<\/li>\n<li>Review, differentiate, and give examples of some important cognitive heuristics that influence social judgment.<\/li>\n<li>Summarize and give examples of the importance of social cognition in everyday life.<\/li>\n<\/ol>\n<\/div>\n<p>Once we have developed a set of schemas and attitudes, we naturally use that information to help us evaluate and respond to others. Our expectations help us to think about, size up, and make sense of individuals, groups of people, and the relationships among people. If we have learned, for example, that someone is friendly and interested in us, we are likely to approach them; if we have learned that they are threatening or unlikable, we will be more likely to withdraw. And if we believe that a person has committed a crime, we may process new information in a manner that helps convince us that our judgment was correct. In this section, we will consider how we use our stored knowledge to come to accurate (and sometimes inaccurate) conclusions about our social worlds.<\/p>\n<h2>Automatic versus Controlled Cognition<\/h2>\n<p>A good part of both cognition and social cognition is spontaneous or automatic. 
<strong>Automatic cognition<\/strong> refers to<em> thinking that occurs out of our awareness, quickly, and without taking much effort<\/em> (Ferguson &amp; Bargh, 2003; Ferguson, Hassin, &amp; Bargh, 2008).\u00a0The things that we do most frequently tend to become more automatic each time we do them, until they reach a level where they don\u2019t really require us to think about them very much. Most of us can ride a bike and operate a television remote control in an automatic way. Even though it took some work to do these things when we were first learning them, it just doesn\u2019t take much effort anymore. And because we spend a lot of time making judgments about others, many of these judgments,\u00a0which are strongly influenced by our schemas, are made quickly and automatically (Willis &amp; Todorov, 2006).<\/p>\n<p>Because automatic thinking occurs outside of our conscious awareness, we frequently have no idea that it is occurring and influencing our judgments or behaviors. You might remember a time when you returned home, unlocked the door, and 30 seconds later couldn\u2019t remember where you had put your keys! You know that you must have used the keys to get in, and you know you must have put them somewhere, but you simply don\u2019t remember a thing about it. Because many of our everyday judgments and behaviors are performed automatically, we may not always be aware that they are occurring or influencing us.<\/p>\n<p>It is of course a good thing that many things operate automatically because it would be extremely difficult to have to think about them all the time. If you couldn\u2019t drive a car automatically, you wouldn\u2019t be able to talk to the other people riding with you or listen to the radio at the same time\u2014you\u2019d have to be putting most of your attention into driving. On the other hand, relying on our snap judgments about Bianca\u2014that she\u2019s likely to be expressive, for instance\u2014can be erroneous. 
Sometimes we need to\u2014and should\u2014go beyond automatic cognition and consider people more carefully. <em>When we deliberately size up and think about something, such as another person,\u00a0<\/em>we call it <strong>controlled cognition<\/strong>.\u00a0Although you might think that controlled cognition would be more common and that automatic thinking would be less likely, that is not always the case. The problem is that careful thinking takes time and effort, and we often don\u2019t have much of either to spare.<\/p>\n<p>In the following Research Focus, we consider an example of automatic cognition in a study that uses a common social cognitive procedure known as <strong>priming<\/strong>,\u00a0<em>a technique in which information is temporarily brought into memory through exposure to situational events, which\u00a0can then influence judgments entirely out of awareness.<\/em><\/p>\n<div class=\"textbox shaded\">\n<h3>Research Focus<\/h3>\n<p>Behavioral Effects of Priming<\/p>\n<p>In one demonstration of how automatic cognition can influence our behaviors without us being aware of them, John Bargh and his colleagues (Bargh, Chen, &amp; Burrows, 1996)\u00a0conducted two studies, each with the exact same procedure. In the experiments, they showed college students sets of five scrambled words. The students were to unscramble the five words in each set to make a sentence. Furthermore, for half of the research participants, the words were related to the stereotype of elderly people. These participants saw words such as \u201cin Florida retired live people\u201d and \u201cbingo man the forgetful plays.\u201d<\/p>\n<p>The other half of the research participants also made sentences but did so out of words that had nothing to do with the elderly stereotype. 
The purpose of this task was to prime (activate) the schema of elderly people in memory for some of the participants but not for others.<\/p>\n<p>The experimenters then assessed whether the priming of elderly stereotypes would have any effect on the students\u2019 behavior\u2014and indeed it did. When each research participant had gathered all his or her belongings, thinking that the experiment was over, the experimenter thanked him or her for participating and gave directions to the closest elevator. Then, without the participant knowing it, the experimenters recorded the amount of time that the participant spent walking from the doorway of the experimental room toward the elevator. As you can see in Figure 2.8, &#8220;Automatic Priming and Behavior,&#8221; the same results were found in both experiments\u2014the participants who had made sentences using words related to the elderly stereotype took on the behaviors of the elderly\u2014they walked significantly more slowly (in fact, about 12% more slowly across the two studies) as they left the experimental room.<\/p>\n<div id=\"attachment_2659\" style=\"width: 410px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-8.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2659\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165212\/Figure-2-8.png\" alt=\"Automatic priming and behaviour\" class=\"wp-image-2659\" height=\"147\" width=\"400\" \/><\/a><\/p>\n<p id=\"caption-attachment-2659\" class=\"wp-caption-text\">Figure 2.8 Automatic Priming and Behavior. 
In two separate experiments, Bargh, Chen, and Burrows (1996) found that students who had been exposed to words related to the elderly stereotype walked more slowly than those who had been exposed to more neutral words.<\/p>\n<\/div>\n<p>\u00a0<\/p>\n<p>To determine if these priming effects occurred out of the conscious awareness of the participants, Bargh and his colleagues asked a third group of students to complete the priming task and then to indicate whether they thought the words they had used to make the sentences had any relationship to each other or could possibly have influenced their behavior in any way. These students had no awareness of the possibility that the words might have been related to the elderly or could have influenced their behavior.<\/p>\n<p>The point of these experiments, and many others like them, is clear\u2014it is quite possible that our judgments and behaviors are influenced by our social situations, and this influence may be entirely outside of our conscious awareness. To return again to Bianca, it is even possible that we notice her nationality and that our beliefs about Italians influence our responses to her, even though we have no idea that they are doing so and really believe that they have not.<\/p>\n<\/div>\n<p>\u00a0<\/p>\n<h2>Salience and Accessibility Determine Which Expectations We Use<\/h2>\n<p>We each have a large number of schemas that we might bring to bear on any type of judgment we might make. When thinking about Bianca, for instance, we might focus on her nationality, her gender, her physical attractiveness, her intelligence, or any of many other possible features. And we will react to Bianca differently depending on which schemas we use. 
Schema activation is determined both by the salience<em>\u00a0<\/em>of the\u00a0characteristics of the person we are judging and by the current activation or cognitive accessibility<em>\u00a0<\/em>of the schema.<\/p>\n<h2>Salience<\/h2>\n<div id=\"attachment_922\" style=\"width: 360px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/05\/people.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-922\" class=\"wp-image-922\" alt=\"people\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165229\/people-1024x885.png\" height=\"303\" width=\"350\" \/><\/a><\/p>\n<p id=\"caption-attachment-922\" class=\"wp-caption-text\">Figure 2.9 Which of these people are more salient and therefore more likely to attract your attention?<br \/> Source: Man with a moustache (<a href=\"http:\/\/commons.wikimedia.org\/wiki\/File:Man_with_a_moustache,_Chambal,_India.jpg\">http:\/\/commons.wikimedia.org\/wiki\/File:Man_with_a_moustache,_Chambal,_India.jpg<\/a>) by yann used under CC BY-SA 3.0 (<a href=\"http:\/\/creativecommons.org\/licenses\/by-sa\/3.0\/deed.en\">http:\/\/creativecommons.org\/licenses\/by-sa\/3.0\/deed.en<\/a>). Jill Jackson (<a href=\"https:\/\/www.flickr.com\/photos\/kriskesiak\/6493819855\/\">https:\/\/www.flickr.com\/photos\/kriskesiak\/6493819855\/<\/a>) by Kris Kesiak used under CC BY-NC 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/<\/a>). Amelia earhart (<a href=\"http:\/\/en.wikipedia.org\/wiki\/File:Amelia_earhart.jpeg\">http:\/\/en.wikipedia.org\/wiki\/File:Amelia_earhart.jpeg<\/a>) in Public Domain (<a href=\"http:\/\/en.wikipedia.org\/wiki\/Public_domain\">http:\/\/en.wikipedia.org\/wiki\/Public_domain<\/a>). 
Ralph Lauren Photoshoot (<a href=\"https:\/\/www.flickr.com\/photos\/brandoncwarren\/2964734674\/\">https:\/\/www.flickr.com\/photos\/brandoncwarren\/2964734674\/<\/a>) by Brandon Warren used under CC BY-NC 2.0 (<a href=\"https:\/\/www.flickr.com\/photos\/brandoncwarren\/2964734674\/\">https:\/\/www.flickr.com\/photos\/brandoncwarren\/2964734674\/<\/a>). Wild Hair (<a href=\"http:\/\/en.wikipedia.org\/wiki\/File:Wild_hair.jpg\">http:\/\/en.wikipedia.org\/wiki\/File:Wild_hair.jpg<\/a>) by peter klashorst used under CC BY 2.0 (<a href=\"http:\/\/creativecommons.org\/licenses\/by\/2.0\/deed.en\">http:\/\/creativecommons.org\/licenses\/by\/2.0\/deed.en<\/a>)<\/p>\n<\/div>\n<p>One determinant of which schemas are likely to be used in social judgment is the extent to which we attend to particular features of the person or situation that we are responding to. We are more likely to judge people on the basis of characteristics of\u00a0salience, which<em>\u00a0<\/em>attract our attention when we see someone with them. For example, things that are unusual, negative, colorful, bright, and moving are more salient and thus more likely to be attended to than are things that do not have these characteristics (McArthur &amp; Post, 1977; Taylor &amp; Fiske, 1978).<\/p>\n<p>We are more likely to initially judge people on the basis of their sex, race, age, and physical attractiveness, rather than on, say, their religious orientation or their political beliefs, in part because these features are so salient when we see them (Brewer, 1988).\u00a0Another thing that makes something particularly salient is its infrequency or unusualness. If Bianca is from Italy and very few other people in our community are, that characteristic is something that we notice, it is salient, and we are therefore likely to attend to it. 
That she is also a woman is, at least in this context, less salient.<\/p>\n<div>\n<p>The salience of the stimuli in our social worlds may sometimes lead us to make judgments on the basis of information that is actually less informative than other, less salient information. Imagine, for instance, that you wanted to buy a new smartphone for yourself. You\u2019ve been trying to decide whether to get the iPhone or a rival product. You went online and checked out the reviews, and you found that although the phones differed on many dimensions, including price, battery life, and so forth, the rival product was nevertheless rated significantly higher by the owners than was the iPhone. As a result, you decide to go and purchase one the next day. That night, however, you go to a party, and a friend of yours shows you her iPhone. You check it out, and it seems really great. You tell her that you were thinking of buying a rival product, and she tells you that you are crazy. She says she knows someone who had one and had a lot of\u00a0<span style=\"line-height: 1.5em\">problems\u2014it didn\u2019t download music properly, the battery died right after the warranty was up, and so forth\u2014and that she would never buy one. Would you still buy it, or would you switch your plans? \u00a0<\/span><\/p>\n<\/div>\n<p><span style=\"line-height: 1.5em\">If you think about this question logically, the information that you just got from your friend isn\u2019t really all that important; you now know the opinions of one more person, but that can\u2019t really change the overall consumer ratings of the two phones very much. On the other hand, the information your friend gives you and the chance to use her iPhone are highly salient. The information is right there in front of you, in your hand, whereas the statistical information from reviews<\/span><span style=\"line-height: 1.5em\">\u00a0is only in the form of a table that you saw on your computer. 
The outcome in cases such as this is that people frequently ignore the less salient, but more important, information, such as <\/span><em>the likelihood that events occur across a large population<\/em><em style=\"line-height: 1.5em\">,\u00a0<\/em><span style=\"line-height: 1.5em\">known as <strong>base rates<em>,\u00a0<\/em><\/strong>in favor of the actually less important, but nevertheless more salient, information<em>.<\/em><\/span><\/p>\n<p>Another case in which we ignore base-rate information occurs when we use the <strong>representativeness heuristic, <\/strong>which occurs <em>when we base our judgments on information that seems to represent, or match, what we expect will happen, while ignoring more informative base-rate information.<\/em> Consider, for instance, the following puzzle. Let\u2019s say that you went to a hospital this week, and you checked the records of the babies that were born on that day (<a href=\"#Table2-2\">Table 2.2, &#8220;Using the Representativeness Heuristic&#8221;<\/a>). 
Which pattern of births do you think that you are most likely to find?<br \/>\n<a id=\"Table2-2\"><br \/>\nTable 2.2 Using the Representativeness Heuristic<\/p>\n<table>\n<thead>\n<tr>\n<th colspan=\"2\">\n<h2>List A<\/h2>\n<\/th>\n<th colspan=\"2\">\n<h2>List B<\/h2>\n<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>6:31 a.m.<\/td>\n<td>Girl<\/td>\n<td>6:31 a.m.<\/td>\n<td>Boy<\/td>\n<\/tr>\n<tr>\n<td>8:15 a.m.<\/td>\n<td>Girl<\/td>\n<td>8:15 a.m.<\/td>\n<td>Girl<\/td>\n<\/tr>\n<tr>\n<td>9:42 a.m.<\/td>\n<td>Girl<\/td>\n<td>9:42 a.m.<\/td>\n<td>Boy<\/td>\n<\/tr>\n<tr>\n<td>1:13 p.m.<\/td>\n<td>Girl<\/td>\n<td>1:13 p.m.<\/td>\n<td>Girl<\/td>\n<\/tr>\n<tr>\n<td>3:39 p.m.<\/td>\n<td>Boy<\/td>\n<td>3:39 p.m.<\/td>\n<td>Girl<\/td>\n<\/tr>\n<tr>\n<td>5:12 p.m.<\/td>\n<td>Boy<\/td>\n<td>5:12 p.m.<\/td>\n<td>Boy<\/td>\n<\/tr>\n<tr>\n<td>7:42 p.m.<\/td>\n<td>Boy<\/td>\n<td>7:42 p.m.<\/td>\n<td>Girl<\/td>\n<\/tr>\n<tr>\n<td>11:44 p.m.<\/td>\n<td>Boy<\/td>\n<td>11:44 p.m.<\/td>\n<td>Boy<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>Most people think that List B is more likely, probably because it\u00a0looks more random and thus matches (is \u201crepresentative of\u201d) our ideas about randomness. But statisticians know that any specific sequence of eight births is equally likely (each has a probability of (1\/2)<sup>8<\/sup>), and thus that List B is no more likely than List A. The problem is that we have an image of what randomness should be, which doesn\u2019t always match what is rationally the case. Similarly, people who see a coin that comes up heads five times in a row will frequently predict (and perhaps even bet!) that tails will be next\u2014it just seems like it has to be. 
But mathematically, this erroneous expectation (known as the gambler\u2019s fallacy) is simply not true: the base-rate likelihood of any single coin flip being tails is still 50%, regardless of how many times the coin has come up heads in the past.<\/p>\n<p>To take one more example, consider the following information:<\/p>\n<p>I have a friend who is analytical, argumentative, and involved in community activism. Which of the following is she? (Choose one.)<\/p>\n<p>\u2014A lawyer<\/p>\n<p>\u2014A salesperson<\/p>\n<p>Can you see how you might be led, potentially incorrectly, into thinking that my friend is a lawyer? Why? The description (\u201canalytical, argumentative, and involved in community activism\u201d) just seems more representative of our stereotype of lawyers than of salespeople. But the base rates tell us something completely different, which should make us wary of that conclusion. Simply put, salespeople greatly outnumber lawyers in society, and thus statistically it is far more likely that she is a salesperson. Nevertheless, the representativeness heuristic will often cause us to overlook such important information. One unfortunate consequence of this is that it can contribute to the maintenance of stereotypes. If someone you meet seems, superficially at least, to represent the stereotypical characteristics of a social group, you may incorrectly classify that person\u00a0as a member of that group, even when it is highly likely that he or she is\u00a0not.<\/p>\n<h2>Cognitive Accessibility<\/h2>\n<p>Although the characteristics that we use to think about objects or people are determined in part by their salience, individual differences in the person who is doing the judging are also important. People vary in the type of schemas that they tend to use when judging others and when thinking about themselves. One way to consider this is in terms of the <strong>cognitive accessibility<\/strong> of the schema. 
Cognitive accessibility refers to<em> the extent to which a schema is activated in memory and thus likely to be used in information processing.<\/em> Simply put, the schemas we typically use are often those that are most accessible to us.<\/p>\n<p>You probably know people who are football nuts (or maybe tennis or some other sport nuts). All they can talk about is football. For them, we would say that football is a highly accessible construct. Because they love football, it is important to their self-concept; they set many of their goals in terms of the sport, and they tend to think about things and people in terms of it (\u201cIf he plays or watches football, he must be okay!\u201d). Other people have highly accessible schemas about eating healthy food, exercising, environmental issues, or really good coffee, for instance. In short, when a schema is accessible, we are likely to use it to make judgments of ourselves and others.<\/p>\n<p>Although accessibility can be considered a person variable (a given idea is more highly accessible for some people than for others), accessibility can also be influenced by situational factors. When we have recently or frequently thought about a given topic, that topic becomes more accessible and is likely to influence our judgments. This is in fact a potential explanation for the results of the priming study you read about earlier\u2014people walked more slowly because the concept of the elderly had been primed and thus was currently highly accessible for them.<\/p>\n<p>Because we rely so heavily on our schemas and attitudes, and particularly on those that are salient and accessible, we can sometimes be overly influenced by them. Imagine, for instance, that I asked you to close your eyes and determine whether there are more words in the English language that begin with the letter <em>R<\/em> or that have the letter <em>R<\/em> as the third letter. 
You would probably try to solve this problem by thinking of words that have each of the characteristics. It turns out that most people think there are more words that begin with <em>R<\/em>, even though there are in fact more words that have <em>R<\/em> as the third letter.<\/p>\n<p>You can see that this error can occur as a result of cognitive accessibility. To answer the question, we naturally try to think of all the words that we know that begin with <em>R<\/em> and that have <em>R<\/em> in the third position. The problem is that when we do that, it is much easier to retrieve the former than the latter, because we store words by their first, not by their third, letter. We may also think that our friends are nice people because we see them primarily when they are around us (their friends). And the traffic might seem worse in our own neighborhood than we think it is in other places, in part because nearby traffic jams are more accessible for us than are traffic jams that occur somewhere else. And do you think it is more likely that you will be killed in a plane crash or in a car crash? Many people fear the former, even though the latter is much more likely: statistically, your chances of being killed in an aircraft accident are far lower than your chances of being killed in an automobile accident. In this case, the problem is that plane crashes, which are highly salient, are more easily retrieved from our memory than are car crashes, which often receive far less media coverage.<\/p>\n<p><em>The tendency to make judgments of the frequency of an event, or the likelihood that an event will occur, on the basis of the ease with which the event can be retrieved from memory <\/em>is known as the <strong>availability heuristic<\/strong> (Schwarz &amp; Vaughn, 2002; Tversky &amp; Kahneman, 1973).\u00a0The idea is that things that are highly accessible (in this case, the term <em>availability<\/em> is used) come to mind easily and thus may overly influence our judgments. 
Thus, despite the clear facts, it may be easier to think of plane crashes than of car crashes because the former are more accessible. If so, the availability heuristic can lead to errors in judgments.<\/p>\n<p>For example, as people tend to overestimate the risk of rare but dramatic events, including plane crashes and terrorist attacks, their responses to these estimations may not always be proportionate to the true risks. For instance, it has been widely documented that fewer\u00a0people chose to use air travel in the aftermath of the\u00a0September 11, 2001 (9\/11), terrorist attacks on the World Trade Center, particularly in the United States. Correspondingly, many individuals chose other methods of travel, often electing to drive rather than fly to their destination. Statistics across all regions of the world confirm that driving is far more dangerous than flying, and this prompted the cognitive psychologist Gerd Gigerenzer to estimate how many extra deaths the increased road traffic following 9\/11 might have caused. He arrived at an estimate of around an additional 1,500 road deaths in the United States alone in the year following those terrorist attacks, which was six times the number of people killed on the airplanes on September 11, 2001 (Gigerenzer, 2006).<\/p>\n<p>Another way that the cognitive accessibility of constructs can influence information processing is through its effects on <strong>processing fluency<\/strong>. Processing fluency refers to<em> the ease with which we can process information in our environments.<\/em> When stimuli are highly accessible, they can be quickly attended to and processed, and they therefore have a large influence on our perceptions. 
This influence is due, in part, to the fact that we often react positively to information that we can process quickly, and we use this positive response as a basis of judgment (Reber, Winkielman, &amp; Schwarz, 1998; Winkielman &amp; Cacioppo, 2001).<\/p>\n<p>In one study demonstrating this effect, Norbert Schwarz and his colleagues (Schwarz et al., 1991)\u00a0asked one set of college students to list six\u00a0occasions when they had acted either assertively or unassertively, and asked another set of college students to list 12 such examples. Schwarz determined that for most students, it was pretty easy to list six examples but pretty hard to list 12.<\/p>\n<p>The researchers then asked the participants to indicate how assertive or unassertive they actually were. You can see from <\/a><a href=\"#Figure2-10\">Figure 2.10, &#8220;Processing Fluency,&#8221;<\/a> that the ease of processing influenced judgments. The participants who had an easy time listing examples of their behavior (because they only had to list six instances) judged that they did in fact have the characteristics they were asked about (either assertive or unassertive), in comparison with the participants who had a harder time doing the task (because they had to list 12 instances). Other research has found similar effects\u2014people report that they ride their bicycles more often after they have been asked to recall only a few rather than many instances of doing so (Aarts &amp; Dijksterhuis, 1999),\u00a0and they hold an attitude with more confidence after being asked to generate few rather than many arguments that support it (Haddock, Rothman, Reber, &amp; Schwarz, 1999). 
Sometimes less really is more!<br \/>\n<a id=\"Figure2-10\"><\/p>\n<div id=\"attachment_2661\" style=\"width: 360px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-10.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2661\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165233\/Figure-2-10.png\" alt=\"Processing Fluency\" class=\"wp-image-2661\" height=\"140\" width=\"350\" \/><\/a><\/p>\n<p id=\"caption-attachment-2661\" class=\"wp-caption-text\">Figure 2.10 Processing Fluency. When it was relatively easy to complete the questionnaire (only six examples were required), the student participants rated that they had more of the trait than when the task was more difficult (12 answers were required). Data are from Schwarz et al. (1991).<\/p>\n<\/div>\n<p>\u00a0<\/p>\n<p>Echoing the findings mentioned earlier in relation to schemas,\u00a0we are likely to use this type of quick and \u201cintuitive\u201d processing, based on our feelings about how easy it is to complete a task, when we don\u2019t have much time or energy for more in-depth processing, such as when we are under time pressure, tired, or unwilling to process the stimulus in sufficient detail. 
Of course, it is very adaptive to respond to stimuli quickly (Sloman, 2002; Stanovich &amp; West, 2002; Winkielman, Schwarz, &amp; Nowak, 2002),\u00a0and in at least some cases we may even be better off making decisions based on our initial responses than on a more thoughtful cognitive analysis (Loewenstein, Weber, Hsee, &amp; Welch, 2001).\u00a0For instance, Dijksterhuis, Bos, Nordgren, and van Baaren (2006)\u00a0found that when participants were given tasks requiring decisions that were very difficult to make on the basis of a cognitive analysis of the problem, they made better decisions when they didn\u2019t try to analyze the details carefully but simply relied on their intuitions.<\/p>\n<p>In sum, people are influenced not only by the information they get but also by how they get it. We are more highly influenced by things that are salient and accessible and thus easily attended to, remembered, and processed. On the other hand, information that is harder to access from memory, is less likely to be attended to, or takes more effort to consider is less likely to be used in our judgments, even if this information is statistically more informative.<\/p>\n<h2>The False Consensus Bias Makes Us Think That Others Are More Like Us Than They Really Are<\/h2>\n<p>The tendency to base our judgments on the accessibility of social constructs can lead to still other errors in judgment.\u00a0One such error is known as the <strong>false consensus bias<\/strong>,<em>\u00a0the tendency to overestimate the extent to which other people hold similar views to our own<\/em>. As our own beliefs are highly accessible to us, we tend to rely on them too heavily when asked to predict those of others. 
For instance, if you are in favor of abortion rights and opposed to capital punishment, then you are likely to think that most other people share these beliefs (Ross, Greene, &amp; House, 1977).\u00a0In one demonstration of the false consensus bias, Joachim Krueger and his colleagues (Krueger &amp; Clement, 1994)\u00a0gave their research participants, who were college students, a personality test. Then they asked the same participants to estimate the percentage of other students in their school who would have answered the questions the same way that they did. The students who agreed with the items often thought that others would agree with them too, whereas the students who disagreed typically believed that others would also disagree. A bias closely related to the false consensus effect is the\u00a0<strong>projection bias<\/strong><em>, <\/em>which is\u00a0<em>the tendency to assume that others share our cognitive and affective states<\/em>\u00a0(Hsee, Hastie, &amp; Chen, 2008).<\/p>\n<p>With regard to our chapter case study, the false consensus effect has also been implicated in the potential causes of the 2008 financial collapse. When investor behavior is considered within its social context, an important part of sound decision making is the ability to predict other investors&#8217; intentions and behaviors, as this helps to foresee potential market trends. In this context, Egan, Merkle, and Weber (in press) outline how the false consensus effect can lead investors to overestimate the extent to which other investors share their judgments about the likely trends, which can in turn lead them to make inaccurate predictions of those investors&#8217; behavior, with dire economic consequences.<em><br \/>\n<\/em><\/p>\n<p>Although it is commonly observed, the false consensus bias does not occur on all dimensions. Specifically, the false consensus bias is not usually observed on judgments of positive personal traits that we value highly. 
People (falsely, of course) report that they have better personalities (e.g., a better sense of humor), that they engage in better behaviors (e.g., they are more likely to wear seatbelts), and that they have brighter futures than almost everyone else (Chambers, 2008).\u00a0These results suggest that although in most cases we assume that we are similar to others, in cases of valued personal characteristics the goals of self-concern lead us to see ourselves more positively than we see the average person. There are some important cultural differences here, though, with members of collectivist cultures typically showing less of this type of self-enhancing bias than those from individualistic cultures (Heine, Lehman, Markus, &amp; Kitayama, 1999).<\/p>\n<h2>Perceptions of What \u201cMight Have Been\u201d Lead to Counterfactual Thinking<\/h2>\n<p>In addition to influencing our judgments about ourselves and others, the salience and accessibility of information can have an important effect on our own emotions and self-esteem. Our emotional reactions to events are often colored not only by what did happen but also by what <em>might have<\/em> happened. If we can easily imagine an outcome that is better than what actually happened, then we may experience sadness and disappointment; on the other hand, if we can easily imagine that a result might have been worse than what actually happened, we may be more likely to experience happiness and satisfaction. <em>The tendency to think about events according to what might have been <\/em>is known as <strong>counterfactual thinking<\/strong> (Roese, 1997).<\/p>\n<p>Imagine, for instance, that you were participating in an important contest, and you won the silver medal. How would you feel? Certainly you would be happy that you won, but wouldn\u2019t you probably also be thinking a lot about what might have happened if you had been just a little bit better\u2014you might have won the gold medal! 
On the other hand, how might you feel if you won the bronze medal (third place)? If you were thinking about the counterfactual (the \u201cwhat might have been\u201d), perhaps the idea of not getting any medal at all would have been highly accessible and so you\u2019d be happy that you got the medal you did get.<\/p>\n<p>Medvec, Madey, and Gilovich (1995)\u00a0investigated exactly this idea by videotaping the responses of athletes who won medals in the 1992 summer Olympic Games. They videotaped the athletes both as they learned that they had won a silver or a bronze medal and again as they were awarded the medal. Then they showed these videos, without any sound, to people who did not know which medal which athlete had won. The raters indicated how they thought the athlete was feeling, on a range from \u201cagony\u201d to \u201cecstasy.\u201d The results showed that the bronze medalists did indeed seem to be, on average, happier than were the silver medalists. Then, in a follow-up study, raters watched interviews with many of these same athletes as they talked about their performance. The raters indicated what we would expect on the basis of counterfactual thinking. 
The silver medalists often talked about their disappointments in having finished second rather than first, whereas the bronze medalists tended to focus on how happy they were to have finished third rather than fourth.<\/p>\n<div id=\"attachment_2662\" style=\"width: 410px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-11.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2662\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165238\/Figure-2-11.png\" alt=\"Olympic Medalists\" class=\"wp-image-2662\" height=\"265\" width=\"400\" \/><\/a><\/p>\n<p id=\"caption-attachment-2662\" class=\"wp-caption-text\">Figure 2.11 Does the bronze medalist look happier to you than the silver medalist? Medvec, Madey, and Gilovich (1995) found that, on average, bronze medalists were happier than silver medalists.<br \/> Source: Tina Maze Andrea Fischbacher and Lindsey Vonn by Duncan Rawlinson (<a href=\"https:\/\/www.flickr.com\/photos\/44124400268@N01\/4374497787\">https:\/\/www.flickr.com\/photos\/44124400268@N01\/4374497787<\/a>) used under CC BY-NC 2.0 license (<a href=\"https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/<\/a>)<\/p>\n<\/div>\n<p>Counterfactual thinking seems to be part of the human condition and has even been studied in numerous other social settings, including juries. 
For example, people who were asked to award monetary damages to others who had been in an accident offered them substantially more in compensation if the injury had almost been avoided than if the accident seemed more inevitable (Miller, Turnbull, &amp; McFarland, 1988).<\/p>\n<p>Again, the moral of the story regarding the importance of cognitive accessibility is clear\u2014in the case of counterfactual thinking, the accessibility of the potential alternative outcome can lead to some seemingly paradoxical effects.<\/p>\n<h2>Anchoring and Adjustment Lead Us to Accept Ideas That We Should Revise<\/h2>\n<p>In some cases, we may be aware of the danger of acting on our expectations and attempt to adjust for them. Perhaps you have been in a situation where you are beginning a course with a new professor and you know that a good friend of yours does not like him. You may be thinking that you want to go beyond your negative expectation and prevent this knowledge from biasing your judgment. However,<em> the accessibility of the initial information frequently prevents this adjustment from occurring\u2014leading us to weight initial information\u00a0too heavily and thereby insufficiently move our judgment away from it. <\/em>This is called the problem of <strong>anchoring and adjustment<\/strong><em>.\u00a0<\/em><\/p>\n<p>Tversky and Kahneman (1974)\u00a0asked some of the student participants in one of their studies of anchoring and adjustment to solve this multiplication problem quickly and without using a calculator:<\/p>\n<p>1 \u00d7 2 \u00d7 3 \u00d7 4 \u00d7 5 \u00d7 6 \u00d7 7 \u00d7 8<\/p>\n<p>They asked other participants to solve this problem:<\/p>\n<p>8 \u00d7 7 \u00d7 6 \u00d7 5 \u00d7 4 \u00d7 3 \u00d7 2 \u00d7 1<\/p>\n<p>They found that students who saw the first problem gave an estimated answer of about 512, whereas the students who saw the second problem estimated about 2,250. 
Tversky and Kahneman argued that the students couldn\u2019t solve the whole problem in their head, so they did the first few multiplications and then used the outcome of this preliminary calculation as their starting point, or anchor. Then the participants used their starting estimate to find an answer that sounded plausible. In both cases, the estimates were too low relative to the true value of the product (which is 40,320)\u2014but the first set of guesses were even lower because they started from a lower anchor.<\/p>\n<p>Interestingly, the tendency to anchor on initial information seems to be sufficiently strong that in some cases, people will do so even when the anchor is clearly irrelevant to the task at hand. For example, Ariely, Loewenstein, and Prelec (2003)\u00a0asked students \u00a0to bid on items in an auction after having noted the last two digits of their social security numbers. They then asked the students to generate and write down a hypothetical price for each of the auction items, based on these numbers. \u00a0If the last two digits were 11, then the bottle of wine, for example, was priced at $11. If the two numbers were 88, the textbook was $88. After they wrote down this initial, arbitrary price, they then had to bid for the item. People with high numbers bid up to 346% more than those with low ones! Ariely, reflecting further on these findings, concluded that the \u201cSocial security numbers were the anchor in this experiment only because we requested them. We could have just as well asked for the current temperature or the manufacturer\u2019s suggested retail price. Any question, in fact, would have created the anchor. Does that seem rational? Of course not\u201d (2008, p. 26). 
A rather startling conclusion from the effect of arbitrary, irrelevant anchors on our judgments is that we will often grab hold of any available information to guide our judgments, regardless of whether it is actually germane to the issue.<\/p>\n<p>Of course, savvy marketers have long used the anchoring phenomenon to help them. You might not be surprised to hear that people buy more products when they are listed as four for $1.00 than when they are listed as $0.25 each (leading people to anchor on the four and perhaps adjust only a bit away).\u00a0And it is no accident that a car salesperson always starts negotiating with a high price and then works down. The salesperson is trying to get the consumer anchored on the high price, with the hope that it will have a big influence on the final sale value.<\/p>\n<h2>Overconfidence<\/h2>\n<p>Still another potential judgmental bias, and one that has powerful and often negative effects on our judgments, is the <strong>overconfidence<\/strong> <strong>bias<\/strong><em>,<\/em> <em>a\u00a0tendency to overestimate our own skills, abilities, and judgments<\/em>. We often have little awareness of our own limitations, leading us to act as if we are more certain about things than we should be, particularly on tasks that are difficult. Adams and Adams (1960)\u00a0found that for words that were difficult to spell, people were correct in spelling them only about 80% of the time, even though they indicated that they were \u201c100% certain\u201d that they were correct. David Dunning and his colleagues (Dunning, Griffin, Milojkovic, &amp; Ross, 1990)\u00a0asked college students to predict how another student would react in various situations. Some participants made predictions about a fellow student whom they had just met and interviewed, and others made predictions about their roommates. 
In both cases, participants reported their confidence in each prediction, and accuracy was determined by the responses of the target persons themselves. The results were clear: regardless of whether they judged a stranger or a roommate, the students consistently overestimated the accuracy of their own predictions (<a href=\"#Figure2-12\">Figure <\/a><span style=\"text-decoration: underline\">2.12<\/span>).<br \/>\n<a id=\"Figure2-12\"><\/p>\n<div id=\"attachment_2663\" style=\"width: 410px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-12.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2663\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165240\/Figure-2-12.png\" alt=\"Overconfidence\" class=\"wp-image-2663\" height=\"159\" width=\"400\" \/><\/a><\/p>\n<p id=\"caption-attachment-2663\" class=\"wp-caption-text\">Figure \u00a02.12 Dunning and colleagues\u00a0(1990) found that, regardless of whether they were judging strangers or their roommates, students were overconfident. The percentage confidence that they assigned to their own predictions was significantly higher than the actual percentage of their predictions that were correct.<\/p>\n<\/div>\n<p>Making matters even worse, Kruger and Dunning (1999)\u00a0found that people who scored low rather than high on tests of spelling, logic, grammar, and humor appreciation were also most likely to show overconfidence by overestimating how well they would do. Apparently, poor performers are doubly cursed\u2014they not only are unable to predict their own skills but also are the most unaware that they can\u2019t do so (Dunning, Johnson, Ehrlinger, &amp; Kruger, 2003).<\/p>\n<p>The tendency to be overconfident in our judgments can have some very negative effects. 
When eyewitnesses testify in courtrooms regarding their memories of a crime, they often are completely sure that they are identifying the right person. But their confidence doesn\u2019t correlate much with their actual accuracy. This is, in part, why so many people have been wrongfully convicted on the basis of inaccurate eyewitness testimony given by overconfident witnesses (Wells &amp; Olson, 2003). Overconfidence can also spill over into professional judgments, for example, in clinical psychology (Oskamp, 1965) and in market investment and trading (Chen, Kim, Nofsinger, &amp; Rui, 2007). Indeed, in regard to our case study at the start of this chapter, the role of the overconfidence bias in the financial crisis of 2008 and its aftermath has been well documented (Abbes, 2012).<\/p>\n<p>This overconfidence also often seems to apply to social judgments about the future in general. A pervasive\u00a0<strong>optimistic bias<\/strong> has been noted in members of many cultures (Sharot, 2011); it can be defined as<em> a tendency to believe that positive outcomes are more likely to happen than negative ones, particularly in relation to ourselves versus others.<\/em> Importantly, this optimism is often unwarranted. Most people, for example, underestimate their risk of experiencing negative events like divorce and illness, and overestimate the likelihood of positive ones, including gaining a promotion at work or living to a ripe old age (Schwarzer, 1994). There is some evidence of diversity in optimism across different groups, however. People in collectivist cultures tend not to show this bias to the same extent as those living in individualistic ones (Chang, Asakawa, &amp; Sanna, 2001).
Moreover, individuals who have clinical depression have been shown to exhibit a phenomenon termed\u00a0<strong>depressive realism<\/strong><em>,\u00a0<\/em>whereby their<em> social judgments about the future are less positively skewed and often more accurate than those of people who do not have depression<\/em> (Moore &amp; Fresco, 2012).<\/p>\n<p>The optimistic bias can also extend into the <strong>planning fallacy<\/strong><em>,\u00a0<\/em>defined as <em>a tendency to\u00a0overestimate the amount that we can accomplish over a particular time frame.<\/em>\u00a0This fallacy can also entail the underestimation of the resources and costs involved in completing a task or project, as anyone who has attempted to budget for home renovations can probably attest. Everyday examples of the planning fallacy abound, in everything from the completion of course assignments to the construction of new buildings. On a grander scale, news stories in any country hosting a major sporting event, such as the Olympics or the soccer World Cup, always seem to feature spiralling budgets and overrunning timelines as the event approaches.<\/p>\n<p>Why is the planning fallacy so persistent? Several factors appear to be at work here. Buehler, Griffin, and Peetz (2010) argue that when planning projects, individuals orient to the future and pay too little attention to their past relevant experiences. This can cause them to overlook previous occasions where they experienced difficulties and overruns. They also tend to plan only for the time and resources that will be needed if everything goes as planned. That is, they do not spend enough time thinking about all the things that might go wrong, for example, all the unforeseen demands on their time and resources that may occur during the completion of the task. Worryingly, the planning fallacy seems to be even stronger for tasks where we are highly motivated and invested in timely completion.
It appears that wishful thinking is often at work here (Buehler et al., 2010).\u00a0For some further perspectives on the advantages and disadvantages of the optimism bias, see this engaging TED Talk by Tali Sharot at:\u00a0<a target=\"_blank\" href=\"http:\/\/www.ted.com\/talks\/tali_sharot_the_optimism_bias\">http:\/\/www.ted.com\/talks\/tali_sharot_the_optimism_bias<\/a><\/p>\n<p>If these biases related to overconfidence can at least sometimes lead us to inaccurate social judgments, a key question is why they are so pervasive. What functions do they serve? One possibility is that they help to enhance people&#8217;s motivation and self-esteem. If we have a positive view of our abilities and judgments, and are confident that we can execute tasks to deadlines, we will be more likely to attempt challenging projects and to put ourselves forward for demanding opportunities. Moreover, there is consistent evidence that a mild degree of optimism can predict a range of positive outcomes, including success and even physical health (Forgeard &amp; Seligman, 2012).<\/p>\n<h2>The Importance of Cognitive Biases in Everyday Life<\/h2>\n<p>In our review of some of the many cognitive biases that affect our social judgment, we have seen that the effects on us as individuals range from fairly trivial decisions (for example, which phone to buy, which perhaps\u00a0doesn&#8217;t seem so trivial at the time) to potentially life-and-death decisions (about methods of travel, for instance).<\/p>\n<p>However, when we consider that many of these errors will affect not only us but also everyone around us, their consequences can really add up. Why would so many people continue to buy lottery tickets or to gamble their money in casinos when the likelihood of their ever winning is so low? One possibility, of course, is the representativeness heuristic\u2014people ignore the low base rates of winning and focus their attention on the salient likelihood of winning a huge prize.
And the belief in astrology, which all scientific evidence suggests is not accurate, is probably driven in part by the salience of the occasions when the predictions do occur\u2014when a horoscope is correct (which it will of course sometimes be), the correct prediction is highly salient and may allow people to maintain the (overall false) belief as they recollect confirming evidence more readily.<\/p>\n<p>People may also take more care to prepare for unlikely events than for more likely ones because the unlikely ones are more salient or accessible. For instance, people may think that they are more likely to die from a terrorist attack or as the result of a homicide than they are from diabetes, stroke, or tuberculosis. But the odds are much greater of dying from the health problems than from terrorism or homicide. Because people don\u2019t accurately calibrate their behaviors to match the true potential risks, the individual and societal costs are quite large (Slovic, 2000).<\/p>\n<p>As well as influencing our judgments relating to ourselves, salience and accessibility also color how we perceive our social worlds, which may have a big influence on our behavior. For instance, people who watch a lot of violent television shows also tend to view the world as more dangerous in comparison to those who watch less violent TV (Doob &amp; Macdonald, 1979).\u00a0This follows from the idea that our judgments are based on the accessibility of relevant constructs. We also overestimate our contribution to joint projects (Ross &amp; Sicoly, 1979),\u00a0perhaps in part because our own contributions are so obvious and salient, whereas the contributions of others are much less so. And the use of cognitive heuristics can even affect our views about global warming. 
Joireman, Barnes, Truelove, and Duell (2010)\u00a0found that people were more likely to believe in the existence of global warming when they were asked about it on hotter rather than colder days and when they had first been primed with words relating to heat. Thus the principles of salience and accessibility, because they are such an important part of our social judgments, can create a series of biases that can make a difference on a truly global level.<\/p>\n<p>As we have already seen specifically in relation to overconfidence, research has found that even people who should know better\u2014and who need to know better\u2014are subject to cognitive biases in general. Economists, stock traders, managers, lawyers, and even doctors have been found to make the same kinds of mistakes in their professional activities that people make in their everyday lives (Byrne &amp; McEleney, 2000; Gilovich, Griffin, &amp; Kahneman, 2002; Hilton, 2001).\u00a0And the use of cognitive heuristics is increased when people are under time pressure (Kruglanski &amp; Freund, 1983)\u00a0or when they feel threatened (Kassam, Koslov, &amp; Mendes, 2009),\u00a0exactly the situations that\u00a0often occur when professionals are required to make their decisions.<\/p>\n<h2>Biased About Our Biases: The Bias Blind Spot<\/h2>\n<p>So far, we have discussed some of the most important and heavily researched social cognitive biases that affect our appraisals of ourselves in relation to our social worlds and noted some of their key limitations. Recently, some social psychologists have become interested in how aware we are of these biases and of the ways in which they can affect our own and others&#8217; thinking. The short answer to this is that we often underestimate the extent to which our social cognition is biased, and that we typically (incorrectly) believe that we are less biased than the average person.
Researchers have named this<em> tendency to believe that our own judgments are less susceptible to the influence of bias than those of others <\/em>the <strong>bias blind spot<\/strong> (Ehrlinger, Gilovich, &amp; Ross, 2005). Interestingly, the level of bias blind spot that people demonstrate is unrelated to the actual amount of bias they show in their social judgments (West, Meserve, &amp; Stanovich, 2012). Moreover, those scoring higher in cognitive ability actually tend to show a larger bias blind spot (West et al., 2012).<\/p>\n<p>So, if our social cognition appears to be riddled with multiple biases, and we tend to show biases about these biases, what hope is there for us in reaching sound social judgments? Before we arrive at such a pessimistic conclusion, it is important to redress the balance of evidence a little. Perhaps just learning more about these biases, as we have done in this chapter, can help us to recognize when they are likely to be useful to our social judgments, and to take steps to reduce their effects when they hinder our understanding of our social worlds. Although many of the biases discussed tend to persist even in the face of our awareness, learning about them could at the very least be an important first step toward reducing their unhelpful effects on our social cognition. In order to get reliably better at policing our biases, though, we probably need to go further. One of the world&#8217;s foremost authorities on social cognitive biases, Nobel Laureate Daniel Kahneman, certainly thinks so. He argues that individual awareness of biases is an important precursor to the development of a common vocabulary about them, which will then make us better able as communities to discuss their effects on our social judgments (Kahneman, 2011).
Kahneman also asserts that we may be more likely to recognize and challenge bias in each other&#8217;s thinking than in our own, an observation that certainly fits with the concept of the bias blind spot. Perhaps, even if we cannot effectively police our thinking on our own, we can help to police one another&#8217;s.<\/p>\n<p>These arguments are consistent with some evidence that, although mere awareness is rarely enough to significantly attenuate the effects of bias, it can be helpful when accompanied by systematic cognitive retraining.\u00a0Many social psychologists and other scientists are working to help people make better decisions. One possibility is to provide people with better feedback. Weather forecasters, for instance, are quite accurate in their decisions (at least in the short-term), in part because they are able to learn from the clear feedback that they get about the accuracy of their predictions. Other research has found that accessibility biases can be reduced by leading people to consider multiple alternatives rather than focusing only on the most obvious ones, and by encouraging people to think about exactly the opposite possible outcomes than the ones they are expecting (Hirt, Kardes, &amp; Markman, 2004).\u00a0And certain educational experiences can help people to make better decisions. For instance, Lehman, Lempert, and Nisbett (1988)\u00a0found that graduate students in medicine, law, and chemistry, and particularly those in psychology, all showed significant improvement in their ability to reason correctly over the course of their graduate training.<\/p>\n<p>Another source for some optimism about the accuracy of our social cognition is that these heuristics and biases can, despite their limitations, often lead us to a broadly accurate understanding of the situations we encounter. 
Although we do have limited cognitive abilities, information, and time when making social judgments, that does not mean we cannot and do not make enough sense of our social worlds in order to function effectively in our daily lives. Indeed, some researchers, including Cosmides and Tooby (2000) and Gigerenzer (2004), have argued that these biases and heuristics have been sculpted by evolutionary forces to offer fast and frugal\u00a0<span style=\"line-height: 1.5em\">ways of reaching sound judgments about our infinitely complex social worlds enough of the time to have adaptive value. If, for example, you were asked to say which Spanish city had a larger population, Madrid or Valencia, the chances are you would quickly answer that Madrid was bigger, even if you did not know the relevant population figures. Why? Perhaps the availability heuristic and cognitive accessibility had something to do with it\u2014the chances are that most people have just heard more about Madrid in the global media over the years, and they can more readily bring these instances to mind. From there, it is a short leap to the general rule that larger cities tend to get more media coverage. So, although our journeys to\u00a0our social judgments may not always be pretty, at least we often arrive at the right destination.\u00a0<\/span><\/p>\n<h3>Social Psychology in the Public Interest<\/h3>\n<p>The Validity of Eyewitness Testimony<\/p>\n<p>One social situation in which the accuracy of our person-perception skills is vitally important is the area of eyewitness testimony (Charman &amp; Wells, 2007; Toglia, Read, Ross, &amp; Lindsay, 2007; Wells, Memon, &amp; Penrod, 2006).\u00a0Every year, thousands of individuals are charged with and often convicted of crimes based largely on eyewitness evidence.
In fact, many people who were convicted prior to the existence of forensic DNA have now been exonerated by DNA tests, and more than 75% of these people were victims of mistaken eyewitness identification (Wells, Memon, &amp; Penrod, 2006; Fisher, 2011).<\/p>\n<p>The judgments of eyewitnesses are often incorrect, and there is only a small correlation between how accurate and how confident an eyewitness is. Witnesses are frequently overconfident, and a person who claims to be absolutely certain about his or her identification is not much more likely to be accurate than someone who appears much less sure, making it almost impossible to determine whether a particular witness is accurate or not (Wells &amp; Olson, 2003).<\/p>\n<p>To accurately remember a person or an event at a later time, we must be able to accurately see and store the information in the first place, keep it in memory over time, and then accurately retrieve it later. But the social situation can influence any of these processes, causing errors and biases.<\/p>\n<p>In terms of initial encoding of the memory, crimes normally occur quickly, often in situations that are accompanied by a lot of stress, distraction, and arousal. Typically, the eyewitness gets only a brief glimpse of the person committing the crime, and this may be under poor lighting conditions and from far away. And the eyewitness may not always focus on the most important aspects of the scene. Weapons are highly salient, and if a weapon is present during the crime, the eyewitness may focus on the weapon, which would draw his or her attention away from the individual committing the crime (Steblay, 1997).\u00a0In one relevant study, Loftus, Loftus, and Messo (1987)\u00a0showed people slides of a customer walking up to a bank teller and pulling out either a pistol or a checkbook. 
By tracking eye movements, the researchers determined that people were more likely to look at the gun than at the checkbook and that this reduced their ability to accurately identify the criminal in a lineup that was given later.<\/p>\n<p>People may be particularly inaccurate when they are asked to identify members of a race other than their own (Brigham, Bennett, Meissner, &amp; Mitchell, 2007).\u00a0In one field study, for example, Meissner and Brigham (2001)\u00a0sent European-American, African-American, and Hispanic students into convenience stores in El Paso, Texas. Each of the students made a purchase, and the researchers came in later to ask the clerks to identify photos of the shoppers. Results showed that the clerks demonstrated the own-race bias: they were all more accurate at identifying customers belonging to their own racial or ethnic group, which may be more salient to them, than they were at identifying people from other groups. There seems to be some truth to the adage that \u201cThey all look alike\u201d\u2014at least if an individual is looking at someone who is not of his or her own race.<\/p>\n<div id=\"attachment_930\" style=\"width: 410px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-930\" class=\"wp-image-930\" alt=\"people\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165243\/people1-1024x256.jpg\" height=\"100\" width=\"400\" \/><\/p>\n<p id=\"caption-attachment-930\" class=\"wp-caption-text\">Figure 2.13 One source of error in eyewitness testimony is the relative difficulty of accurately identifying people who are not of one\u2019s own race.<br \/> Source: Ladakh, Hemis Shukpachan by Dietmar Temps (<a href=\"https:\/\/www.flickr.com\/photos\/deepblue66\/10607432526\">https:\/\/www.flickr.com\/photos\/deepblue66\/10607432526<\/a>) used under CC BY-NC-SA 2.0 license (<a 
href=\"https:\/\/creativecommons.org\/licenses\/by-nc-sa\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by-nc-sa\/2.0\/<\/a>). Group Portrait by John Ragai (<a href=\"https:\/\/www.flickr.com\/photos\/johnragai\/13167551744\">https:\/\/www.flickr.com\/photos\/johnragai\/13167551744<\/a>) used under CC BY 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by\/2.0\/<\/a>). College students by Adam S (<a href=\"https:\/\/www.flickr.com\/photos\/111963716@N06\/11529206136\">https:\/\/www.flickr.com\/photos\/111963716@N06\/11529206136<\/a>) used under CC BY 2.0 (<a href=\"https:\/\/creativecommons.org\/licenses\/by\/2.0\/\">https:\/\/creativecommons.org\/licenses\/by\/2.0\/<\/a>)<\/p>\n<\/div>\n<p>Even if information gets encoded properly, memories may become distorted over time. For one thing, people might discuss what they saw with other people, or they might read information relating to it from other bystanders or in the media. Such postevent information can distort the original memories such that the witnesses are no longer sure what the real information is and what was provided later. The problem is that the new, inaccurate information is highly cognitively accessible, whereas the older information is much less so. The reconstructive memory bias suggests that the memory may shift over time to fit the individual&#8217;s current beliefs about the crime. 
Even describing a face makes it more difficult to recognize the face later (Dodson, Johnson, &amp; Schooler, 1997).<\/p>\n<p>In an experiment by Loftus and Palmer (1974),\u00a0participants viewed a film of a traffic accident and then, according to random assignment to experimental conditions, answered one of three questions:<\/p>\n<ol>\n<li>About how fast were the cars going when they hit each other?<\/li>\n<li>About how fast were the cars going when they smashed each other?<\/li>\n<li>About how fast were the cars going when they contacted each other?<\/li>\n<\/ol>\n<p>As you can see in Figure 2.14, &#8220;Reconstructive Memory,&#8221; although all the participants saw the same accident, their estimates of the speed of the cars varied by condition. People who had seen the \u201csmashed\u201d question estimated the highest average speed, and those who had seen the \u201ccontacted\u201d question estimated the lowest.<\/p>\n<div id=\"attachment_2664\" style=\"width: 410px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/opentextbc.ca\/socialpsychology\/wp-content\/uploads\/sites\/21\/2014\/09\/Figure-2-14.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2664\" src=\"https:\/\/s3-us-west-2.amazonaws.com\/courses-images\/wp-content\/uploads\/sites\/457\/2016\/08\/09165245\/Figure-2-14.png\" alt=\"Reconstructive Memory\" class=\"wp-image-2664\" height=\"110\" width=\"400\" \/><\/a><\/p>\n<p id=\"caption-attachment-2664\" class=\"wp-caption-text\">Figure 2.14 Reconstructive Memory<\/p>\n<\/div>\n<p>Participants viewed a film of a traffic accident and then answered a question about the accident. According to random assignment, the verb in the question was either \u201chit,\u201d \u201csmashed,\u201d or \u201ccontacted.\u201d The wording of the question influenced the participants\u2019 memory of the accident.
Data are from Loftus and Palmer (1974).<\/p>\n<p>The situation is particularly problematic when the eyewitnesses are children, because research has found that children are more likely to make incorrect identifications than are adults (Pozzulo &amp; Lindsay, 1998)\u00a0and are also subject to the own-race identification bias (Pezdek, Blandon-Gitlin, &amp; Moore, 2003).\u00a0In many cases, when sex abuse charges have been filed against babysitters, teachers, religious officials, and family members, the children are the only source of evidence. The possibility that children are not accurately remembering the events that have occurred to them creates substantial problems for the legal system.<\/p>\n<p>Another setting in which eyewitnesses may be inaccurate is when they try to identify suspects from mug shots or lineups. A lineup generally includes the suspect and five to seven other innocent people (the fillers), and the eyewitness must pick out the true perpetrator. The problem is that eyewitnesses typically feel pressured to pick a suspect out of the lineup, which increases the likelihood that they will mistakenly pick someone (rather than no one) as the suspect.<\/p>\n<p>Research has attempted to better understand how people remember and potentially misremember the scenes of and people involved in crimes and to attempt to improve how the legal system makes use of eyewitness testimony. In many states, efforts are being made to better inform judges, juries, and lawyers about how inaccurate eyewitness testimony can be. Guidelines have also been proposed to help ensure that child witnesses are questioned in a nonbiasing way (Poole &amp; Lamb, 1998).\u00a0Steps can also be taken to ensure that lineups yield more accurate eyewitness identifications. 
Lineups are more fair when the fillers resemble the suspect, when the interviewer makes it clear that the suspect might or might not be present (Steblay, Dysart, Fulero, &amp; Lindsay, 2001),\u00a0and when the eyewitness has not been shown the same pictures in a mug-shot book prior to the lineup decision. And several recent studies have found that witnesses who make accurate identifications from a lineup reach their decision faster than do witnesses who make mistaken identifications, suggesting that authorities must take into consideration not only the response but how fast it is given (Dunning &amp; Perretta, 2002).<\/p>\n<p>In addition to distorting our memories for events that have actually occurred, misinformation may lead us to falsely remember information that never occurred. Loftus and her colleagues asked parents to provide them with descriptions of events that did happen (e.g., moving to a new house) and did not happen (e.g., being lost in a shopping mall) to their children. Then (without telling the children which events were real or made up) the researchers asked the children to imagine both types of events. The children were instructed to \u201cthink really hard\u201d about whether the events had occurred (Ceci, Huffman, Smith, &amp; Loftus, 1994).\u00a0More than half of the children generated stories regarding at least one of the made-up events, and they remained insistent that the events did in fact occur even when told by the researcher that they could not possibly have occurred (Loftus &amp; Pickrell, 1995).\u00a0Even college students are susceptible to manipulations that make events that did not actually occur seem as if they did (Mazzoni, Loftus, &amp; Kirsch, 2001).<\/p>\n<p>The ease with which memories can be created or implanted is particularly problematic when the events to be recalled have important consequences. 
Therapists often argue that patients may repress memories of traumatic events they experienced as children, such as childhood sexual abuse, and then recover the events years later as the therapist leads them to recall the information\u2014for instance, by using dream interpretation and hypnosis (Brown, Scheflin, &amp; Hammond, 1998).<\/p>\n<p>But other researchers argue that painful memories such as sexual abuse are usually very well remembered, that few memories are actually repressed, and that even if they are, it is virtually impossible for patients to accurately retrieve them years later (McNally, Bryant, &amp; Ehlers, 2003; Pope, Poliakoff, Parker, Boynes, &amp; Hudson, 2007).\u00a0These researchers have argued that the procedures used by the therapists to \u201cretrieve\u201d the memories are more likely to actually implant false memories, leading the patients to erroneously recall events that did not actually occur. Because hundreds of people have been accused, and even imprisoned, on the basis of claims about \u201crecovered memory\u201d of child sexual abuse, the accuracy of these memories has important societal implications. Many psychologists now believe that most of these claims of recovered memories are due to implanted, rather than real, memories (Loftus &amp; Ketcham, 1994).<\/p>\n<p>Taken together, then, the problems of eyewitness testimony represent another example of how social cognition\u2014including the processes that we use to size up and remember other people\u2014may be influenced, sometimes in a way that creates inaccurate perceptions, by the operation of salience, cognitive accessibility, and other information-processing biases.<\/p>\n<p>\u00a0<\/p>\n<div class=\"bcc-box bcc-success\">\n<h3>Key Takeaways<\/h3>\n<ul>\n<li>We use our schemas and attitudes to help us judge and respond to others. 
In many cases, this is appropriate, but our expectations can also lead to biases in our judgments of ourselves and others.<\/li>\n<li>A good part of our social cognition is spontaneous or automatic, operating without much thought or effort. On the other hand, when we have the time and the motivation to think about things carefully, we may engage in thoughtful, controlled cognition.<\/li>\n<li>Which expectations we use to judge others is based on both the situational salience of the things we are judging and the cognitive accessibility of our own schemas and attitudes.<\/li>\n<li>Variations in the accessibility of schemas lead to biases such as the availability heuristic, the representativeness heuristic, the false consensus bias, biases caused by counterfactual thinking, and those related to overconfidence.<\/li>\n<li>The potential biases that are the result of everyday social cognition can have important consequences, not only for us in our everyday lives but also for people who make important decisions affecting many other people. Although biases are common, they are not impossible to control, and psychologists and other scientists are working to help people make better decisions.<\/li>\n<li>The operation of cognitive biases, including the potential for new information to distort information already in memory, can help explain the tendency for eyewitnesses to be overconfident and frequently inaccurate in their recollections of what occurred at crime scenes.<\/li>\n<\/ul>\n<\/div>\n<p>\u00a0<\/p>\n<div class=\"bcc-box bcc-info\">\n<h3>Exercises and Critical Thinking<\/h3>\n<ol>\n<li>Give an example of a time when you may have used one of the cognitive heuristics or fallen prey to one of the biases discussed in this chapter. What factors (e.g., availability, salience) caused the error, and what was the outcome of your use of the shortcut or heuristic? What do you see as the general advantages and disadvantages of using this bias in your everyday life?
Describe one possible strategy you could use to reduce the potentially harmful effects of this bias in your life.<\/li>\n<li>Go to the website <a href=\"http:\/\/thehothand.blogspot.com\">http:\/\/thehothand.blogspot.com<\/a>, which analyzes the extent to which people accurately perceive \u201cstreakiness\u201d in sports. Based on the information provided on this site, as well as that in this chapter, in what ways might our sports perceptions be influenced by our expectations and the use of cognitive heuristics and biases?<\/li>\n<li>Different cognitive heuristics and biases often operate together to influence our social cognition in particular situations. Describe a situation where you feel that two or more biases were affecting your judgment. How did they interact? What combined effects on your social cognition did they have? Which of the heuristics and biases outlined in this chapter do you think might be particularly likely to happen together in social situations and why?<\/li>\n<\/ol>\n<\/div>\n<div class=\"textbox shaded\">\n<h3>References<\/h3>\n<p>Aarts, H., &amp; Dijksterhuis, A. (1999). How often did I do it? Experienced ease of retrieval and frequency estimates of past behavior.\u00a0<i>Acta Psychologica, 103<\/i>(1\u20132), 77\u201389.<\/p>\n<p>Abbes, M. B. (2012). Does overconfidence explain volatility during the global financial crisis?\u00a0<em>Transition Studies Review, 19(3)<\/em>, 291-312.<\/p>\n<p>Adams, P. A., &amp; Adams, J. K. (1960). Confidence in the recognition and reproduction of words difficult to spell.\u00a0<i>American Journal of Psychology, 73<\/i>, 544\u2013552.<\/p>\n<p>Ariely, D. (2008).\u00a0<i>Predictably irrational: The hidden forces that shape our decisions<\/i>. New York: Harper Perennial.<\/p>\n<p>Ariely, D., Loewenstein, D., \u00a0&amp; Prelec, D. (2003). Coherent arbitrariness: Stable demand curves without stable preferences. <em>Quarterly Journal of Economics 118 (1),<\/em>\u00a073\u2013106.<\/p>\n<p>Bargh, J. 
A., Chen, M., &amp; Burrows, L. (1996). Automaticity of social behavior: Direct effects of trait construct and stereotype activation on action.\u00a0<i>Journal of Personality and Social Psychology, 71<\/i>(2), 230\u2013244.<\/p>\n<p>Brewer, M. B. (1988). A dual process model of impression formation. In T. K. Srull &amp; R. S. Wyer (Eds.),\u00a0<i>Advances in social cognition<\/i>\u00a0(Vol. 1, pp. 1\u201336). Hillsdale, NJ: Erlbaum.<\/p>\n<p>Brigham, J. C., Bennett, L. B., Meissner, C. A., &amp; Mitchell, T. L. (Eds.). (2007).\u00a0<i>The influence of race on eyewitness memory<\/i>. Mahwah, NJ: Lawrence Erlbaum Associates Publishers.<\/p>\n<p>Brown, D., Scheflin, A. W., &amp; Hammond, D. C. (1998).\u00a0<i>Memory, trauma treatment, and the law<\/i>. New York, NY: Norton.<\/p>\n<p>Buehler, R., Griffin, D., &amp; Peetz, J. (2010). The planning fallacy: Cognitive, motivational, and social origins. In M. P. Zanna, J. M. Olson (Eds.) ,\u00a0<i>Advances in experimental social psychology, Vol 43<\/i>\u00a0(pp. 1-62). San Diego, CA US: Academic Press.\u00a0doi:10.1016\/S0065-2601(10)43001-4<\/p>\n<p>Byrne, R. M. J., &amp; McEleney, A. (2000). Counterfactual thinking about actions and failures to act.\u00a0<i>Journal of Experimental Psychology: Learning, Memory, and Cognition, 26<\/i>(5), 1318\u20131331.<\/p>\n<p>Ceci, S. J., Huffman, M. L. C., Smith, E., &amp; Loftus, E. F. (1994). Repeatedly thinking about a non-event: Source misattributions among preschoolers.\u00a0<i>Consciousness and Cognition: An International Journal, 3<\/i>(3\u20134), 388\u2013407.<\/p>\n<p>Chambers, J. R. (2008). Explaining false uniqueness: Why we are both better and worse than others.\u00a0<i>Social and Personality Psychology Compass, 2<\/i>(2), 878\u2013894.<\/p>\n<p>Chang, E. C., Asakawa, K., &amp; Sanna, L. J. (2001). 
Cultural variations in optimistic and pessimistic bias: Do Easterners really expect the worst and Westerners really expect the best when predicting future life events?\u00a0<i>Journal of Personality and Social Psychology<\/i>,\u00a0<i>81<\/i>(3), 476\u2013491. doi:10.1037\/0022-3514.81.3.476<\/p>\n<p>Charman, S. D., &amp; Wells, G. L. (2007). Eyewitness lineups: Is the appearance-changes instruction a good idea?\u00a0<i>Law and Human Behavior, 31<\/i>(1), 3\u201322.<\/p>\n<p>Chen, G., Kim, K. A., Nofsinger, J. R., &amp; Rui, O. M. (2007). Trading performance, disposition effect, overconfidence, representativeness bias, and experience of emerging market investors.\u00a0<i>Journal of Behavioral Decision Making<\/i>,\u00a0<i>20<\/i>(4), 425\u2013451. doi:10.1002\/bdm.561<\/p>\n<p>Cosmides, L., &amp; Tooby, J. (2000). Evolutionary psychology and the emotions. In M. Lewis &amp; J. M. Haviland-Jones (Eds.),\u00a0<em>Handbook of emotions<\/em>\u00a0(2nd ed., pp. 91\u2013115). New York, NY: The Guilford Press.<\/p>\n<p>Dijksterhuis, A., Bos, M. W., Nordgren, L. F., &amp; van Baaren, R. B. (2006). On making the right choice: The deliberation-without-attention effect.\u00a0<i>Science, 311<\/i>(5763), 1005\u20131007.<\/p>\n<p>Dodson, C. S., Johnson, M. K., &amp; Schooler, J. W. (1997). The verbal overshadowing effect: Why descriptions impair face recognition.\u00a0<i>Memory &amp; Cognition, 25<\/i>(2), 129\u2013139.<\/p>\n<p>Doob, A. N., &amp; Macdonald, G. E. (1979). Television viewing and fear of victimization: Is the relationship causal?\u00a0<i>Journal of Personality and Social Psychology, 37<\/i>(2), 170\u2013179.<\/p>\n<p>Dunning, D., &amp; Perretta, S. (2002). Automaticity and eyewitness accuracy: A 10- to 12-second rule for distinguishing accurate from inaccurate positive identifications.\u00a0<i>Journal of Applied Psychology, 87<\/i>(5), 951\u2013962.<\/p>\n<p>Dunning, D., Griffin, D. W., Milojkovic, J. D., &amp; Ross, L. (1990). 
The overconfidence effect in social prediction.\u00a0<i>Journal of Personality and Social Psychology, 58<\/i>(4), 568\u2013581.<\/p>\n<p>Dunning, D., Johnson, K., Ehrlinger, J., &amp; Kruger, J. (2003). Why people fail to recognize their own incompetence.\u00a0<i>Current Directions in Psychological Science, 12<\/i>(3), 83\u201387.<\/p>\n<p>Egan, D., Merkle, C., &amp; Weber, M. (in press). Second-order beliefs and the individual investor.\u00a0<em>Journal of Economic Behavior &amp; Organization.<\/em><\/p>\n<p>Ehrlinger, J., Gilovich, T. D., &amp; Ross, L. (2005). Peering into the bias blind spot: People\u2019s assessments of bias in themselves and others.\u00a0<em>Personality and Social Psychology Bulletin, 31,<\/em>\u00a01\u201313.<\/p>\n<p>Ferguson, M. J., &amp; Bargh, J. A. (2003). The constructive nature of automatic evaluation. In J. Musch &amp; K. C. Klauer (Eds.),\u00a0<i>The psychology of evaluation: Affective processes in cognition and emotion<\/i>\u00a0(pp. 169\u2013188). Mahwah, NJ: Lawrence Erlbaum Associates Publishers.<\/p>\n<p>Ferguson, M. J., Hassin, R., &amp; Bargh, J. A. (2008). Implicit motivation: Past, present, and future. In J. Y. Shah &amp; W. L. Gardner (Eds.),\u00a0<i>Handbook of motivation science<\/i>\u00a0(pp. 150\u2013166). New York, NY: Guilford Press.<\/p>\n<p>Fisher, R. P. (2011). Editor\u2019s introduction: Special issue on psychology and law.\u00a0<i>Current Directions in Psychological Science, 20<\/i>, 4. doi:10.1177\/0963721410397654<\/p>\n<p>Forgeard, M. C., &amp; Seligman, M. P. (2012). Seeing the glass half full: A review of the causes and consequences of optimism.\u00a0<i>Pratiques Psychologiques<\/i>,\u00a0<i>18<\/i>(2), 107\u2013120. doi:10.1016\/j.prps.2012.02.002<\/p>\n<p>Gigerenzer, G. (2004). Fast and frugal heuristics: The tools of bounded rationality. In D. J. Koehler &amp; N. Harvey (Eds.),\u00a0<em>Blackwell handbook of judgment and decision making<\/em>\u00a0(pp. 62\u201388). 
Malden, MA: Blackwell Publishing.<\/p>\n<p>Gigerenzer, G. (2006). Out of the frying pan and into the fire: Behavioral reactions to terrorist attacks.\u00a0<em>Risk Analysis, 26,<\/em>\u00a0347\u2013351.<\/p>\n<p>Gilovich, T., Griffin, D., &amp; Kahneman, D. (Eds.). (2002).\u00a0<i>Heuristics and biases: The psychology of intuitive judgment<\/i>. New York, NY: Cambridge University Press.<\/p>\n<p>Haddock, G., Rothman, A. J., Reber, R., &amp; Schwarz, N. (1999). Forming judgments of attitude certainty, intensity, and importance: The role of subjective experiences.\u00a0<i>Personality and Social Psychology Bulletin, 25<\/i>, 771\u2013782.<\/p>\n<p>Heine, S. J., Lehman, D. R., Markus, H. R., &amp; Kitayama, S. (1999). Is there a universal need for positive self-regard?\u00a0<em>Psychological Review, 106<\/em>(4), 766\u2013794. doi:10.1037\/0033-295X.106.4.766<\/p>\n<p>Hilton, D. J. (2001). The psychology of financial decision-making: Applications to trading, dealing, and investment analysis.\u00a0<i>Journal of Behavioral Finance, 2<\/i>, 37\u201353. doi:10.1207\/S15327760JPFM0201_4<\/p>\n<p>Hirt, E. R., Kardes, F. R., &amp; Markman, K. D. (2004). Activating a mental simulation mind-set through generation of alternatives: Implications for debiasing in related and unrelated domains.\u00a0<i>Journal of Experimental Social Psychology, 40<\/i>(3), 374\u2013383.<\/p>\n<p>Hsee, C. K., Hastie, R., &amp; Chen, J. (2008). Hedonomics: Bridging decision research with happiness research.\u00a0<em>Perspectives on Psychological Science, 3<\/em>(3), 224\u2013243. doi:10.1111\/j.1745-6924.2008.00076.x<\/p>\n<p>Joireman, J., Barnes Truelove, H., &amp; Duell, B. (2010). Effect of outdoor temperature, heat primes and anchoring on belief in global warming.\u00a0<i>Journal of Environmental Psychology, 30<\/i>(4), 358\u2013367.<\/p>\n<p>Kahneman, D. (2011).\u00a0<i>Thinking, fast and slow.<\/i>\u00a0New York: Farrar, Straus and Giroux.<\/p>\n<p>Kassam, K. S., Koslov, K., &amp; Mendes, W. B. (2009). 
Decisions under distress: Stress profiles influence anchoring and adjustment.\u00a0<i>Psychological Science, 20<\/i>(11), 1394\u20131399.<\/p>\n<p>Krueger, J., &amp; Clement, R. W. (1994). The truly false consensus effect: An ineradicable and egocentric bias in social perception.\u00a0<i>Journal of Personality and Social Psychology, 67<\/i>(4), 596\u2013610.<\/p>\n<p>Kruger, J., &amp; Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one\u2019s own incompetence lead to inflated self-assessments.\u00a0<i>Journal of Personality and Social Psychology, 77<\/i>(6), 1121\u20131134.<\/p>\n<p>Kruglanski, A. W., &amp; Freund, T. (1983). The freezing and unfreezing of lay inferences: Effects on impressional primacy, ethnic stereotyping, and numerical anchoring.\u00a0<i>Journal of Experimental Social Psychology, 19,<\/i>\u00a0448\u2013468.<\/p>\n<p>Lehman, D. R., Lempert, R. O., &amp; Nisbett, R. E. (1988). The effects of graduate training on reasoning: Formal discipline and thinking about everyday-life events.\u00a0<i>American Psychologist, 43<\/i>(6), 431\u2013442.<\/p>\n<p>Loewenstein, G. F., Weber, E. U., Hsee, C. K., &amp; Welch, N. (2001). Risk as feelings.\u00a0<i>Psychological Bulletin, 127<\/i>(2), 267\u2013286.<\/p>\n<p>Loftus, E. F., &amp; Ketcham, K. (1994).\u00a0<i>The myth of repressed memory: False memories and allegations of sexual abuse<\/i>\u00a0(1st ed.). New York, NY: St. Martin\u2019s Press.<\/p>\n<p>Loftus, E. F., &amp; Palmer, J. C. (1974). Reconstruction of automobile destruction: An example of the interaction between language and memory.\u00a0<i>Journal of Verbal Learning &amp; Verbal Behavior, 13<\/i>(5), 585\u2013589.<\/p>\n<p>Loftus, E. F., &amp; Pickrell, J. E. (1995). The formation of false memories.\u00a0<i>Psychiatric Annals, 25<\/i>(12), 720\u2013725.<\/p>\n<p>Loftus, E. F., Loftus, G. R., &amp; Messo, J. (1987). 
Some facts about \u201cweapon focus.\u201d\u00a0<i>Law and Human Behavior, 11<\/i>(1), 55\u201362.<\/p>\n<p>Mazzoni, G. A. L., Loftus, E. F., &amp; Kirsch, I. (2001). Changing beliefs about implausible autobiographical events: A little plausibility goes a long way.\u00a0<i>Journal of Experimental Psychology: Applied, 7<\/i>(1), 51\u201359.<\/p>\n<p>McArthur, L. Z., &amp; Post, D. L. (1977). Figural emphasis and person perception.\u00a0<i>Journal of Experimental Social Psychology, 13<\/i>(6), 520\u2013535.<\/p>\n<p>McNally, R. J., Bryant, R. A., &amp; Ehlers, A. (2003). Does early psychological intervention promote recovery from posttraumatic stress?\u00a0<i>Psychological Science in the Public Interest, 4<\/i>(2), 45\u201379.<\/p>\n<p>Medvec, V. H., Madey, S. F., &amp; Gilovich, T. (1995). When less is more: Counterfactual thinking and satisfaction among Olympic medalists.\u00a0<i>Journal of Personality and Social Psychology, 69<\/i>(4), 603\u2013610.<\/p>\n<p>Meissner, C. A., &amp; Brigham, J. C. (2001). Thirty years of investigating the own-race bias in memory for faces: A meta-analytic review.\u00a0<i>Psychology, Public Policy, and Law, 7<\/i>(1), 3\u201335.<\/p>\n<p>Miller, D. T., Turnbull, W., &amp; McFarland, C. (1988). Particularistic and universalistic evaluation in the social comparison process.\u00a0<i>Journal of Personality and Social Psychology, 55<\/i>, 908\u2013917.<\/p>\n<p>Moore, M. T., &amp; Fresco, D. M. (2012). Depressive realism: A meta-analytic review.\u00a0<i>Clinical Psychology Review<\/i>,\u00a0<i>32<\/i>(6), 496\u2013509. doi:10.1016\/j.cpr.2012.05.004<\/p>\n<p>Oskamp, S. (1965). Overconfidence in case-study judgments.\u00a0<em>Journal of Consulting Psychology, 29<\/em>(3), 261\u2013265.<\/p>\n<p>Pezdek, K., Blandon-Gitlin, I., &amp; Moore, C. (2003). Children\u2019s face recognition memory: More evidence for the cross-race effect.\u00a0<i>Journal of Applied Psychology, 88<\/i>(4), 760\u2013763.<\/p>\n<p>Poole, D. A., &amp; Lamb, M. E. 
(1998).\u00a0<i>The development of interview protocols<\/i>. Washington, DC: American Psychological Association.<\/p>\n<p>Pope, H. G., Jr., Poliakoff, M. B., Parker, M. P., Boynes, M., &amp; Hudson, J. I. (2007). Is dissociative amnesia a culture-bound syndrome? Findings from a survey of historical literature.\u00a0<i>Psychological Medicine: A Journal of Research in Psychiatry and the Allied Sciences, 37<\/i>(2), 225\u2013233.<\/p>\n<p>Pozzulo, J. D., &amp; Lindsay, R. C. L. (1998). Identification accuracy of children versus adults: A meta-analysis.\u00a0<i>Law and Human Behavior, 22<\/i>(5), 549\u2013570.<\/p>\n<p>Reber, R., Winkielman, P., &amp; Schwarz, N. (1998). Effects of perceptual fluency on affective judgments.\u00a0<i>Psychological Science, 9<\/i>(1), 45\u201348.<\/p>\n<p>Roese, N. J. (1997). Counterfactual thinking.\u00a0<i>Psychological Bulletin, 121<\/i>(1), 133\u2013148.<\/p>\n<p>Ross, L., Greene, D., &amp; House, P. (1977). The false consensus effect: An egocentric bias in social perception and attribution processes.\u00a0<i>Journal of Experimental Social Psychology, 13<\/i>(3), 279\u2013301.<\/p>\n<p>Ross, M., &amp; Sicoly, F. (1979). Egocentric biases in availability and attribution.\u00a0<i>Journal of Personality and Social Psychology, 37<\/i>(3), 322\u2013336.<\/p>\n<p>Schwarz, N., &amp; Vaughn, L. A. (Eds.). (2002).\u00a0<i>The availability heuristic revisited: Ease of recall and content of recall as distinct sources of information<\/i>. New York, NY: Cambridge University Press.<\/p>\n<p>Schwarz, N., Bless, H., Strack, F., Klumpp, G., Rittenauer-Schatka, H., &amp; Simons, A. (1991). Ease of retrieval as information: Another look at the availability heuristic.\u00a0<i>Journal of Personality and Social Psychology, 61,<\/i>\u00a0195\u2013202.<\/p>\n<p>Sharot, T. (2011).\u00a0<em>The optimism bias: A tour of the irrationally positive brain.\u00a0<\/em>New York: Pantheon Books.<\/p>\n<p>Sloman, S. A. (Ed.). 
(2002).\u00a0<i>Two systems of reasoning<\/i>. New York, NY: Cambridge University Press.<\/p>\n<p>Slovic, P. (Ed.). (2000).\u00a0<i>The perception of risk<\/i>. London, England: Earthscan Publications.<\/p>\n<p>Stanovich, K. E., &amp; West, R. F. (Eds.). (2002).\u00a0<i>Individual differences in reasoning: Implications for the rationality debate?<\/i>\u00a0New York, NY: Cambridge University Press.<\/p>\n<p>Steblay, N. M. (1997). Social influence in eyewitness recall: A meta-analytic review of lineup instruction effects.\u00a0<i>Law and Human Behavior, 21<\/i>(3), 283\u2013297.<\/p>\n<p>Steblay, N., Dysart, J., Fulero, S., &amp; Lindsay, R. C. L. (2001). Eyewitness accuracy rates in sequential and simultaneous lineup presentations: A meta-analytic comparison.\u00a0<i>Law and Human Behavior, 25<\/i>(5), 459\u2013473.<\/p>\n<p>Taylor, S. E., &amp; Fiske, S. T. (1978). Salience, attention and attribution: Top of the head phenomena.\u00a0<i>Advances in Experimental Social Psychology, 11,<\/i>\u00a0249\u2013288.<\/p>\n<p>Toglia, M. P., Read, J. D., Ross, D. F., &amp; Lindsay, R. C. L. (Eds.). (2007).\u00a0<i>The handbook of eyewitness psychology<\/i>\u00a0(Vols. 1 &amp; 2). Mahwah, NJ: Lawrence Erlbaum Associates Publishers.<\/p>\n<p>Tversky, A., &amp; Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability.\u00a0<i>Cognitive Psychology, 5<\/i>, 207\u2013232.<\/p>\n<p>Tversky, A., &amp; Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases.\u00a0<i>Science, 185<\/i>(4157), 1124\u20131131.<\/p>\n<p>Wells, G. L., &amp; Olson, E. A. (2003). Eyewitness testimony.\u00a0<i>Annual Review of Psychology, 54<\/i>, 277\u2013295.<\/p>\n<p>Wells, G. L., Memon, A., &amp; Penrod, S. D. (2006). Eyewitness evidence: Improving its probative value.\u00a0<i>Psychological Science in the Public Interest, 7<\/i>(2), 45\u201375.<\/p>\n<p>West, R. F., Meserve, R. J., &amp; Stanovich, K. E. (2012). 
Cognitive sophistication does not attenuate the bias blind spot.\u00a0<i>Journal of Personality and Social Psychology<\/i>,\u00a0<i>103<\/i>(3), 506\u2013519. doi:10.1037\/a0028857<\/p>\n<p>Willis, J., &amp; Todorov, A. (2006). First impressions: Making up your mind after a 100-ms exposure to a face.\u00a0<i>Psychological Science, 17<\/i>(7), 592\u2013598.<\/p>\n<p>Winkielman, P., &amp; Cacioppo, J. T. (2001). Mind at ease puts a smile on the face: Psychophysiological evidence that processing facilitation elicits positive affect.\u00a0<i>Journal of Personality and Social Psychology, 81<\/i>(6), 989\u20131000.<\/p>\n<p>Winkielman, P., Schwarz, N., &amp; Nowak, A. (Eds.). (2002).\u00a0<i>Affect and processing dynamics: Perceptual fluency enhances evaluations<\/i>. Amsterdam, Netherlands: John Benjamins Publishing Company.<\/p>\n<\/div>\n\n\t\t\t <section class=\"citations-section\" role=\"contentinfo\">\n\t\t\t <h3>Candela Citations<\/h3>\n\t\t\t\t\t <div>\n\t\t\t\t\t\t <div id=\"citation-list-60\">\n\t\t\t\t\t\t\t <div class=\"licensing\"><div class=\"license-attribution-dropdown-subheading\">CC licensed content, Shared previously<\/div><ul class=\"citation-list\"><li>Principles of Social Psychology - 1st International Edition. <strong>Authored by<\/strong>: Rajiv Jhangiani, Hammond Tarry, and Charles Stangor. <strong>Provided by<\/strong>: BC Campus OpenEd. <strong>Located at<\/strong>: <a target=\"_blank\" href=\"https:\/\/open.bccampus.ca\/find-open-textbooks\/?uuid=66c0cf64-c485-442c-8183-de75151f13f5&#038;contributor=&#038;keyword=&#038;subject=\">https:\/\/open.bccampus.ca\/find-open-textbooks\/?uuid=66c0cf64-c485-442c-8183-de75151f13f5&#038;contributor=&#038;keyword=&#038;subject=<\/a>. 
<strong>License<\/strong>: <em><a target=\"_blank\" rel=\"license\" href=\"https:\/\/creativecommons.org\/licenses\/by-nc-sa\/4.0\/\">CC BY-NC-SA: Attribution-NonCommercial-ShareAlike<\/a><\/em><\/li><\/ul><\/div>\n\t\t\t\t\t\t <\/div>\n\t\t\t\t\t <\/div>\n\t\t\t <\/section>","protected":false},"author":26,"menu_order":3,"template":"","meta":{"_candela_citation":"[{\"type\":\"cc\",\"description\":\"Principles of Social Psychology - 1st International Edition\",\"author\":\"Rajiv Jhangiani, Hammond Tarry, and Charles Stangor\",\"organization\":\"BC Campus OpenEd\",\"url\":\"https:\/\/open.bccampus.ca\/find-open-textbooks\/?uuid=66c0cf64-c485-442c-8183-de75151f13f5&contributor=&keyword=&subject=\",\"project\":\"\",\"license\":\"cc-by-nc-sa\",\"license_terms\":\"\"}]","CANDELA_OUTCOMES_GUID":"","pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":[],"pb_section_license":""},"chapter-type":[],"contributor":[],"license":[],"class_list":["post-60","chapter","type-chapter","status-publish","hentry"],"part":43,"_links":{"self":[{"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbooks\/v2\/chapters\/60","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/wp\/v2\/users\/26"}],"version-history":[{"count":1,"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbooks\/v2\/chapters\/60\/revisions"}],"predecessor-version":[{"id":291,"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbooks\/v2\/chapters\/60\/revisions\/291"}],"part":[{"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbook
s\/v2\/parts\/43"}],"metadata":[{"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbooks\/v2\/chapters\/60\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/wp\/v2\/media?parent=60"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/pressbooks\/v2\/chapter-type?post=60"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/wp\/v2\/contributor?post=60"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/suny-hccc-social-psychology\/wp-json\/wp\/v2\/license?post=60"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}