Bias

From Lesswrongwiki
 
{{wikilink|Bias|List of cognitive biases}}
 
'''Bias''' or '''cognitive bias''' is a systematic deviation from [[rationality]] committed by our cognition. Biases are specific, predictable error patterns in the human mind.<ref>POHL, Rüdiger (org.). (2005) "Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory". Psychology Press. p. 2.</ref> The [[heuristics and biases]] program in cognitive psychology has documented hundreds of reproducible errors - often big errors. This continues to be a highly active area of investigation in cognitive psychology.
  
In our evolutionary past, for a cognitive algorithm to count as a satisfactory solution to a given problem, it wasn't enough that it solved the problem correctly. The solution also had to satisfy a large number of constraints, such as time and energy costs. The algorithm didn't need to be perfect, only good enough to guarantee the survival and reproduction of the individual:
<blockquote>“What selective pressures impact on decision mechanisms? Foremost is selection for making an appropriate decision in the given domain. This domain-specific pressure does not imply the need to make the best possible decision, but rather one that is good enough (a satisficing choice, as Herbert Simon, 1955, put it) and, on average, better than those of an individual’s competitors, given the costs and benefits involved.”<ref>BUSS, David (org.). (2005) "The Handbook of Evolutionary Psychology". Wiley, New Jersey. p. 778.</ref></blockquote>
 
Therefore, the human brain performs operations that solve cognitive tasks through "shortcuts", which work well in some cases but fail in others. Since the cognitive modules that perform these tasks are universal in the human species, how and where those shortcuts lead to mistakes is also regular. The study of why, how and where such errors arise is the field of cognitive bias research.

Understanding cognitive biases and trying to defend against their effects has been a basic theme of [[Less Wrong]] since the days it was part of [[Overcoming Bias]].
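Simon's "satisficing choice", quoted above, can be sketched in a few lines of Python. This is only an illustrative toy model - the payoff numbers and the threshold are hypothetical, chosen purely for demonstration:

```python
# Satisficing vs. maximizing (Simon, 1955): a satisficer accepts the first
# option that clears an aspiration threshold; a maximizer inspects them all.

def satisfice(options, threshold):
    """Return (value, number of options examined) for the first
    option whose value meets the threshold."""
    for examined, value in enumerate(options, start=1):
        if value >= threshold:
            return value, examined           # good enough - stop searching
    return max(options), len(options)        # nothing cleared the bar

def maximize(options):
    """Return (best value, number of options examined)."""
    return max(options), len(options)

payoffs = [3, 7, 2, 9, 4, 8]                 # hypothetical choice values
print(satisfice(payoffs, threshold=6))       # (7, 2): stops at the 2nd option
print(maximize(payoffs))                     # (9, 6): best value, full search
```

Here the satisficer accepts a slightly worse outcome (7 instead of 9) in exchange for a third of the search cost - exactly the "good enough, given the costs and benefits" trade-off the quotation describes.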
==Starting points==
 
*Daniel Kahneman's [http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahneman-lecture.html Nobel prize acceptance speech], where he summarizes the work for which he won the prize;
*[[Wikipedia:List of cognitive biases]];
*Kahneman et al's [http://www.amazon.com/Judgment-under-Uncertainty-Heuristics-Biases/dp/0521284147 three] [http://www.amazon.com/Choices-Values-Frames-Daniel-Kahneman/dp/0521627494/ref=pd_bxgy_b_text_b edited] [http://www.amazon.com/Heuristics-Biases-Psychology-Intuitive-Judgment/dp/0521796792/ref=pd_bxgy_b_img_c volumes] of research on heuristics and biases (this is the best solid source, but requires obtaining hard-copy books, and is slower reading);
*Eliezer's introductory book chapter [http://intelligence.org/files/CognitiveBiases.pdf Cognitive biases affecting judgment of existential risks] (available online);
*Cialdini's book [http://www.amazon.com/Influence-Practice-Robert-B-Cialdini/dp/0205609996/ref=sr_1_3?ie=UTF8&s=books&qid=1239074671&sr=1-3 Influence: Science and Practice] (at once contentful and full of engaging anecdotes and cartoons, but, again, requires actually obtaining a book);
*[http://psychology.wikia.com/wiki/Category:Cognitive_biases Psychology Wiki's list of Cognitive Biases].
==Blog posts on the concept of "bias"==
*[http://www.overcomingbias.com/2006/11/what_exactly_is.html What exactly is bias?] by [[Nick Bostrom]]
*[http://www.overcomingbias.com/2006/11/to_the_barricad.html To the barricades! Against ... what exactly?] by [[Robin Hanson]]
*[http://lesswrong.com/lw/gp/whats_a_bias_again/ ...What's a bias, again?] by [[Eliezer Yudkowsky]]
*[http://www.overcomingbias.com/2006/11/the_big_four_ec.html Are The Big Four Econ Errors Biases?] by [[Robin Hanson]]
*[http://www.overcomingbias.com/2006/11/incautious_defe.html In cautious defense of bias] by [[Paul Gowder]]
*[http://www.overcomingbias.com/2006/12/seen_vs_unseen_.html Seen vs. Unseen Biases] by [[Robin Hanson]]
*[http://lesswrong.com/lw/he/knowing_about_biases_can_hurt_people/ Knowing About Biases Can Hurt People] by [[Eliezer Yudkowsky]] - Knowing about common biases doesn't help you obtain truth if you only use this knowledge to attack beliefs you don't like.
==Blog posts about known cognitive biases==
*[http://lesswrong.com/lw/hw/scope_insensitivity/ Scope Insensitivity] - The human brain can't represent large quantities: an environmental measure that will save 200,000 birds doesn't conjure anywhere near a hundred times the emotional impact and willingness-to-pay of a measure that would save 2,000 birds.
*[http://lesswrong.com/lw/hz/correspondence_bias/ Correspondence Bias], also known as the fundamental attribution error, refers to the tendency to attribute the behavior of others to intrinsic dispositions, while excusing one's own behavior as the result of circumstance.
*Confirmation bias, or [http://lesswrong.com/lw/iw/positive_bias_look_into_the_dark/ Positive Bias], is the tendency to look for evidence that confirms a hypothesis, rather than for disconfirming evidence.
*[http://lesswrong.com/lw/il/hindsight_bias/ Hindsight Bias] describes the tendency for past events to seem much more likely in hindsight than they could have been predicted beforehand.
*[http://lesswrong.com/lw/jg/planning_fallacy/ Planning Fallacy] - We tend to plan envisioning that everything will go as expected. Even assuming that such an estimate is accurate conditional on everything going as expected, things will ''not'' go as expected. As a result, we routinely see outcomes worse than the ''ex ante'' worst case scenario.
*[http://lesswrong.com/lw/ji/conjunction_fallacy/ Conjunction Fallacy] - Elementary probability theory tells us that the probability of one thing (we write P(A)) is necessarily greater than or equal to the probability of the <i>conjunction</i> of that thing <i>and</i> another thing (written P(A&B)). However, in the psychology lab, subjects' judgments do not conform to this rule. This is [http://lesswrong.com/lw/jj/conjunction_controversy_or_how_they_nail_it_down/ not an isolated artifact] of a particular study design. Debiasing [http://lesswrong.com/lw/jk/burdensome_details/ won't be as simple] as practicing specific questions; it requires certain general habits of thought.
*[http://lesswrong.com/lw/jx/we_change_our_minds_less_often_than_we_think/ We Change Our Minds Less Often Than We Think] - We all change our minds occasionally, but we don't constantly, honestly reevaluate every decision and course of action. Once you think you believe something, the chances are good that you already do, for better or worse.
*[http://lesswrong.com/lw/k3/priming_and_contamination/ Priming and Contamination] - Even slight exposure to a stimulus is enough to change the outcome of a decision or estimate. See also [http://lesswrong.com/lw/3b/never_leave_your_room/ Never Leave Your Room] by Yvain, and [http://lesswrong.com/lw/4e/cached_selves/ Cached Selves] by Salamon and Rayhawk.
*[http://lesswrong.com/lw/k4/do_we_believe_everything_were_told/ Do We Believe <i>Everything</i> We're Told?] - Some experiments on priming suggest that mere exposure to a view is enough to get one to passively accept it, at least until it is specifically rejected.
*[http://lesswrong.com/lw/ke/illusion_of_transparency_why_no_one_understands/ Illusion of Transparency] - Everyone knows what their own words mean, but experiments have confirmed that we systematically overestimate how much sense we are making to others.
*[http://lesswrong.com/lw/kf/selfanchoring/ Self-Anchoring] - Related to contamination and the illusion of transparency, we "anchor" on our own experience and underadjust when trying to understand others.
*[http://lesswrong.com/lw/lg/the_affect_heuristic/ Affect Heuristic] - Positive and negative emotional impressions exert a greater effect on many decisions than does rational analysis.
*[http://lesswrong.com/lw/lh/evaluability_and_cheap_holiday_shopping/ Evaluability] - It's difficult for humans to evaluate an option except in comparison to other options. Poor decisions result when a poor category for comparison is used. Includes an application for cheap gift-shopping.
*[http://lesswrong.com/lw/li/unbounded_scales_huge_jury_awards_futurism/ Unbounded Scales, Huge Jury Awards, and Futurism] - Without a metric for comparison, estimates of, e.g., what sorts of punitive damages should be awarded, or when some future advance will happen, vary widely simply due to the lack of a scale.
*[http://lesswrong.com/lw/lj/the_halo_effect/ The Halo Effect] - Positive qualities <i>seem</i> to correlate with each other, whether or not they ''actually'' do.
*[http://lesswrong.com/lw/m9/aschs_conformity_experiment/ Asch's Conformity Experiment] - The unanimous agreement of surrounding others can make subjects disbelieve (or at least, fail to report) what's right before their eyes. The addition of just one dissenter is enough to dramatically reduce the rates of improper conformity.
*[http://lesswrong.com/lw/my/the_allais_paradox/ The Allais Paradox] (and [http://lesswrong.com/lw/mz/zut_allais/ subsequent] [http://lesswrong.com/lw/n1/allais_malaise/ followups]) - Offered choices between gambles, people make decision-theoretically inconsistent decisions.
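The probability rule behind the conjunction fallacy above - P(A&B) can never exceed P(A) - can be checked numerically. The sketch below uses a hypothetical probability space with made-up event frequencies (the labels only echo the classic "Linda" problem):

```python
# Conjunction rule demo: for any events A and B, P(A and B) <= P(A),
# because every world where both hold is also a world where A holds.
import random

random.seed(0)

# Hypothetical probability space of 10,000 equally likely "worlds".
N = 10_000
A = {w for w in range(N) if random.random() < 0.05}   # "Linda is a bank teller"
B = {w for w in range(N) if random.random() < 0.60}   # "Linda is a feminist"

p_A = len(A) / N
p_A_and_B = len(A & B) / N

print(f"P(A) = {p_A:.4f}, P(A & B) = {p_A_and_B:.4f}")
assert (A & B) <= A        # the conjunction event is a subset of A
assert p_A_and_B <= p_A    # so its probability can never be larger
```

However the two events are defined, the inequality holds, because the conjunction is a subset of each conjunct; the fallacy is that experimental subjects rate the vivid conjunction as ''more'' probable.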
  
 
==References==
<!-- Always keep this header if there is at least one reference.
     Delete or add sections below as necessary -->
{{Reflist|2}}
 
==See also==
*[[Heuristics and biases]], [[Heuristic]]
*[[Debiasing]], [[Dangerous knowledge]]
*[[No safe defense]]

==Not to be confused with==
*[[Statistical bias]]
*[[Inductive bias]]
  
{{stub}}
 
 
[[Category:Concepts]]
[[Category:Biases]]
[[Category:Psychology]]

Latest revision as of 03:09, 19 November 2013
