
The original example, in which people are willing to make a long trip depending on the contrast, comes from Kahneman/Tversky. See: Kahneman, Daniel; Tversky, Amos: »Prospect Theory: An Analysis of Decision under Risk«, Econometrica 47 (2), March 1979.

The Availability Bias

»You see that again and again – that people have some information they can count well and they have other information much harder to count. So they make the decision based only on what they can count well. And they ignore much more important information because its quality in terms of numeracy is less – even though it’s very important in terms of reaching the right cognitive result. We [at Berkshire] would rather be roughly right than precisely wrong. In other words, if something is terribly important, we’ll guess at it rather than just make our judgment based on what happens to be easily accountable«. (Munger, Charles T.: Poor Charlie’s Almanack, Third Edition, Donning, 2008, p. 486)

The availability bias is also the reason why companies restrict their risk management mostly to financial-market risks: there, data is available en masse. For operational risks, by contrast, there is almost no data. It is not public. It would have to be laboriously scraped together from many companies, and that is expensive. So theories get built with the material that is easy to obtain.

»The medical literature shows that physicians are often prisoners of their first-hand experience: their refusal to accept even conclusive studies is legendary.« (Dawes, Robyn M.: Everyday Irrationality: How Pseudo-Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally, Westview Press, 2001, p. 102 ff.)

Confidence in the quality of one's own decisions depends solely on the number of decisions (predictions) made, regardless of how accurate or inaccurate those decisions (predictions) were. One could also call this the central »consultant problem«. See: Einhorn, Hillel J.; Hogarth, Robin M.: »Confidence in judgment: Persistence of the illusion of validity«, Psychological Review 85 (5), September 1978, pp. 395–416.

Tversky, Amos; Kahneman, Daniel: »Availability: A heuristic for judging frequency and probability«, Cognitive Psychology 5, 1973, pp. 207–232.

The It'll-Get-Worse-Before-It-Gets-Better Fallacy

No reference literature. This thinking error is self-explanatory.

Story Bias

Dawes, Robyn M.: Everyday Irrationality: How Pseudo-Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally, Westview Press, 2001, p. 111 ff.

Turner, Mark: The Literary Mind: The Origins of Thought and Language, Oxford University Press, 1998.

The Hindsight Bias

On Reagan's election victory: Stacks, John F.: »Where the Polls Went Wrong«, Time Magazine, 1/12/1980.

Fischhoff, B.: »An early history of hindsight research«, Social Cognition 25, 2007, pp. 10–13.

Blank, H.; Musch, J.; Pohl, R. F.: »Hindsight Bias: On Being Wise After the Event«, Social Cognition 25 (1), 2007, pp. 1–9.

Chauffeur Knowledge

The Max Planck story can be found in: »Charlie Munger – USC School of Law Commencement – May 13, 2007«. Reprinted in: Munger, Charlie: Poor Charlie’s Almanack, Donning, 2008, p. 436.

»Again, that is a very, very powerful idea. Every person is going to have a circle of competence. And it’s going to be very hard to enlarge that circle. If I had to make my living as a musician … I can’t even think of a level low enough to describe where I would be sorted out to if music were the measuring standard of the civilization. So you have to figure out what your own aptitudes are. If you play games where other people have their aptitudes and you don’t, you’re going to lose. And that’s as close to certain as any prediction that you can make. You have to figure out where you’ve got an edge. And you’ve got to play within your own circle of competence.« (Munger, Charlie: »A Lesson on Elementary Worldly Wisdom as It Relates to Investment Management and Business«, University of Southern California, 1994, in: Poor Charlie’s Almanack, Donning, 2008, p. 192)

The Illusion of Control

The giraffe example is from: Mayer, Christopher: »Illusion of Control – No One Can Control the Complexity and Mass of the U.S. Economy«, Freeman – Ideas on Liberty 51 (9), 2001.

On throwing dice in the casino: Henslin, J. M.: »Craps and magic«, American Journal of Sociology 73, 1967, pp. 316–330.

Plous, Scott: The Psychology of Judgment and Decision Making, McGraw-Hill, 1993, p. 171.

The psychologist Roy Baumeister has shown that people tolerate more pain when they feel they understand an illness. The chronically ill cope far better with their illness when the doctor gives it a name and explains what it is about. This does not even have to be true. The effect works even where there is demonstrably no cure for the illness. See: Baumeister, Roy F.: The Cultural Animal: Human Nature, Meaning, and Social Life, Oxford University Press, 2005, p. 97 ff.

The classic paper on this: Rothbaum, Fred; Weisz, John R.; Snyder, Samuel S.: »Changing the world and changing the self: A two-process model of perceived control«, Journal of Personality and Social Psychology 42 (1), 1982, pp. 5–37.

Jenkins, H. M.; Ward, W. C.: »Judgment of contingency between responses and outcomes«, Psychological Monographs 79 (1), 1965.

There are four references on placebo buttons:

Lockton, Dan: »Placebo buttons, false affordances and habit-forming«, Design with Intent, 2008: http://architectures.danlockton.co.uk/2008/10/01/placebo-buttons-false-affordances-and-habit-forming/

Luo, Michael: »For Exercise in New York Futility, Push Button«, New York Times, 27.02.2004.

Paumgarten, Nick: »Up and Then Down — The lives of elevators«, The New Yorker, 21.04.2008.

Sandberg, Jared: »Employees Only Think They Control Thermostat«, The Wall Street Journal, 15.01.2003.

The Incentive Super-Response Tendency

Munger, Charles T.: Poor Charlie’s Almanack, Third Edition, Donning, 2008, p. 450 ff.

The story about the fish: ibid., p. 199.

»Perhaps the most important rule in management is: ›Get the incentives right.‹« (ibid., p. 451)

»Fear professional advice when it is especially good for the advisor.« (»The Psychology of Human Misjudgment«, in: ibid., p. 452)

Regression to the Mean

Caution: regression to the mean is not a causal relationship, but a purely statistical one.

Kahneman: »I had the most satisfying Eureka experience of my career while attempting to teach flight instructors that praise is more effective than punishment for promoting skill-learning. When I had finished my enthusiastic speech, one of the most seasoned instructors in the audience raised his hand and made his own short speech, which began by conceding that positive reinforcement might be good for the birds, but went on to deny that it was optimal for flight cadets. He said, ›On many occasions I have praised flight cadets for clean execution of some aerobatic maneuver, and in general when they try it again, they do worse. On the other hand, I have often screamed at cadets for bad execution, and in general they do better the next time. So please don’t tell us that reinforcement works and punishment does not, because the opposite is the case.‹ This was a joyous moment, in which I understood an important truth about the world.« (Quote: see the Wikipedia entry »Regression toward the mean«)