Sam McNerney brilliantly distills how philosophy, psychology, and neuroscience helped him "examine" his thinking in this three-part series:
Part 1 - Being Irrational About Reason:
Only later did I realize that learning about decision-making gives rise to what I term the confirmation bias bias, or the tendency for people to generate a narrow (and pessimistic) conception of the mind after reading literature on how the mind thinks narrowly. Biases restrict our thinking but learning about them should do the opposite. Yet you’d be surprised how many decision-making graduate students and overzealous Kahneman, Ariely and Gilbert enthusiasts do not understand this. Knowledge of cognitive biases, perhaps the most important thing to learn for good thinking, actually increases ignorance in some cases. This is the Sartre Fallacy – we think worse after learning how to think better.
Part 2 - Is It Inevitable?:
Learning about change blindness caused participants in one study to overestimate their vulnerability to the visual mistake. They suffered from change blindness blindness. The planning fallacy provides another example. When planners notice poor projections made in similar projects they become more confident instead of making appropriate adjustments (“We’ll never be that over budget and that late”). This was my problem. When I imagined the worst-case scenario my confidence in the best-case scenario increased. The idea, simply, is that we tend to read about biases and conclude that we are immune from them because we know they exist. This is of course itself a bias and as we’ve seen it quickly leads to an ad infinitum problem.
Part 3 - The (Real) Socratic Problem:
The Socratic problem usually refers to the fact that our accounts of Socrates are secondhand, a problem exacerbated by his stubborn refusal to document, with pen and papyrus, his philosophy. I want to rebrand the Socratic problem to describe his false belief – a belief that pervades the Western world today – that knowledge is necessarily a panacea.
Why is the Sartre Fallacy difficult to avoid?
We’ve seen that humans did not evolve to be philosophers, which is fine except when we act like it. There are no guarantees that acquiring more knowledge – reading more self-help, diet, business, or decision-making books – will improve life. We’re cognitive misers; we only use System 2 if we have to, and even then we’re reluctant. The intellect is, evolutionarily speaking, the youngest addition to the human condition. We don’t know how to use it very well.
This is why the Sartre Fallacy is difficult to avoid: we don’t “do” knowledge well. Natural selection focused on the body, not the mind.