Why we are usually all wrong
I was awful at productivity until I was about 28 years old. Then I went freelance and realised I was no longer surrounded by a great team to whom I could delegate all the crazy things I’d started, and that I’d have to develop my own skills for follow-through so that things got done.
So I read all the blogs and books, transformed my productivity, and even wrote a couple of books of my own (How to be a Productivity Ninja & Introducing Productivity). My business works around the world helping some pretty intelligent people who, despite their considerable abilities, have never had anyone actually teach them good habits, or help them reflect on or perfect their productivity.
It got me thinking. Why don’t they teach all this stuff in schools, or at university? So I’m now writing a new book, Knowledge Ninja, centred on the productivity and habits needed for great learning. It turns out great productivity and great learning are pretty similar.
Perhaps it shouldn’t be that surprising. After all, there’s crossover between productivity and a whole range of other topics: efficiency, processes, happiness, getting rich, leading a balanced life, etc. At the heart of all of these is thinking about how we think, and thinking about how our thinking affects our habits. The science of how we think and of our decision-making processes and abilities is actually pretty interesting – and one of the conclusions I’m drawing? Most of us are wrong, most of the time.
There are a couple of classic studies that are widely referenced, so let’s start there.
The planning fallacy
The most commonly cited example of the planning fallacy comes from a 1994 study conducted by academics from Simon Fraser University in British Columbia, Canada, in which final-year university students were asked for a realistic estimate of when they would have their theses finished, along with optimistic and pessimistic estimates. The average actual time taken was 7.4 days longer than the pessimistic estimate, 21.6 days longer than the realistic one, and 28.1 days longer than the optimistic one. This experiment is a prime example of how we overestimate our powers of planning and doing, and it’s by no means an isolated case: tax form completion, computer programming, origami and furniture assembly have all been used in studies to demonstrate the planning fallacy.
This research supports productivity laws such as Parkinson’s Law (“work expands to fill the time available”) and Hofstadter’s Law (“work takes longer than you expect, even when you take into account Hofstadter’s Law”). And I’m happy to admit that this is one of my own pet productivity weaknesses. Yet, how we think in lots of other situations is equally unsophisticated.
Self control & salivating for marshmallows
Walter Mischel’s famous study, “The Stanford Marshmallow Experiment,” has been repeated and reconfirmed many times since. It involved giving children a marshmallow. The person conducting the experiment would leave the room for a short time, and the child could, if they wished, eat the marshmallow while the adult was away. However, if the marshmallow was still there when the adult returned, the child would be given another marshmallow. But if they’d already eaten the marshmallow, that was the only one they’d get. The idea was that the child could have instant gratification now, or delayed gratification, but doubled.
What this study found, as it tracked participants over time, is that your ability to delay gratification has huge ramifications, not just for your ability to consume marshmallows, but also for your health, wealth and education.
Cognitive biases
Cognitive biases describe the ways in which our brains choose to ignore certain things or place too much emphasis on others, skewing a thought process or decision. There are lots of examples:
The “endowment effect” & the curse of flatpack furniture
The “endowment effect” suggests that we place more value on things that we own than on things that we don’t. One study found that owners of tickets for a high-profile basketball match overvalued them by a factor of 14 (Carmon & Ariely, 2000). In other words, owners wanted 14 times more for the tickets than others were prepared to pay. Most similar studies don’t show a ratio as high as 14:1, but they do point to a common problem: we overvalue our own things. This has repercussions for our ability to declutter, for example.
Linked to this is what’s known as “the Ikea effect.” You’ll perhaps not be surprised to learn that this is the tendency for people to place a disproportionately high value on objects that they have partially assembled themselves, such as furniture from Ikea, regardless of the quality of the end result.
Anchoring
Anchoring, or “focalism” as it’s otherwise known in the cognitive psychology world, is another cognitive bias. It describes the common human tendency to rely too heavily on the first piece of information offered (the “anchor”) when making decisions. Once an anchor is set, other judgments are made by adjusting away from it rather than by evaluating each piece of new evidence on its own merits, so there is a bias toward interpreting other information around the anchor. This is why, in so many scenarios, the first impression is so crucial.
Confirmation bias
Confirmation bias describes the tendency for people to (consciously or unconsciously) seek out information that conforms to their pre-existing viewpoints, and to ignore information, whether positive or negative, that goes against them. If you think about how your political views inform your reading or consumption of the news, or how your impression of a particular company affects your next purchase from (or boycott of!) them, you get the idea. Avoiding confirmation bias is an important part of rationalism and of science in general.
Whether they think you can or think you can’t, they’re right…
An experiment by Rosenthal and Jacobson, known as “Pygmalion in the classroom” (and subsequently written up in a 1968 book of the same name) demonstrates that, “when teachers expect students to do well and show intellectual growth, they do; when teachers do not have such expectations, performance and growth… are discouraged.” This is the ultimate example of a self-fulfilling prophecy, and one that we’d do well to take heed of. It’s perhaps also a contributing factor in why it’s often said that you are as successful as the average of your five closest friends: so if you want to be a millionaire, hang out with millionaires.
And if you’ve read any of the above so far thinking, “Sure, but I know those things don’t apply to me,” then think again.
The bias blind spot
Social psychologist Emily Pronin (who coined the term “bias blind spot”) and her co-authors ran an experiment in which they first explained to subjects a whole range of cognitive biases, including some of those described above as well as others such as the “better-than-average effect”: the tendency for people to inaccurately rate themselves as “better than average” on positive traits and “less than average” on negative ones (everyone thinks they’re a better-than-average driver, and no one describes themselves as a worse-than-average listener).
Having just heard all the scientific research about biases, the subjects were then asked how biased they themselves were. They still rated themselves as much less vulnerable to those biases than the average person. So we’re even biased about how biased we are!