Pocketblog has gone back to basics. This is part of an extended course in management.
In understanding decision-making, there are three key things to focus on:
Using a structured process
The role of intuition, gut instinct and hunches
The effects of bias and automatic thinking
Let’s look at each of these in turn.
Structured Decision Making Process
… like the example below.
One of the most important choices in your decision process will be whether to go for an adversarial process of setting the options against one another – perhaps even having advocates for each, competing with one another to win the decision – or to go for a process of inquiry, learning as much as you can before assessing the options.
Although Malcolm Gladwell received a lot of attention for his book Blink, his work leans heavily on the research of Gary Klein, whose books, The Power of Intuition and the more technical Sources of Power, are first rate. Klein shows how, in domains that are very complex and in which you have extensive experience, your intuition can quickly get you to the right understanding, well ahead of your ability to explain why or how you reached the conclusion you did. But if you don’t have sufficient experience, then your hunches are likely to be wrong, due to the existence of…
Bias and Automatic Thinking
Two psychologists, Daniel Kahneman and Amos Tversky, were responsible for overthrowing the crude assumption that economics is based on rational decisions. In fact, they showed that many decisions are the result of automatic thinking and biases. Automatic thinking is a shortcut that works well in the domains in which humans evolved, but frequently leads to wrong answers in a modern-world context. One example is the ‘horns and halo effect’; another is our bias towards noticing examples that confirm what we believe to be true, whilst being blind to counter-examples. Daniel Kahneman wrote the wonderful ‘Thinking, Fast and Slow’ to summarise a life’s research and it is, without a doubt, one of the most important and stimulating reads of the last few years.
Your brain is wired to think fast. To do this, it needs to take shortcuts, which psychologists call heuristics. But these shortcuts don’t always give the right answer. They give rise to cognitive bias.
Cognitive bias is the result of the shortcuts. If every car door you’ve ever encountered opens outwards, it’s a good bet that the next one you encounter will too. That’s a bias in your assumptions. Usually, it serves you well. One day, it may let you down.
But the cognitive biases that we need to worry about are those that are baked into our mental operating system. We make the mistakes without realising it. They lead to bad decisions – sometimes to catastrophe.
In understanding how we think, one big idea has dominated in recent years. It became widely known through Daniel Kahneman’s phenomenal best-seller, ‘Thinking, Fast and Slow’. It’s the idea that we process information in two ways. There are two parallel thinking systems in our minds: System 1 and System 2.
There are many terms for these two systems. They have been called:
associative and rule-based
implicit and explicit
intuitive and analytical
experiential and rational
and many more
The terms System 1 and System 2 are marvellously neutral. They first emerged in a paper by Keith Stanovich and Richard West. But it’s Kahneman’s adoption of this language and the popularity of his book that gave them fame.
Chip and Dan Heath have a writing style that turns important ideas into simple formulations, and illustrates them with compelling case studies. Their three books (to date) are all best-sellers and each is well-worth reading for any manager, professional, or entrepreneur.
Of the three, the first is not only the one that made their name, but the one that, for me, has the stickiest ideas: Made to Stick.
Chip Heath is a graduate of Texas A&M University where he studied Industrial Engineering. He went on to do a PhD in psychology at Stanford University. He is there today, as Professor of Organisational Behaviour at the Graduate School of Business, having also held academic posts at The University of Chicago Graduate School of Business (1991 to 1997) and the Fuqua School of Business at Duke University (1997 to 2000).
Dan Heath has a BA from the University of Texas at Austin and an MBA from Harvard Business School. He has been a researcher for the Harvard Business School and also co-founded an innovative academic publisher, Thinkwell, which provides school-level textbooks. He now works at Duke University, as a Senior Fellow at The Center for the Advancement of Social Entrepreneurship (CASE), where he also founded the Change Academy.
The Heath Brothers’ Books
Chip and Dan Heath have written three books together:
Made to Stick: Why Some Ideas Survive and Others Die (2007)
Switch: How to Change Things When Change Is Hard (2010)
Decisive: How to Make Better Choices in Life and Work (2013)
Each of them describes a series of steps for being effective in doing something – communicating ideas, making change, and taking decisions. I strongly recommend you read these books – I have gained a lot from each of them. Here, all I’ll do is summarise the main content.
Made to Stick
Why is it that some ideas circulate easily? People like to share them and, when they do, the ideas are memorable, compelling and soon become pervasive. They seem to be almost made to stick.
If we can understand the answer, perhaps we can also make our own ideas sticky. This is the substance of the Heaths’ ideas, which they present in a handy acronym: SUCCESs.
Simple: We need to simplify our ideas by whittling away every superfluous detail to find their core, which we can then communicate to others.
Unexpected: One way to get attention is with surprise, and then we can hold that attention by stimulating curiosity.
Concrete: Real stories and examples make our ideas solid. Abstract theory is the enemy of engagement with your ideas.
Credible: People need to believe your idea for it to stick, which means giving them examples they can relate to, demonstrating your authority, and providing ways they can access proof for themselves.
Emotional: We make choices and remember ideas, when they trigger powerful emotions, so you need to demonstrate what’s in it for your audience, in terms of self-interest and emotional payback.
Stories: We are story-telling creatures, and we use stories to guide us in how to respond to situations. They make things real and inspire us.
One of the key roles for managers is to make changes in our organisations. But it is fiendishly difficult. The Heaths argue that the reason is a conflict that’s built into our brains, between our rational mind and our emotional mind. This idea will be familiar to readers of Daniel Kahneman’s Thinking, Fast and Slow.
The Heaths use the metaphor of an elephant and its rider. The elephant is the powerful emotional aspect of our brain, which can easily take us where it’s going anyway, while the rider is our rational side, which needs to motivate the elephant to go in the right direction. They offer a three-part prescription to:
Direct the rider
Motivate the elephant
Shape the path
Direct the Rider
Here, we have to find out what works and repeat it, discover the specific steps that will get people where we need them to go, and create a direction to travel and a reason to go there.
Motivate the Elephant
We don’t do things because we know they are right, we do them because they feel right. So we need to appeal to people’s emotions as well as their reason. We also need to make change easy, by presenting small, simple steps. Finally, they advocate instilling a growth mindset.
Shape the Path
Change people’s environment to shift behaviours and make the changes feel easier. Then turn the new behaviours into habits, by making repetition easy. Finally, use successes to spread the ideas and engage others.
Back to Kahneman! Our decisions are disrupted by an array of biases and irrationalities. We jump to conclusions and then become overconfident that we’re right. We look for confirming evidence and disregard other information that conflicts with our prejudices. We’re distracted by emotions – which make emotionally resonant ideas sticky.
In short, we’re rubbish at making good decisions!
And knowing it doesn’t help, ‘any more than knowing that we are nearsighted helps us to see’, say the Heaths. But luckily they also give us a four-step framework to help us make better decisions: WRAP.
Widen Your Options
Yes or no, this or that, big or small. Narrow choices make bad decisions, so the first step is to explore a wider space of options. And the book shows you how.
Reality-test Your Assumptions
Stop trying to show you’re right and start trying to prove you’re wrong. Only if you fail can you start to be confident in your assumptions.
Attain Distance Before Deciding
Shift your perspective in time, place or emotion. How will this decision look in five years, what do people do somewhere different, what would you tell your friend to do?
Prepare to be Wrong
Overconfidence hides the flaws in your thinking, so look for the things that can go wrong and find ways to alert yourself when events mean you need to shift your decision.
What? You want more of a summary than three chunky books condensed into a thousand words? Just go out and read them!
By the way, there are lots of great resources linked to their books, on the Heath Brothers website.
What the world needs now, more than anything else, is a greater degree of rationality. And Julia Galef is on a mission to help us get there.
Julia Galef was born in 1983, in Maryland. She studied statistics at Columbia University, graduating in 2005. Initially, Galef continued an academic career, starting an economics PhD course. However, it was not for her, and she moved to New York and began working as a freelance journalist.
There, she joined the New York Skeptics and, with philosopher Massimo Pigliucci started the podcast, Rationally Speaking, in 2010. In 2015, Pigliucci dropped out and Galef continues as the sole host.
In 2011, Galef moved to California to join a group of friends who had secured funding to start the Center for Applied Rationality. It began its work in 2012 and predominantly provides training in how to think more rationally. She is currently its president.
Hang on, Galef is a Public Intellectual…
What has that to do with Management?
Management needs to be more rational. It isn’t that there is no place for intuition. But intuition only serves us well in situations where we have deep experience.
And in a rapidly changing world where technology, commercial opportunities, and social policy are evolving at a phenomenal rate, none of your really crucial decisions can possibly be based on deep experience. Nobody has that.
Galef has a great metaphor for understanding two mindsets, or ways of approaching reality. These mindsets manifest most clearly when we get into discussions or arguments in which we disagree with the other person’s analysis.
A soldier needs to fight to survive. They are therefore trained to be defensive and combative. And by the nature of fighting forces, they are tribal too. The Soldier Mindset is therefore one of feeling safest when we are certain, and fighting against an opponent to protect ourselves. This may be defensive or offensive in nature, but there is value in being right and defending our position – even if it means attacking the other person.
Galef doesn’t say it, but I will. How familiar is this in modern western political discourse?
Scouts, on the other hand, are not tasked to fight, but to gather information. Facts, data and evidence are valuable to a scout, as is objective assessment of what they learn. Consequently, scouts are open to re-evaluating their conclusions, based on new information. The Scout Mindset is one of curiosity and a desire to cut through bias and prejudice to get at the truth. There is value for a scout in testing long-held assumptions and beliefs, so for them, there is no sense of losing face if they need to change their opinion.
If what you value is the certainty of a simple analysis, and don’t want to let a few rogue facts spoil a good story, then you have a Soldier Mindset. And those facts will, eventually, spoil your story.
If, on the other hand, you recognise that the world is complex and the decisions you make are neither straightforward nor familiar, then you may feel you need to interrogate the data fully, listen to different perspectives, and draw careful but provisional conclusions. These will stand until conflicting evidence forces you to re-evaluate.
That is the Scout Mindset, and it sounds like the basis of grown up management to me.
Julia Galef at TED
Here is Galef speaking about the Soldier and Scout Mindsets at TED, in 2016.
Jennifer Aaker wants you to get your message across. And her conclusion is that the best way you can do it is by telling a story. Stories are powerful, memorable, and impactful.
Jennifer Aaker was born in 1967 and grew up in California. She studied psychology at the University of California, Berkeley, under Daniel Kahneman and Philip Tetlock, graduating in 1989. She went on to earn a PhD at Stanford University’s Graduate School of Business in 1995.
She went straight into an academic role as Assistant Professor in the School of Management at UCLA Anderson. She then returned to The Stanford Graduate School of Business in 1999, becoming a full professor in 2004, and General Atlantic Professor of Marketing in 2005.
We try to avoid framing our management thinkers in terms of their family members, but it is relevant to note in passing that Jennifer Aaker’s father is David Aaker – now an emeritus professor of advertising. Clearly he was influential in Aaker’s interest in branding and you can watch the two Aakers in conversation about brand and marketing.
However, she has moved away from that as her primary interest, focusing on two areas:
the psychology of happiness, and how it relates to our perceptions of time and money
how we can communicate via social media, using the power of storytelling
The two link together, because small acts, often mediated by social media messaging, can have an effect on our happiness.
Jennifer Aaker came to prominence researching the personalities we associate with brands. Her idea was to see if there is a small set of ‘personality types’ that consumers associate with brands. These would be like the ‘Big Five’ personality factors* in people: each one clearly distinct from the others and, together, accounting for a large proportion of personality traits.
Her assessment was that brands do have ‘personalities’ and that consumers make consistent interpretations. So her research set out to narrow the number of different personality types down to five. In her paper**, she shows how she reduced brand personality labels down to:
Sincerity
Excitement
Competence
Sophistication
Ruggedness
The personality dimension that a brand chooses to emphasise will influence consumer buying and loyalty choices. She advocated that brands can select a dominant personality type to emphasise, and present related characteristics to its audience. This creates a way to communicate brand identity and values.
Interestingly, subsequent work shows that her five dimensions are far more parochial than the true Big Five personality factors. Outside the US, where she conducted her work, other brand personality dimensions dominate, including Peacefulness in Japan and Passion in Spain.
The Dragonfly Effect
The metaphor Aaker and Smith chose is that of a dragonfly’s agility depending upon its co-ordination of four wings. In communicating effectively using digital media, Aaker and Smith’s four components are:
What one goal will you pursue?
How will you seize your audience’s attention in a noisy environment?
What story will engage your audience and appeal to their emotions?
What will you ask of your audience, and what difference will they make?
Before you communicate, you need to decide on a goal. It will need to meet five design criteria:
Humanistic – affecting people
Actionable – inspire action
Measurable – clear success criteria
Clarity – cannot be further simplified
Happiness – achieving the goal will make people happier
To grab attention, your message must be at least one of: personal, unexpected, visual, or visceral.
To engage your audience, you need to tell a story. Stories connect the audience to the story-teller and create an emotional response. This is important because we primarily make our decisions emotionally, and use reason to justify them afterwards.
People should feel ready and able to take action. As much as possible, make it easy for them, and fun. And the more they feel you are offering them something that is uniquely tailored to them and their circumstances, the more readily they will act.
Jennifer Aaker talking about her Research on Happiness
… and how it relates to social media.
* The Big Five Personality Factors are: Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism
Jim Collins has built an astonishingly successful career as an author, speaker and corporate commentator, on a simple methodology. Pick the best, compare them with the rest, and find differences in behaviours that appear to explain the causes. It is a methodology that has created simple, coherent lessons and led to vast book sales, in the millions.
Jim Collins was born in Colorado, in the USA, in 1958, and lives there today. He studied at Stanford University, where he earned a BSc in Mathematics and an MBA. He then went to work at the management consulting firm McKinsey, before moving into industry as a product manager for Hewlett Packard. He returned to Stanford as a lecturer in the Graduate School of Business and then, in 1995, returned to Boulder, Colorado, to found his own ‘Management Laboratory’.
This move followed the success of his second* book – and the one that made his name – ‘Built To Last: Successful Habits of Visionary Companies‘, which he co-authored with Jerry Porras. It has sold over 4 million copies. In his management laboratory, he works with a research team, gathering and analysing the data that form the basis of his books:
In Built to Last, Collins and Porras established a simple methodology: hypothesise a reason why some companies endure and thrive, identify a set of thriving, enduring companies, compare them with others that have not endured so well, and look for evidence to confirm your hypothesis. Their hypothesis, for which they found ample evidence, was that corporations with a strong vision, purpose and value set to guide all of their choices would endure and thrive over a long period. Each of their prime examples was an industry leader and had been since the 1950s. Each was compared to a similar competitor that had not fared so well and was far less admired. And each was shown to have stronger, more immutable core values, a central raison d’être, and a clear sight of its future, guided by them.
In Good to Great, Collins went on to look at why some companies perform and achieve results that are markedly superior to direct competitors and others in their sector. Again, Collins’ research, with a team of 20 researchers, identified their best performers, paired each one to a similar but lesser performing comparator, and also compared them to other players in their sector. His research data found seven characteristics that distinguished the great from the good:
A distinctive style of leadership, which they described as Level 5 Leadership: with a drive for the company to succeed paired to a personal humility
Starting with selecting the right people and then deciding on their roles; ‘First Who, Then What’.
A climate of tough conversations that face realities and move forward with deliberate optimism: ‘Confront the Brutal Facts’.
A real sense of focus on one thing, rather than dissipating efforts across many: ‘The Hedgehog Concept’. Collins illustrated this with three overlapping circles of passion (‘What creates real passion?’), excellence (What can you be best in the world at?), and infrastructure (What drives your economic engine?)
A ‘Culture of Discipline’ that means people follow the rules yet paradoxically are freed up to innovate.
Careful adoption of technology that deepens their success in the three circles: ‘Technology Accelerators’.
Continual innovation and change that constantly improves the business (within the constraints of the Hedgehog concept): ‘The Flywheel’.
A Critique of Collins’ Methodology
Collins’ books are readable and their logic is compelling. However, he misses one key point: correlation does not imply causation. Because there is a pattern, we cannot be confident that any one element of that pattern has caused the differences. Both Built to Last and Good to Great contain exemplars of excellence that have since declined and even failed. Collins and his supporters assert that this is because they deviated from what made them great. Critics would say that the examples were little more than the tail of a distribution of poor, good and great data points.
Indeed, as a scientist, it seems clear to me that, no matter how compelling Collins’ evidence is, it conflicts with the scientific method. Data analysis alone does not make good science. In science, we form a hypothesis and then try to disprove it. We shake it, test it, challenge it and try to break it. The more resilient our hypothesis is to these insults, the greater confidence we can have in its predictive capacity. Collins, on the other hand, forms his hypotheses and then builds ever more evidence to support them. He does not try to break them, and so falls into the trap that cognitive psychologists refer to as ‘confirmation bias’. Indeed, one of his strongest critics is Nobel Prize-winning psychologist, Daniel Kahneman.
It seems to me that it is events and time that will shake Collins’ hypotheses and test what stands or falls. I don’t feel confident enough to critique his findings, but I do worry that I find his answers compellingly plausible. This is far from proof and we must accept that there may be an equally large role for contingency in the levels of success of some businesses. However, I will make my last words in support of Collins’ analysis. Luck or design: it is hard to see how adopting his ideas can possibly harm a business; it seems to me that all of Collins’ seven Good to Great characteristics are self evidently ‘good things’. You makes your choices and you takes your chances!
Daniel Kahneman has won many awards and honours, but none more surprising, perhaps, than a Nobel Prize. Why is this surprising? Kahneman is, after all, one of the most eminent and influential psychologists of his time. It is surprising because there is no Nobel Prize for psychology: Kahneman was co-recipient of the 2002 Nobel Prize in Economics ‘for having integrated insights from psychological research into economic science, especially concerning human judgement and decision making under uncertainty’.
In short, what Kahneman taught us was that, before he and his co-worker, Amos Tversky (who sadly died six years before the Nobel Committee considered this prize and so was not eligible), had started to study human decision making, all economic theories were based on the same, false assumption. Kahneman and Tversky taught us that human beings are not rational agents when we make economic decisions: we are instinctive, intuitive, biased decision makers.
And, if that sounds pretty obvious to us now, then we have Kahneman and Tversky, and their long walks together, to thank.
Daniel Kahneman was born in 1934 to Lithuanian émigré parents living in Paris (although he was born when they were visiting family members in Tel Aviv). When Nazi Germany occupied France, the family went on the run, ending up after the war in what was then (1948) Palestine under the British Mandate, shortly before the formation of the State of Israel.
In 1954 he gained his BSc from the Hebrew University, in Psychology and Maths, and joined the psychology department of the Israeli Defence Forces, helping with officer selection. Four years later, he went to the University of California, Berkeley, where he was awarded a PhD in 1961. He returned to the Hebrew University in 1961.
It was in 1968, while hosting a seminar, that he met Amos Tversky. They started collaborating shortly afterwards. Their fertile discussions often involved thought experiments about how we make decisions and judgements, uncovering in themselves a series of heuristics – or thinking shortcuts – which they went on to observe in controlled laboratory experiments. Their collaboration continued until Tversky’s death in 1996.
In that time, they collaborated with other researchers, most notably, Paul Slovic and economists Richard Thaler and Jack Knetsch. Their many insights into how we make judgements and the application to economic decision-making eventually led to the Nobel Committee recognising Kahneman with the 2002 Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel.
Kahneman’s 2011 book, Thinking, Fast and Slow is a summary of a remarkable life’s work. If the ideas are new to you, they may well rock your world. It is not an easy read, but it is remarkably well-written for an intelligent lay audience. Even if Kahneman’s work is familiar to you, this book will repay close reading.
There is far too much in Kahneman’s work to even begin to summarise it, so I want to focus on three biases that he discovered, which have a profound impact on the choices we make, often leading us far astray.
The Anchoring Bias
The first information we get biases any subsequent choices we make. Your father was right, or your mother or anyone else who told you at a young age that first impressions count. Systematically, the human brain takes the first information it receives, and creates an interpretation of everything else that is anchored in the inferences it draws from that first impression. In management terms, this accounts for the horns and halo effect, that biases us to seek and spot confirming evidence for our pre-existing assessment.
The Representativeness Bias
Who is a more likely person to find working in a car repair shop, changing your brakes? Is it A: a young woman with blond hair and pink eyeliner, or B: a young woman with blond hair and pink eyeliner, whose father owns the car repair shop?
If you think B, you have fallen for representativeness bias. The story makes more sense in our experience, doesn’t it? A young woman with blond hair and pink eyeliner is not a person you’d expect to see in that environment. But a young woman with blond hair and pink eyeliner, whose father owns the car repair shop, may feel right at home. But statistically, this is rubbish. For every young woman with blond hair and pink eyeliner, only a small proportion will also have fathers who own a car repair shop.
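The arithmetic behind this is the conjunction rule of probability: two things being true together can never be more likely than one of them alone. Here is a minimal sketch in Python, with invented probabilities purely for illustration (the specific numbers are assumptions, not data):

```python
# Conjunction rule: P(A and B) can never exceed P(A).
# A = the mechanic is a young woman with blond hair and pink eyeliner
# B = her father owns the car repair shop
p_a = 0.02          # assumed probability of A (invented for illustration)
p_b_given_a = 0.05  # assumed probability of B, given A (also invented)

p_a_and_b = p_a * p_b_given_a  # probability that both hold together

# Whatever numbers you pick, the conjunction is the smaller probability,
# so option B cannot be the more likely answer.
assert p_a_and_b <= p_a
```

Whatever values you substitute, the product of two probabilities between 0 and 1 is never larger than either factor on its own, which is why option B is always the statistically weaker bet.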
The Availability Bias
Recent events bias our perception of risk. They are more available to recall and hence have a stronger impact on our intuition than do counter examples. The classic example is perceptions of risk of train travel, after a train crash. Trains are safe: they rarely crash. Cars crash a lot: there are many accidents every day. But they are rarely reported, so we have no immediate intuitive sense of the statistics.
The Impact of Kahneman’s Work
Kahneman’s work has had a huge impact. Decision theory existed before he came along, but he and Tversky revolutionised it. But it was Kahneman, along with Tversky, Knetsch and Thaler who pretty much invented the discipline of behavioural economics – and perhaps the relationship that drove that development was the friendship between Thaler and Kahneman.
Now Behavioural Economics infuses much of public policy, and much of the social influence that corporations try to exert over us. Thaler’s book Nudge (written with Cass Sunstein) is a best-seller, and Thaler and Sunstein both advise Prime Ministers and Presidents. Next time you get a document from Government, or go into a store, and find yourself complying with their wishes without thinking, there is a chance that you have been ‘nudged’. And the roots of these ‘choice architectures’? They lie in understanding our heuristics and biases. And that was Kahneman’s contribution.
Kahneman at TED
Here is Daniel Kahneman, talking about how we perceive happiness and discomfort.
We then moved into a subject that was much in the news in February, and still is. With ‘Bankers’ Bonuses and Brain Biology’, we looked at recent neuroscience and how that relates to Adams’ Equity Theory.
In May, inspiration waned for a week, so where did I go to find it? ‘The Gemba’. I got it back, and later that month, got idealistic in ‘Reciprocity and Expectation’ looking at the Pay it Forward ideal and the realities of Game Theory.
5. Why do we do what we do?
In the first of two blogs on how to predict human behaviour, I looked at ‘How to Understand your Toddler’ (mine actually) and Icek Ajzen’s Theory of Planned Behaviour. Later in the year, in ‘Predicting Behaviour’, I looked at whether a simple equation hypothesised by Kurt Lewin (B = f(P, E): behaviour is a function of the person and their environment) could predict all behaviour.
Has any one individual been as influential in establishing management as a pragmatic academic discipline as Peter Drucker? To recognise his various achievements, I wrote a triptych of blogs over the summer:
Will history look on Tom Peters with the respect that it holds for Drucker and Deming? Who knows? But without a doubt, Peters has been influential, insightful and provocative for thirty years or more, and I am sure many of his ideas will survive. In ‘Crazy Times Again’, I drew a line from FW Taylor (father of ‘Scientific Management’) to Peters.
10. The Circle Chart
In ‘Going Round in Circles’ I returned to management models and one of my all time favourites: Fisher and Ury’s Circle Chart. I applied it to problem solving rather than, as they did, to negotiation.
Fisher and Ury are experts on conflict resolution, as is Morton Deutsch. In ‘Conflict: As simple as AEIOU’, I looked at a fabulously simple conflict resolution model that originated in Deutsch’s International Centre for Cooperation and Conflict Resolution.
11. Two Notable Events
Two notable events made the autumn memorable for Pocketblog: one sad and one happy.
In last week’s post we discussed some of the decision-making traps that board members (or, indeed, any decision-making group) can fall into.
At the heart of our understanding of these biases is the work of Daniel Kahneman. He was awarded the 2002 Nobel Prize for Economics for his work in this area with his co-worker, Amos Tversky. Sadly, Tversky died in 1996 and was ineligible for the prize under Nobel rules.
Kahneman is perhaps the leading psychologist in the field of behavioural economics – very much a field du jour. His research was carried out with many collaborators including Paul Slovic, an expert in the field of perception of risk, and Richard Thaler, most notable for his use of the term “nudge” to describe how we can use perceptions to shift behaviour.
The classic paper that Kahneman and Tversky wrote was ‘Judgment under Uncertainty: Heuristics and Biases’, published in the journal Science in September 1974.
Heuristic: A rule of thumb or simple procedure for reaching a decision or solving a problem.
In this article, they introduced three important heuristics, which guide many of our decisions – and frequently let us down.
Representativeness
We tend to believe a possible event is more likely when we can recognise it as a part of a familiar pattern. It is as if we like to create stories about our world that follow standard arcs or plots (see, for example, Christopher Booker’s wonderfully argued The Seven Basic Plots: Why We Tell Stories). If a potential event slips easily into one of these plots, we rate it as more likely than if the plot seems to need adjusting.
Availability
Recent salient examples render a possible event more likely in our minds than other events that do not trigger such easy recall. Immediately after a rail accident, people fear rail travel more than normal and take to their cars. The resulting spike in road deaths usually exceeds the death toll of the rail accident itself.
Anchoring
We make estimates and decisions from a starting point, and the point we choose can bias our estimate or decision. Surprisingly, even an unrelated figure presented at random can skew a later numerical estimate.
Kahneman won’t stand still
In 2007, Kahneman received the American Psychological Association’s Award for Outstanding Lifetime Contributions to Psychology and, this year, he was included in the Bloomberg 50 list of the most influential people in global finance. He says that all he knows about economics, he learned from co-workers like Richard Thaler. He is a much sought-after speaker and commentator in the business arena, and you can find recent work documented on TED (see below), in an extended interview, and in two excellent articles on the websites of prominent global strategy consulting firms, McKinsey and Booz & Company:
Daniel Kahneman: The riddle of experience vs. memory
Using examples from vacations to colonoscopies, Kahneman reveals how our “experiencing selves” and our “remembering selves” perceive happiness differently, and why it is our memories, rather than our experiences, that drive our decisions, in this 2010 TED talk.