
Philip Tetlock: Expert Judgment

Philip Tetlock has done more than any other academic to help us understand the process of forecasting and making predictions. He has shown us why experts often forecast poorly and, in his latest work, has found the secret sauce of ‘Superforecasting’.

Philip Tetlock

Short Biography

Philip Tetlock was born in 1954 and grew up in Toronto. He studied psychology, gaining his BA and MA at the University of British Columbia, before moving to the US to research decision-making for his PhD at Yale.

His career has been entirely academic, with posts at the University of California, Berkeley (Assistant Professor, 1979-1995), Ohio State University (Chair of Psychology and Political Science, 1996-2001), and a return to UC Berkeley (Chair at the Haas Business School, 2002-2011). Currently, he is Annenberg University Professor at the University of Pennsylvania, where his appointment spans Psychology, Political Science, and the Wharton Business School.

Tetlock’s early books are highly academic, but he started to come to prominence with the publication, in 2005, of ‘Expert Political Judgment: How Good Is It? How Can We Know?’ This book has become highly influential by documenting the results of Tetlock’s research into the forecasting and decision-making of experts. The bottom line: the more prominent the expert, the poorer their ability to forecast accurately.

Tetlock’s most recent book, 2015’s ‘Superforecasting: The Art and Science of Prediction‘ is one of those few magic books that can change your view of the world, make you smarter, make you feel wiser, and inspire you at the same time. It is co-written with journalist Dan Gardner (whose earlier books cover Tetlock’s work [Future Babble], and that of Daniel Kahneman [Risk]) and so is also highly readable.

The Tetlock Two-step

In ‘Expert Political Judgment‘, Tetlock is a pessimist. He finds substantial evidence to warn us not to accept the predictions of pundits and experts. They are rarely more accurate than a chimp with a dartboard (okay, he actually compares them to random guessing).

Ten years later, in ‘Superforecasting’, Tetlock is an optimist. He still rejects the predictions of experts, but he has found light at the end of the predictions tunnel. The people he calls ‘Superforecasters’ are good at prediction; far better than experts, far better than chance, and highly consistent too.

If you want to understand how to make accurate predictions and reliable decisions, you need to understand Tetlock’s work.

Hedgehogs and Foxes: The Failure of Experts

In a long series of thorough tests of forecasting ability, Tetlock discovered a startling truth: experts rarely perform better than chance. Simple computer algorithms that extrapolate the status quo often outperformed them. The best human predictors were those with less narrow expertise and a broader base of knowledge. In particular, the higher the public profile of the expert, the poorer their performance as a forecaster.

This led Tetlock to borrow a metaphor from the philosopher Isaiah Berlin: the fox knows many things, but the hedgehog knows one big thing. The experts are hedgehogs: they know one thing very well, but are often outsmarted by generalists who recognise the limitations of their knowledge and therefore take a more nuanced view. This is often because experts create for themselves a big theory that they are then seduced into thinking will explain everything. Foxes have no grand theory, so they synthesise many different points of view, and therefore see the strengths and weaknesses of each one better than the hedgehogs do.

One result of Tetlock’s work was that the US Government’s Intelligence Advanced Research Projects Activity (IARPA) – a research agency serving the ‘Intelligence Community’ – set up a forecasting tournament. Eventually, Tetlock moved from helping to design and manage the tournament to participating in it.

Superforecasting: The Triumph of Collective Reflection

Tetlock, along with his wife (University of Pennsylvania Psychology and Marketing Professor Barbara Mellers), created and co-led the Good Judgment Project: a collaborative team that won the IARPA tournament consistently.

The book, Superforecasting, documents what Tetlock learned about how to forecast well. He identified ‘Superforecasters’ as people who can consistently make better predictions than other pundits. Superforecasters think in a different way. They are more thoughtful, reflective, open-minded and intellectually humble. But despite their humility, they tend to be widely read, hard-working, and highly numerate.

In a recent Tweet (at the time of writing – https://twitter.com/PTetlock/status/738667852568350720 – 3 June 2016), Tetlock said of Trump University’s ‘Talk Like a Winner’ guidelines:

Guidelines for “talking like a winner” are roughly the direct opposite of those for thinking like a superforecaster

The other characteristics that enable superforecasting, which you can implement in your own organisation’s decision-making, are:

  1. Screen forecasters for high levels of open-mindedness, rationality and fluid intelligence (reasoning skills), and low levels of superstitious thinking (Tetlock has developed a ‘Rationality Quotient’ or RQ). Also choose people with a ‘Growth Mindset’ and ‘Grit’.
  2. Collect forecasters together to work as a team
  3. Aim to maximise diversity of experiences, backgrounds, and perspectives
  4. Train them in how to work as a team effectively
  5. Good questions get good answers, so focus early effort on framing the question well to reduce bias and increase precision
  6. Understand biases and how to counter them
  7. Embrace and acknowledge uncertainty
  8. Take a subtle approach and use high levels of precision in estimating probabilities of events
  9. Adopt multiple models, and compare the predictions each one offers to gain deeper insights
  10. Start to identify the best performers, and allocate higher weight to their estimates (see the sketch after this list)
  11. Reflect on outcomes and draw lessons to help revise your processes and update your forecasts
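
Point 10, in particular, lends itself to a concrete illustration. Tetlock’s tournaments scored forecasters using Brier scores – the mean squared difference between a probability forecast and the actual outcome. Here is a minimal Python sketch of accuracy-weighted pooling; the forecaster names, data, and the inverse-Brier weighting scheme are my illustrative assumptions, not the Good Judgment Project’s actual method.

    # Score each forecaster on past questions, then weight the better
    # performers more heavily when pooling a new forecast.
    def brier(probs, outcomes):
        """Mean Brier score for binary events: lower is better, 0 is perfect."""
        return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

    # Each forecaster's past probability forecasts, and what happened (1 or 0).
    history = {
        "ann": ([0.9, 0.2, 0.8], [1, 0, 1]),
        "bob": ([0.6, 0.5, 0.4], [1, 0, 1]),
    }

    # Weight each forecaster by the inverse of their Brier score.
    weights = {name: 1 / (brier(p, o) + 1e-9) for name, (p, o) in history.items()}

    def pooled(forecasts):
        """Accuracy-weighted average probability for a new question."""
        total = sum(weights[name] for name in forecasts)
        return sum(weights[name] * p for name, p in forecasts.items()) / total

    # Ann has the better track record, so her view dominates the pool.
    print(round(pooled({"ann": 0.85, "bob": 0.60}), 2))  # ~0.82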

 

Tetlock Explaining Fox and Hedgehog Theory


Victor Vroom: Motivation and Decision-making

Why do people make the choices they do at work, and how can managers and leaders make effective decisions? These are two essential questions for managers to understand, and both were tackled with characteristic clear thinking and rigour by one man.

Victor Vroom

Short Biography

Victor Vroom was born in 1932 and grew up in the suburbs of Montreal. He was a bright child with little academic interest – unlike his two older brothers. Instead, his passion was big-band jazz and, as a teenager, he dedicated up to 10 hours a day to practising alto sax and clarinet.

On leaving school, Vroom found that moving to the US as a professional musician would be tricky, so he enrolled in college, where he learned, through psychometric testing, that the two areas of interest that would best suit him were music (no surprise) and psychology. Unfortunately, whilst he now enjoyed learning, his college did not teach psychology.

At the end of the year, he was able to transfer, with a full year’s credit, to McGill University, where he earned a BSc in 1953 and a Masters in Psychological Science (MPsSc) in 1955. He then went to the US to study for his PhD at the University of Michigan. It was awarded to him in 1958.

His first research post was at the University of Michigan, from where he moved to the University of Pennsylvania in 1960 and then, in 1963, to Carnegie Mellon University. He remained there until receiving a second offer from Yale University – this time to act as Chairman of the Department of Administrative Sciences, and to set up a graduate school of organisation and management.

He has remained there for the rest of his career, as John G Searle Professor and, currently, as BearingPoint Professor Emeritus of Management & Professor of Psychology.

Vroom’s first book was Work and Motivation (1964), which introduced the first of his major contributions: his ‘Expectancy Theory’ of motivation. He also collaborated with Edward Deci to produce a review of workplace motivation, Management and Motivation, in 1970. They produced a revised edition in 1992.

His second major contribution was the ‘Vroom-Yetton model of leadership decision making’. Vroom and Philip Yetton published Leadership and Decision-Making in 1973. He later revised the model with Arthur Jago, and together, they published The New Leadership: Managing Participation in Organizations in 1988.

It is also worth mentioning that Vroom had a bruising experience when he was pursued through the courts by an organisation he had earlier collaborated with. It won its case for copyright infringement, so I shall say no more; the judgement is available online. Vroom’s account of this, at the end of a long autobiographical essay he wrote for the Society for Industrial and Organizational Psychology (of which he was president in 1980-81), is an interesting read.

Vroom’s Expectancy Theory of Motivation

Pocketblog has covered Vroom’s expectancy theory in an earlier blog, and it is also described in detail in The Management Models Pocketbook. It is an excellent model that deserves to be far better known than it is. Possibly the reason is that Vroom chose to express his theory as an equation: bad move! Most people are scared of equations. That’s why we at Management Pocketbooks prefer the metaphor of a chain: motivation breaks down if any of the links is compromised. Take a look at our short and easy-to-follow article.
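
For the curious, the equation is commonly summarised (simplifying Vroom’s own notation) as:

    Motivation Force (F) = Expectancy (E) × Instrumentality (I) × Valence (V)

Because the terms multiply together, if any one of them falls to zero, motivation collapses to zero – which is precisely the point of the chain metaphor: one failed link breaks the whole chain.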

The Vroom-Yetton-Jago Model of Leadership Decision-making

This one is a bit of a handful. Vroom has expressed some surprise that it became a well-adopted tool and, more recently, has noted that societies – and therefore management styles – have changed, rendering it less relevant now than it was in its day. That said, it is instructive to understand the basics.

Decision-making is a leadership role, and (what I shall call) the V-Y-J model is a situational leadership model for what style of decision-making a leader should select.

It sets out the different degrees to which a manager or leader can involve their team in decision-making, and also the situational characteristics that would lead to a choice of each style.

Five levels of Group Involvement in Decision-making

Level 1: Authoritative A1
The leader makes their decision alone.

Level 2: Authoritative A2
The leader invites information and then makes their decision alone.

Level 3: Consultative C1
The leader invites group members to offer opinions and suggestions, and then makes their decision alone.

Level 4: Consultative C2
The leader brings the group together to hear their discussion and suggestions, and then makes their decision alone.

Level 5: Group Consensus
The leader brings the group together to discuss the issue, and then facilitates a group decision.

Choosing a Decision-Making Approach

The V-Y-J model sets out a number of considerations, and research indicates that when leaders choose a decision approach in line with these considerations, they self-report greater success than when the model is not followed. The considerations are listed below, followed by a simple sketch of how they might drive a style choice:

  1. How important is the quality of the decision?
  2. How much information and expertise does the leader have?
  3. How well structured is the problem or question?
  4. How important is group-member acceptance of the decision?
  5. How likely is group-member acceptance of the decision?
  6. How much do group members share the organisation’s goals (against pursuing their own agendas)?
  7. How likely is the group to be able to reach a consensus?
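
To make the logic concrete, here is a deliberately simplified sketch in Python of how answers to these questions might drive a style choice. To be clear: the rules and the names (Situation, choose_style) are my illustrative assumptions, not Vroom and Yetton’s published decision tree.

    from dataclasses import dataclass

    @dataclass
    class Situation:
        quality_matters: bool     # 1. Is decision quality important?
        leader_has_info: bool     # 2. Does the leader have the information?
        well_structured: bool     # 3. Is the problem well structured?
        acceptance_matters: bool  # 4. Is group acceptance important?
        acceptance_likely: bool   # 5. Would a solo decision be accepted?
        goals_shared: bool        # 6. Do members share organisational goals?
        consensus_likely: bool    # 7. Could the group reach a consensus?

    def choose_style(s: Situation) -> str:
        """Return a decision style: A1, A2, C1, C2 or G (group consensus)."""
        if not s.quality_matters and (not s.acceptance_matters or s.acceptance_likely):
            return "A1"  # low stakes: just decide alone
        if s.acceptance_matters and not s.acceptance_likely:
            # Acceptance is at risk, so involve the group in the decision.
            return "G" if s.goals_shared and s.consensus_likely else "C2"
        if not s.leader_has_info:
            # Gather missing input: facts alone if the problem is structured,
            # opinions and suggestions if it is not.
            return "A2" if s.well_structured else "C1"
        return "A1"  # well informed, and acceptance is safe: decide alone

    # An important, unstructured problem where the leader lacks information
    # and the group would resent being handed a solo decision:
    print(choose_style(Situation(True, False, False, True, False, True, True)))  # G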

A Personal Reflection

I have found both of Vroom’s principal models enormously helpful, both as a project leader and as a management trainer. It is somewhat sad that, in Vroom’s own words, ‘the wrenching changes at Yale and the … lawsuit have taken their emotional and intellectual toll.’ These two events created a huge mental and emotional distraction for Vroom in the late 1980s. At a time when he should still have been at the peak of his intellectual powers, he was diverted from his research, and I wonder what insights we may have lost as a result.


 

Pocketbooks you might Like

The Motivation Pocketbook – has a short introduction to Vroom’s Expectancy Theory, which it refers to as ‘Valence Theory’. It also has a wealth of other ideas about motivation.

The Management Models Pocketbook – has a thorough discussion of Expectancy Theory, and also Motivational Needs Theory, alongside eight other management models.

 

 


Philip Green: Risk & Control

Sir Philip Green is rarely out of the news. A self-made businessman, he has long been a dominant figure in the UK retail scene, with much to admire and much to criticise. When a TV audience is split 50:50 between loving and loathing a programme, it usually becomes a hit. On those grounds, Philip Green is a business hit!

Sir Philip Green

Short Biography

Philip Green was born in the Surrey (now South London) town of Croydon in 1952, where his parents were both involved in property and retail businesses. When he was twelve and away at boarding school, his father died, leaving his mother to run the family businesses – something she carried on doing into her eighties. Green, who had been used to earning money from a young age on the forecourt of the family’s petrol (gas, in the US) station, left school as soon as he could to enter the world of work.

He worked his way up through all levels of a shoe importer, discovering a real talent for selling. When he left the company, he travelled the world, learning practical lessons in business that he brought back to the UK. Through the late 1970s and early 1980s, he became adept at deals that involved buying stock nobody else wanted and selling it quickly, operating primarily in the retail apparel market. He then turned this acumen to buying and selling companies. His deals got larger and more profitable, his reputation for rapid deal-making grew, and so did his asset base.

In 2000, Green acquired BHS – the former British Home Stores – which he rapidly transferred to his wife, Tina Green. He followed this acquisition in 2002 with the purchase of the Arcadia Group of fashion retail companies, that included some of the big names on the UK high street: Topshop, Burton, Wallis, Evans, Miss Selfridge and Dorothy Perkins. This was also transferred to his wife’s name. As a Monaco resident, the tax implications of this ownership structure have attracted much criticism in the UK.

Already owning the second-largest share of the UK clothes retail market, Green tried in 2004 to acquire Marks & Spencer – the largest clothes retailer. His bid failed, with much vitriol between him and the then M&S boss, Sir Stuart Rose. In 2006, Green was knighted for services to the retail industry. The 2010 general election saw him coming out strongly for the Conservative party – a move that was reciprocated by the new Conservative/Liberal Democrat coalition with his appointment to chair a review of Government procurement – of which he was highly critical.

BHS was perhaps Green’s largest business, and its fate means his story is not one of total success. By 2012, the company’s fortunes were waning and, in March 2015, Green sold the now loss-making business – debt-free, but with substantial pension liabilities – for £1.

As a multi-billionaire (jointly with his wife), Green’s spending and tax affairs attract as much media attention as his business activities. He is famed for lavish parties (spending several million pounds at a time) and equally known for his charitable and philanthropic giving. Forbes rated the couple’s joint wealth in 2015 at US $5 billion.

Business Lessons from Sir Philip Green

Whatever your view of him, Sir Philip has a talent for making decisions and turning a profit. Here are some lessons I draw from his experiences and choices.

Pace and Decisiveness

Green built his business on fast deals: rapidly doing the deal (often making a multi-million pound acquisition in days) and quickly turning that deal into a profit. Yes, Green is adept at risk-taking, but taking risks is not, in itself, a secret of success. What matters is quickly assessing the risk and understanding your own capacity to handle it – and at this, Green was a master, particularly during the 1980s and 90s.

The Rich get Richer

Money begets money, and Green used a (conceptually) very simple ploy time and time again to grow his wealth. He would convince banks to lend him money to make his acquisitions – of stock in the early days, and of businesses later – and then turn a profit and repay his debt quickly. On one occasion, in 1985, he bought a bankrupt business with a large loan, traded for a short while, and sold it six months later for nearly twice as much as he’d borrowed. He then went to his bank and asked, ‘What do I owe you?’ They replied, ‘£3,430,000’, and Green wrote a cheque there and then, putting it on the counter and saying, ‘Done.’

Discipline and Control

Green pays fiendish attention to every detail of his business, devoting much of his energy to driving efficiency into every last nook and cranny. Why, then, did BHS fail? I wish I could ferret that one out, because his regal progresses through his Oxford Street empire of shops are well known within the business for rooting out even tiny discrepancies in the selling process.

Customers first: Owners second

Perhaps Green’s most closely held business belief is that shareholders drive the wrong decisions: everything should be about giving your customers what they want, rather than pandering to shareholders. This is why he turned both BHS and Arcadia from publicly listed into privately owned companies. Maybe it is also why BHS failed for him: he could no longer figure out how to give customers what they want in a general-purpose, multi-product store. It will be interesting to see if, and how, its new owners can square the circle that Green could not.

And…

Of course there are other things too, but most of them are what any manager would tell you are obvious ‘no-brainer’ habits: know your business inside out, respect and trust your people, keep working hard, stay alert for opportunities, and protect your supply chain. The fact that Green does all of these does not make him different from many other successful business leaders. It is the fact that he does them well and consistently, on top of the differentiators, that makes him exceptional.


Daniel Kahneman: Judgement and Bias

Daniel Kahneman has won many awards and honours, but none more surprising, perhaps, than a Nobel Prize. Why is this surprising? Kahneman is, after all, one of the most eminent and influential psychologists of his time. It is surprising because there is no Nobel Prize for psychology: Kahneman was co-recipient of the 2002 Nobel Prize in Economics ‘for having integrated insights from psychological research into economic science, especially concerning human judgement and decision making under uncertainty’.

In short, what Kahneman taught us was that, before he and his co-worker, Amos Tversky (who sadly died six years before the Nobel Committee considered this prize and so was not eligible), had started to study human decision making, all economic theories were based on the same, false assumption. Kahneman and Tversky taught us that human beings are not rational agents when we make economic decisions: we are instinctive, intuitive, biased decision makers.

And, if that sounds pretty obvious to us now, then we have Kahneman and Tversky, and their long walks together, to thank.

Daniel Kahneman

 

Short Biography

Daniel Kahneman was born in 1934 to Lithuanian émigré parents living in Paris (although he was born while they were visiting family in Tel Aviv). When Nazi Germany occupied France, the family went on the run, ending up after the war in what was then (1948) Palestine under the British Mandate, shortly before the formation of the State of Israel.

In 1954, he gained his BSc in Psychology and Maths from the Hebrew University, and joined the psychology department of the Israeli Defence Forces, helping with officer selection. Four years later, he went to the University of California, Berkeley, where he was awarded his PhD in 1961; he returned to the Hebrew University that same year.

It was in 1968, while hosting a seminar, that he met Amos Tversky. They started collaborating shortly afterwards. Their fertile discussions often involved thought experiments about how we make decisions and judgements, uncovering in themselves a series of heuristics – or thinking shortcuts – which they went on to observe in controlled laboratory experiments. Their collaboration continued until Tversky’s death in 1996.

In that time, they collaborated with other researchers, most notably, Paul Slovic and economists Richard Thaler and Jack Knetsch. Their many insights into how we make judgements and the application to economic decision-making eventually led to the Nobel Committee recognising Kahneman with the 2002 Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel.

Kahneman’s 2011 book, Thinking, Fast and Slow is a summary of a remarkable life’s work. If the ideas are new to you, they may well rock your world. It is not an easy read, but it is remarkably well-written for an intelligent lay audience. Even if Kahneman’s work is familiar to you, this book will repay close reading.

Kahneman’s Ideas

There is far too much in Kahneman’s work to even begin to summarise it, so I want to focus on three biases that he discovered, which have a profound impact on the choices we make; often leading us far astray.

The Anchoring Bias

The first information we get biases any subsequent choices we make. Your father was right – or your mother, or anyone else who told you at a young age that first impressions count. Systematically, the human brain takes the first information it receives and creates an interpretation of everything else that is anchored in the inferences it draws from that first impression. In management terms, this accounts for the ‘horns and halo’ effect, which biases us to seek and spot evidence that confirms our pre-existing assessment.

The Representativeness Bias

Who is a more likely person to find working in a car repair shop, changing your brakes? Is it A: a young woman with blond hair and pink eyeliner, or B: a young woman with blond hair and pink eyeliner, whose father owns the car repair shop?

If you think B, you have fallen for the representativeness bias. The story makes more sense in our experience, doesn’t it? A young woman with blond hair and pink eyeliner is not a person you’d expect to see in that environment, but one whose father owns the car repair shop may feel right at home. Statistically, though, this is rubbish: of all the young women with blond hair and pink eyeliner, only a small proportion will also have fathers who own car repair shops.
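
Kahneman and Tversky called this the ‘conjunction fallacy’, and the arithmetic behind it is simple. For any two attributes A and B:

    P(A and B) ≤ P(A)

Everyone who fits both descriptions also fits the first one, so the more detailed, more plausible-sounding story can never be the more probable one.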

The Availability Bias

Recent events bias our perception of risk. They are more available to recall, and hence have a stronger impact on our intuition, than counter-examples do. The classic example is the perception of the risk of train travel after a train crash. Trains are safe: they rarely crash. Cars crash a lot: there are many accidents every day. But car accidents are rarely reported, so we have no immediate, intuitive sense of the statistics.

The Impact of Kahneman’s Work

Kahneman’s work has had a huge impact. Decision theory existed before he came along, but he and Tversky revolutionised it. And it was Kahneman, along with Tversky, Knetsch and Thaler, who pretty much invented the discipline of behavioural economics – with the friendship between Kahneman and Thaler perhaps the relationship that drove that development.

Now behavioural economics infuses much of public policy, and much of the social influence that corporations try to exert over us. Thaler’s book Nudge (with Cass Sunstein) is a best-seller, and Thaler and Sunstein both advise Prime Ministers and Presidents. Next time you get a document from Government, or go into a store, and find yourself complying with their wishes without thinking, there is a chance that you have been ‘nudged’. And the roots of these ‘choice architectures’? They lie in understanding our heuristics and biases. That was Kahneman’s contribution.

Kahneman at TED

Here is Daniel Kahneman, talking about how we perceive happiness and discomfort.

[ted id=779]

 


Six times Four: More de Bono

Last week, I discussed Edward de Bono’s (or maybe his and others’) Six Thinking Hats.  In my blog title, I described his mind as fertile and that fertility led, step by step, from:

  1. Six Thinking Hats (1985) to:
  2. Six Action Shoes (1991)
  3. Six Value Medals (2005)
  4. Six Frames – for thinking about information (2008)

We’ve listed the six hats.  Let’s do the same for the others.  Whilst I own copies of Six Action Shoes and Six Value Medals, it was only in researching this blog that I learned about the newest book here, so I am indebted to Professor Tortoise for the primer in the Six Frames.

Six Action Shoes

Six Action Shoes - de Bono

Navy Formal Shoes
Represent formal routines, processes and procedures.

Grey Sneakers
Represent exploring, investigating and gathering information.

Brown Practical Brogues
Represent practical, pragmatic, roll-your-sleeves-up action.

Orange Gumboots
Represent safety-conscious activities and emergency action.

Pink Comfy Slippers
Represent caring, concerned, compassionate and sensitive action.

Purple Riding Boots
Represent leadership, authority and command.

Six Value Medals

Six Value Medals - de Bono

Gold Medal – Human Values
Values relating to putting people first.

Silver Medal – Organisational Values
Values relating to your organisation’s purpose.

Steel Medal – Quality Values
Values relating to your product, service or function.

Glass Medal – Creativity Values
Values relating to creating, innovating and simplicity.

Wood Medal – Environmental Values
Values relating to sustainability and impact on the community and on society.

Brass Medal – Perceptual Values
Values relating to the way things appear.

Six Frames for Thinking

Six Thinking Frames - de Bono

Triangle Frame – Purpose
Understanding the information at hand – the What, the Why and the Where.

Circle Frame – Accuracy
Is the information consistent, accurate and adequate for our needs (to solve a problem or make a decision, for example)?

Square frame – Perspectives
We can look at information and a situation from different points of view, with different biases and prejudices.  Which ones are present?

Heart Frame – Interest
Focuses our attention on the relevant, salient, interesting information that matters most to us.

Diamond Frame – Value
How do we evaluate the value of our information?  We can use the six value medals to prioritise its importance.

Slab Frame – Conclusions
What does the information tell us and, crucially, what next?


Three ways to get it wrong

In last week’s post we discussed some of the decision-making traps that board members – or, indeed, any decision-making group – can fall into.

At the heart of our understanding of these biases is the work of Daniel Kahneman, who was awarded the Nobel Prize for Economics in 2002 for work carried out with his co-worker Amos Tversky. Sadly, Tversky died in 1996 and so, under Nobel rules, was ineligible for the prize.

Behavioural Economics

Kahneman is perhaps the leading psychologist in the field of behavioural economics – very much a field du jour. His research was carried out with many collaborators, including Paul Slovic, an expert in the perception of risk, and Richard Thaler, most notable for his use of the term ‘nudge’ to describe how we can use perceptions to shift behaviour.

The classic paper that Kahneman and Tversky wrote was ‘Judgment under Uncertainty: Heuristics and Biases’, published in the journal Science in September 1974.

Heuristic: A rule of thumb or simple procedure for reaching a decision or solving a problem.

In this article, they introduced three important heuristics, which guide many of our decisions – and frequently let us down.

Representativeness

We tend to believe a possible event is more likely when we can recognise it as part of a familiar pattern. It is as if we like to create stories about our world that follow standard arcs or plots (see, for example, Christopher Booker’s wonderfully argued The Seven Basic Plots: Why We Tell Stories). If a potential event slips easily into one of these plots, we rate it as more likely than if the plot seems to need adjusting.

Availability

Recent, salient examples render a possible event more likely in our minds than other events that do not trigger such easy recall. Immediately after a rail accident, people fear rail travel more than normal and take to their cars. The resulting spike in road deaths usually exceeds the death toll of the rail accident itself.

Anchoring

We make estimates and decisions from a starting point, and the point we choose can bias our estimate or decision. Surprisingly, even an unrelated figure presented at random can skew a later numerical estimate: in Kahneman and Tversky’s famous demonstration, spinning a wheel of fortune before asking people to estimate the percentage of African countries in the UN pulled their answers towards whatever number the wheel showed.

Kahneman won’t stand still

In 2007, Kahneman received the American Psychological Association’s Award for Outstanding Lifetime Contributions to Psychology and, this year, he was included in the Bloomberg 50 list of the most influential people in global finance. He says that all he knows about economics he learned from co-workers like Richard Thaler. He is a much sought-after speaker and commentator in the business arena, and you can find recent work documented on TED (see below), in an extended interview, and in two excellent articles on the websites of the prominent global strategy consulting firms McKinsey and Booz & Company.

Daniel Kahneman: The riddle of experience vs. memory

Using examples from vacations to colonoscopies, Kahneman reveals how our “experiencing selves” and our “remembering selves” perceive happiness, and why experience does not influence decisions, in this 2010 TED talk.

[youtube=http://www.youtube.com/watch?v=XgRlrBl-7Yg]

Decision Failure

Young Apprentice candidate, Hannah Richards

In episode 3 of the current series of Young Apprentice, the candidate Hannah Richards lost her place in the competition because of her poor decision-making in the boardroom.

In Hannah’s case, she let personal loyalties and enmities override good judgement, but this is not the only reason for failed boardroom decisions. In fact, there is a whole array of decision-making traps available to us. Let’s look at a small sample:

1. The Anchoring Trap

In a discussion at a board meeting, if the first speaker has a strong opinion, they can sway the whole tone of the debate, focusing it not on what is right, but on the extent to which the first speaker is right. This ‘first speaker advantage’ can lead to poor decisions when the first speaker takes an extreme position.

2. The Confirming Evidence Trap

When a Board approaches a consensus view, it will rarely discuss information that conflicts with that view, focusing instead on evidence that confirms it.

3. The Sunk Cost Trap

Board members invest a lot of political and social capital in key decisions that can make those decisions hard to reverse if the situation changes or new information comes to light.  Errors can be perpetuated and good money thrown after bad.

4. The Seduction of Appearances Trap

Beautifully and eloquently presented evidence often carries more weight than more robust but less attractively presented evidence.  This is one of the reasons why PowerPoint-style presentations are a dangerous component of any board meeting.

5. The Prudence Trap

Caution is wise.  Risk is dangerous.  Uncertainty is risky.  Prudence is called for.  But risk can be managed and the status quo is also a dangerous strategy in fast-changing times.  Risk-taking is neither good nor bad, but a strategy to discuss and evaluate in the light of all options and all mitigating strategies.

Good Decision Making is an Art

… and a science too. There are a lot of tools you can draw upon, and it is also important to understand the vital role of intuition when you are operating in complex environments where you have substantial relevant experience.

Management Pocketbooks you Might Enjoy

The Decision-Making Pocketbook

The Decision-Making Pocketbook is filled with practical tools to support decision making.

Also try:

The Problem Solving Pocketbook

The Thinker’s Pocketbook


Learning Decision Making from Dr House

At its best, television can inspire and educate. It can also make us think. Some of us mourn the loss of House from free-to-air TV in the UK – ho hum: there are always DVDs.

So what can you learn from House? Masses, it seems – there is even a book, House and Philosophy, to guide you, part of the Blackwell Philosophy and Pop Culture series.

Introducing Dr House

For those who didn’t catch the US drama series, House is an ornery, arrogant, self-centred and devious doctor who specialises in diagnosing and treating the most mysterious of cases that come into his hospital.

He is played by Hugh Laurie, about whom one American writer said that he ‘does a terrific job with his British accent on Jeeves and Wooster.’


Don’t Care

House appears not to care about his patients – his concern is for solving the case. Whilst some episodes offer exceptions to this rule, a distinct illustration of it is when he risks brain damage to a boy in order to find the evidence that will allow him to save the boy’s life. When challenged about this, he replies to the effect that he does not worry about things he cannot do anything about.

Agent Regret

For most of us, however, the consequences of our decisions weigh heavily.  Regardless of our intention, if the outcome is bad, we have to live with the guilt.  This is known to philosophers as ‘agent regret’.  For House, the ends entirely justify the means, but it works both ways.

Moral Luck

If House makes a wrong decision – or indeed one of his subordinates does – it is not enough, for him, to hide behind ‘we followed procedure’.  In judging responsibility, it is again the outcome that matters.

A Good Decision

When we think about decision-making in organisations, we talk about a ‘good decision’ as one that can be defended.  It requires three things:

  1. The decision maker or makers have the authority and expertise to make the decision
  2. The decision maker or makers have the best information available
  3. The process that the decision maker or makers follow is sound – it is transparent, logical, and fair

But a good decision is not the same as the right decision. We require good decisions because they appear to maximise our chances of getting it right – and because we cannot demand that every decision turns out to be right.

How much do your decisions matter?

Agent Regret seems to me to be a fancy philosophers’ phrase for conscience.  Knowing about it can have two effects:

  • It could freeze you to the spot and stop you making a decision
  • It could galvanise you to take just a little more time to look for one more fact, or conduct one more test, before finally saying ‘go’.

So here’s the deal

Of course, when House takes the latter course, it usually works out.  Real life is rarely as obliging.  But even so, what is there to lose if you make one last check?

Management Pocketbooks you may enjoy


The Decision Making Pocketbook will give you a sound process and a range of useful tools to help you make your decisions. They won’t prevent Agent Regret if you get it wrong, but they will limit your regret to the consequences, rather than to ignorance or negligence.

You may also enjoy:
