Monday, August 13, 2012

Hidden flaws in strategy

Can insights from behavioral economics explain why good executives back bad strategies?

McKinsey Quarterly
MAY 2003 • Charles Roxburgh

After nearly 40 years, the theory of business strategy is well developed and widely disseminated. Pioneering
work by academics such as Michael E. Porter and Henry Mintzberg has established a rich literature on good strategy. Most senior executives have been trained in its principles, and large corporations have their own skilled strategy departments.

Yet the business world remains littered with examples of bad strategies. …Flawed analysis, excessive ambition, greed, and other corporate vices are possible causes, but this article doesn’t attempt to explore all of them. Rather, it looks at one contributing factor that affects every strategist: the human brain.

The brain is a wondrous organ. As scientists uncover more of its inner workings through brain-mapping techniques,1 our understanding of its astonishing abilities increases. But the brain isn’t the rational calculating machine we sometimes imagine. … But whatever the root cause, the brain can be a deceptive guide for rational decision making.

The basic assumption of modern economics—rationality—does not stack up against the evidence. …
Insights from behavioral economics have been used to explain bad decision making in the business world,2 and bad investment decision making in particular. … Likewise, behavioral economics has been applied to personal finance,3 thereby providing an easier route to making money than any hot stock tip. However, the field hasn’t permeated the day-to-day world of strategy formulation.

This article aims to help rectify that omission by highlighting eight4 insights from behavioral economics that best explain some examples of bad strategy. …

Behavioral economics tells us that the mistakes made in the late 1990s were exactly the sorts of errors our brains are programmed to make—and will probably make again.

Flaw 1: Overconfidence

Our brains are programmed to make us feel overconfident. This can be a good thing; for instance, it requires great confidence to launch a new business. … The world would be duller and poorer if our brains didn’t inspire great confidence in our own abilities. But there is a downside when it comes to formulating and judging strategy.

The brain is particularly overconfident of its ability to make accurate estimates. Behavioral economists often illustrate this point with simple quizzes: guess the weight of a fully laden jumbo jet or the length of the River Nile, say. Participants are asked to offer not a precise figure but rather a range within which they are 90 percent confident the true answer lies—for example, that the Nile is between 2,000 and 10,000 miles long. … Most of us are unwilling and, in fact, unable to reveal our ignorance by specifying a very wide range. Unlike John Maynard Keynes, most of us would rather be precisely wrong than vaguely right.

We also tend to be overconfident of our own abilities.5 This is a particular problem for strategies based on assessments of core capabilities. …

Related to overconfidence is the problem of overoptimism. … The twin problems of overconfidence and overoptimism can have dangerous consequences when it comes to developing strategies, as most of them are based on estimates of what may happen—too often on unrealistically precise and overoptimistic estimates of uncertainties. …

There are ways to counter the brain’s overconfidence:
  1. Test strategies under a much wider range of scenarios. But don’t give managers a choice of three, as they are likely to play safe and pick the central one. For this reason, the pioneers of scenario planning at Royal Dutch/Shell always insisted on a final choice of two or four options.6
  2. Add 20 to 25 percent more downside to the most pessimistic scenario.7 Given our optimism, the risk of getting pessimistic scenarios wrong is greater than that of getting the upside wrong. … (A sketch of this adjustment follows the list.)
  3. Build more flexibility and options into your strategy to allow the company to scale up or retrench as uncertainties are resolved. Be skeptical of strategies premised on certainty.
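
To make the first two countermeasures concrete, here is a minimal sketch in Python of stress-testing a strategy across a wider set of scenarios, with the most pessimistic case given a further 20 to 25 percent of downside. The scenario names, cash flows, discount rate, and the npv helper are all hypothetical, chosen purely for illustration and not taken from the article.

```python
# Illustrative only: hypothetical scenarios and cash flows.

def npv(cash_flows, rate=0.10):
    """Net present value of annual cash flows received in years 1, 2, 3, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Four scenarios rather than three, so there is no "safe" central option to pick.
scenarios = {
    "boom":        [30, 40, 50, 60],
    "base_high":   [20, 25, 30, 35],
    "base_low":    [10, 12, 15, 18],
    "pessimistic": [5, 0, -5, -10],
}

# Countermeasure 2: add 20-25 percent more downside to the most pessimistic case.
extra_downside = 0.25
scenarios["pessimistic_widened"] = [
    cf - abs(cf) * extra_downside for cf in scenarios["pessimistic"]
]

initial_investment = 80
for name, cash_flows in scenarios.items():
    print(f"{name:20s} NPV = {npv(cash_flows) - initial_investment:7.1f}")
```

The point is mechanical: widening the worst case and looking at the full spread of outcomes makes it harder for an unrealistically precise central estimate to carry the decision.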

Flaw 2: Mental accounting

Richard Thaler, a pioneer of behavioral economics, coined the term "mental accounting," defined as "the inclination to categorize and treat money differently depending on where it comes from, where it is kept, and how it is spent."8 Gamblers who lose their winnings, for example, typically feel that they haven’t really lost anything, though they would have been richer had they stopped while they were ahead. …

Avoiding mental accounting traps should be easier if you adhere to a basic rule: that every pound (or dollar or euro) is worth exactly that, whatever the category. In this way, you will make sure that all investments are judged on consistent criteria, and you will be wary of spending that has been reclassified. Be particularly skeptical of any investment labeled "strategic."

Flaw 3: The status quo bias

In one classic experiment,9 students were asked how they would invest a hypothetical inheritance. Some received several million dollars in low-risk, low-return bonds and typically chose to leave most of the money alone. The rest received higher-risk securities—and also left most of the money alone. What determined the students’ allocation in this experiment was the initial allocation, not their risk preference. People would rather leave things as they are. One explanation for the status quo bias is aversion to loss—people are more concerned about the risk of loss than they are excited by the prospect of gain. The students’ fear of switching into securities that might end up losing value prevented them from making the rational choice: rebalancing their portfolios.

A similar bias, the endowment effect, gives people a strong desire to hang on to what they own; the very fact of owning something makes it more valuable to the owner. Richard Thaler tested this effect with coffee mugs imprinted with the Cornell University logo. Students given one of them wouldn’t part with it for less than $5.25, on average, but students without a mug wouldn’t pay more than $2.75 to acquire it. The gap implies an incremental value of $2.50 from owning the mug.

The status quo bias, the aversion to loss, and the endowment effect contribute to poor strategy decisions in several ways. First, they make CEOs reluctant to sell businesses. McKinsey research shows that divestments are a major potential source of value creation but a largely neglected one.10 CEOs are prone to ask, "What if we sell for too little—how stupid will we look when this turns out to be a great buy for the acquirer?"  …

These phenomena also make it hard for companies to shift their asset allocations. Before the recent market downturn, the UK insurer Prudential decided that equities were overvalued and made the bold decision to rebalance its fund toward bonds. Many other UK life insurers, unwilling to break with the status quo, stuck with their high equity weightings and have suffered more severe reductions in their solvency ratios.

This isn’t to say that the status quo is always wrong. Many investment advisers would argue that the best long-term strategy is to buy and hold equities (and, behavioral economists would add, not to check their value for many years, to avoid feeling bad when prices fall). In financial services, too, caution and conservatism can be strategic assets. The challenge for strategists is to distinguish between a status quo option that is genuinely the right course and one that feels deceptively safe because of an innate bias.

To make this distinction, strategists should take two approaches:
  1. Adopt a radical view of all portfolio decisions. View all businesses as "up for sale." Is the company the natural parent, capable of extracting the most value from a subsidiary? View divestment not as a failure but as a healthy renewal of the corporate portfolio.
  2. Subject status quo options to a risk analysis as rigorous as the one that change options receive. Most strategists are good at identifying the risks of new strategies but less good at seeing the risks of failing to change. (A sketch of such a side-by-side comparison follows.)
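
As a rough illustration of the second point, the following sketch, using entirely hypothetical option values and probabilities, subjects the "do nothing" option to exactly the same downside analysis as the change option instead of treating it as risk-free.

```python
# Illustrative only: hypothetical values of two options under three scenarios.

scenario_probabilities = {"favourable": 0.3, "central": 0.4, "adverse": 0.3}

# Estimated value of each option (arbitrary units) in each scenario.
option_values = {
    "keep_status_quo":  {"favourable": 110, "central": 100, "adverse": 60},
    "rebalance_assets": {"favourable": 95,  "central": 100, "adverse": 95},
}

for option, values in option_values.items():
    expected = sum(scenario_probabilities[s] * v for s, v in values.items())
    worst = min(values.values())
    print(f"{option:18s} expected = {expected:6.1f}  worst case = {worst}")
```

On these made-up numbers the status quo looks comfortable in the central scenario but carries the worse tail, which is exactly the kind of exposure the bias tends to hide.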

Flaw 4: Anchoring

One of the more peculiar wiring flaws in the brain is called anchoring. Present the brain with a number and then ask it to make an estimate of something completely unrelated, and it will anchor its estimate on that first number. The classic illustration is the Genghis Khan date test. Ask a group of people to write down the last three digits of their phone numbers, and then ask them to estimate the date of Genghis Khan’s death. Time and again, the results show a correlation between the two numbers; people assume that he lived in the first millennium, when in fact he lived from 1162 to 1227.

Anchoring can be a powerful tool for strategists. In negotiations, naming a high sale price for a business can help secure an attractive outcome for the seller, as the buyer’s offer will be anchored around that figure. Anchoring works well in advertising too. Most retail-fund managers advertise their funds on the basis of past performance. …  By citing the past-performance record, though, the manager anchors the notion of future top-quartile performance to it in the consumer’s mind.

However, anchoring—particularly becoming anchored to the past—can be dangerous. Most of us have long believed that equities offer high real returns over the long term, an idea anchored in the experience of the past two decades. …Our expectations about equity returns have been seriously distorted by recent experience. …

Besides remaining unswayed by the anchoring tactics of others, strategists should take a long historical perspective. Put trends in the context of the past 20 or 30 years, not the past 2 or 3; for certain economic indicators, such as equity returns or interest rates, use a very long time series of 50 or 75 years. …

Flaw 5: The sunk-cost effect

A familiar problem with investments is called the sunk-cost effect, otherwise known as "throwing good money after bad." When large projects overrun their schedules and budgets, the original economic case no longer holds, but companies still keep investing to complete them. …

Why is it so hard to avoid? One explanation is based on loss aversion: we would rather spend an additional $10 million completing an uneconomic $110 million project than write off $100 million. Another explanation relies on anchoring: once the brain has been anchored at $100 million, an additional $10 million doesn’t seem so bad.

What should strategists do to avoid the trap?
  1. Apply the full rigor of investment analysis to incremental investments, looking only at incremental prospective costs and revenues. This is the textbook response to the sunk-cost fallacy, and it is right (a sketch follows this list).
  2. Be prepared to kill strategic experiments early. In an increasingly uncertain world, companies will often pursue several strategic options.11 Successfully managing a portfolio of them entails jettisoning the losers. The more quickly you get out, the lower the sunk costs and the easier the exit.
  3. Use "gated funding" for strategic investments, much as pharmaceutical companies do for drug development: release follow-on funding only once strategic experiments have met previously agreed targets.
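
For the first remedy, here is a minimal sketch reusing the hypothetical $110 million project from above: the decision to spend the final $10 million rests only on incremental prospective costs and revenues, with the $100 million already spent treated as sunk. The revenue figure is invented for illustration.

```python
# Illustrative only: figures echo the hypothetical $110m project in the text.

sunk_cost = 100.0          # already spent; irrelevant to the go-forward decision
remaining_cost = 10.0      # incremental spend needed to finish the project
prospective_revenue = 7.0  # present value of future revenue if completed (assumed)

# Textbook sunk-cost test: compare only incremental cash out and cash in.
incremental_value = prospective_revenue - remaining_cost

if incremental_value > 0:
    decision = "complete the project"
else:
    decision = "stop now and write off the sunk cost"

print(f"Incremental value: {incremental_value:+.1f}m -> {decision}")

# The trap is to reason from the 100 already spent ("we cannot waste it"),
# which makes completion look as if it rescues the earlier outlay; it cannot.
```

The same discipline underpins gated funding: each gate simply repeats this incremental test with the latest estimates before releasing follow-on money.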

Flaw 6: The herding instinct

… This desire to conform to the behavior and opinions of others is a fundamental human trait and an accepted principle of psychology.12 Warren Buffett put his finger on this flaw when he wrote, "Failing conventionally is the route to go; as a group, lemmings may have a rotten image, but no individual lemming has ever received bad press."13 For most CEOs, only one thing is worse than making a huge strategic mistake: being the only person in the industry to make it.

… At times of mass enthusiasm for a strategic trend, pressure to follow the herd rather than rely on one’s own information and analysis is almost irresistible. Yet the best strategies break away from the trend. Some actions may be necessary to match the competition—imagine a bank without ATMs or a good on-line banking offer. But these are not unique sources of strategic advantage, and finding such sources is what strategy is all about. "Me-too" strategies are often simply bad ones.14 Seeking out the new and the unusual should therefore be the strategist’s aim. Rather than copying what your most established competitors are doing, look to the periphery15 for innovative ideas, and look outside your own industry.

Initially, an innovative strategy might draw skepticism from industry experts. They may be right, but as long as you kill a failing strategy early, your losses will be limited, and when they are wrong, the rewards will be great.

Flaw 7: Misestimating future hedonic states

What does it mean, in plain English, to misestimate future hedonic states? Simply that people are bad at estimating how much pleasure or pain they will feel if their circumstances change dramatically. … People adjust surprisingly quickly, and their level of pleasure (hedonic state) ends up, broadly, where it was before.

This research strikes a chord with anyone who has studied compensation trends in the investment-banking industry. Ever-higher compensation during the 1990s led only to ever-higher expectations—not to a marked change in the general level of happiness on the Street. …

Another illustration of our poor ability to judge future hedonic states in the business world is the way we deal with a loss of independence. More often than not, takeovers are seen as the corporate equivalent of death, to be avoided at all costs. Yet sometimes they are the right move. …

… We do seem very bad at estimating how we would feel if our circumstances changed dramatically—changes in corporate control, like changes in our personal health or wealth.

How can the strategist avoid this pitfall?
  1. In takeovers, adopt a dispassionate and unemotional view. Easier said than done—especially for a management team with years of committed service to an institution and a personal stake in the status quo. Nonexecutives, however, should find it easier to maintain a detached view.
  2. Keep things in perspective. Don’t overreact to apparently deadly strategic threats or get too excited by good news. During the high and low points of the crisis at Lloyd’s of London in the mid-1990s, the chairman used to quote Field Marshal Slim—"In battle nothing is ever as good or as bad as the first reports of excited men would have it." …

Flaw 8: False consensus

People tend to overestimate the extent to which others share their views, beliefs, and experiences—the false-consensus effect. Research shows many causes, including these:
  • confirmation bias, the tendency to seek out opinions and facts that support our own beliefs and hypotheses
  • selective recall, the habit of remembering only facts and experiences that reinforce our assumptions
  • biased evaluation, the quick acceptance of evidence that supports our hypotheses, while contradictory evidence is subjected to rigorous evaluation and almost certain rejection
  • groupthink,16 the pressure to agree with others in team-based cultures

… False consensus, which ranks among the brain’s most pernicious flaws, can lead strategists to miss important threats to their companies and to persist with doomed strategies. But it can be extremely difficult to uncover—especially if those proposing a strategy are strong role models. We are easily influenced by dominant individuals and seek to emulate them. This can be a force for good if the role models are positive. But negative ones can prove an irresistible source of strategic error.

Many of the worst financial-services strategies can be attributed to over-dominant individuals. …

The dangers of false consensus can be minimized in several ways:
  1. Create a culture of challenge. As part of the strategic debate, management teams should value open and constructive criticism. … CEOs and strategic advisers should understand criticisms of their strategies, seek contrary views on industry trends, and, if in doubt, take steps to assure themselves that opposing views have been well researched. They shouldn’t automatically ascribe to critics bad intentions or a lack of understanding.
  2. Ensure that strong checks and balances control the dominant role models. A CEO should be particularly wary of dominant individuals who dismiss challenges to their own strategic proposals; the CEO should insist that these proposals undergo an independent review by respected experts. The board should be equally wary of a domineering CEO.
  3. Don’t "lead the witness." Instead of asking for a validation of your strategy, ask for a detailed refutation. … Establish a "challenger team" to identify the flaws in the strategy being proposed by the strategy team.

An awareness of the brain’s flaws can help strategists steer around them. All strategists should understand the insights of behavioral economics just as much as they understand those of other fields of the "dismal science." Such an understanding won’t put an end to bad strategy; greed, arrogance, and sloppy analysis will continue to provide plenty of textbook cases of it. Understanding some of the flaws built into our thinking processes, however, may help reduce the chances of good executives backing bad strategies.

About the Author

Charles Roxburgh is a director in McKinsey’s London office.

Notes

1See Rita Carter, Mapping the Mind, London: Phoenix, 2000.
2See, for example, J. Edward Russo and Paul J. H. Schoemaker, Decision Traps: The Ten Barriers to Brilliant Decision-Making and How to Overcome Them, New York: Fireside, 1990; and John S. Hammond III, Ralph L. Keeney, and Howard Raiffa, "The hidden traps in decision making," Harvard Business Review, September–October 1998, pp. 47–57.
3See Gary Belsky and Thomas Gilovich, Why Smart People Make Big Money Mistakes and How to Correct Them, New York: Simon and Schuster, 1999.
4This is far from a complete list of all the flaws in the way we make decisions. For a full description of the irrational biases in decision making, see Jonathan Baron, Thinking and Deciding, New York: Cambridge University Press, 1994.
5In a 1981 survey, for example, 90 percent of Swedes described themselves as above-average drivers.
6See Pierre Wack’s two-part article, "Scenarios: Uncharted waters ahead," Harvard Business Review, September–October 1985, pp. 73–89; and "Scenarios: Shooting the rapids," Harvard Business Review, November–December 1985, pp. 139–50.
7This rule of thumb was suggested by Belsky and Gilovich in Why Smart People Make Big Money Mistakes and How to Correct Them.
8See Belsky and Gilovich.
9This is a simplified account of an experiment conducted by William Samuelson and Richard Zeckhauser, described in "Status-quo bias in decision making," Journal of Risk and Uncertainty, 1, March 1988, pp. 7–59.
10See Lee Dranikoff, Tim Koller, and Antoon Schneider, "Divestiture: Strategy’s missing link," Harvard Business Review, May–June 2002, pp. 75–83; and Richard N. Foster and Sarah Kaplan, Creative Destruction: Why Companies That Are Built to Last Underperform the Market—and How to Successfully Transform Them, New York: Currency/Doubleday, 2001.
11See Eric D. Beinhocker, "Robust adaptive strategies," Sloan Management Review, spring 1999, pp. 95–106; Hugh Courtney, Jane Kirkland, and Patrick Viguerie, "Strategy under uncertainty," Harvard Business Review, November–December 1997, pp. 67–79; and Lowell L. Bryan, "Just-in-time strategy for a turbulent world," The McKinsey Quarterly, 2002 Number 2 special edition: Risk and resilience, pp. 16–27.
12See Belsky and Gilovich.
13Warren Buffett, "Letter from the chairman," Berkshire Hathaway Annual Report, 1984.
14See Philipp M. Nattermann, "Best practice ≠ best strategy," The McKinsey Quarterly, 2000 Number 2, pp. 22–31.
15See Foster and Kaplan.
16Famously described in Irving Lester Janis’s study of the Bay of Pigs and Cuban missile crises, among others. See his book Groupthink: Psychological Studies of Policy Decisions and Fiascoes, Boston: Houghton Mifflin, June 1982.
