Three Mistakes We Make With Models

Imagine that you live in Australia and you would like to eat a good, genuine bagel. After fairly extensive research, I have discovered that there are two places you can go. One is called Bagel Nook, and it’s here in Brisbane. Not many people know about it, and one of the reasons is that it’s really hard to find.

Here it is on a map:

[Map: Bagel Nook’s location on Creek Street, Brisbane]

Here’s why it’s hard: its address is Creek Street, but you can’t actually reach it from Creek Street. To get to Bagel Nook, you have to go down that tiny little laneway that comes off Adelaide Street. It’s a classic example of the map not being the territory.

The other place in Australia with good bagels is Glicks in Melbourne. It’s also on a tiny hard-to-find street, so to get there you need an equally detailed map.

Now, imagine making an Australian Bagel Road Trip, travelling from Bagel Nook to Glicks. If you start with the map showing Bagel Nook and stick with maps of that scale, you’ll need roughly 8,335 pages to cover the whole trip.
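
The exact page count depends on what you assume about the map, but the shape of the arithmetic is easy to see. Here is a minimal back-of-envelope sketch in Python; the route length, corridor width, and per-page coverage are all made-up illustrative numbers, not measurements:

```python
# Rough estimate of how many street-directory pages a Brisbane-to-
# Melbourne road trip would need. Every number here is an assumption.

ROUTE_KM = 1700        # approximate road distance, Brisbane to Melbourne
CORRIDOR_KM = 30       # assumed width of the strip of country you map
PAGE_KM_WIDE = 4       # ground covered by one detailed page (assumed)
PAGE_KM_HIGH = 6

route_area = ROUTE_KM * CORRIDOR_KM       # km^2 you need to cover
page_area = PAGE_KM_WIDE * PAGE_KM_HIGH   # km^2 per page

print(f"~{route_area / page_area:,.0f} pages at street-directory scale")

# Zooming out by a factor z covers z^2 times the area per page,
# which is why a coarser map collapses the page count so quickly.
for zoom in (1, 5, 10, 50):
    print(f"zoom x{zoom}: ~{route_area / (page_area * zoom**2):,.0f} pages")
```

Whatever numbers you plug in, the page count falls with the square of the scale factor, which is why the right move is to change scale rather than carry thousands of pages.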

That’s no good. Instead, for most of the trip, this map is what you need:

[Map: the Brisbane to Melbourne route at country scale]

Maps are models, and we use models all the time to help us understand the world. We use models of roads to help us get around. We use models in science to help us understand physics, the way that economies work, and many other things. John Kay makes a good point about how we use models:

All science uses unrealistic simplifying assumptions. Physicists describe motion on frictionless planes, gravity in a world without air resistance. Not because anyone believes that the world is frictionless and airless, but because it is too difficult to study everything at once. A simplifying model eliminates confounding factors and focuses on a particular issue of interest. To put such models to practical use, you must be willing to bring back the excluded factors. You will probably find that this modification will be important for some problems, and not others – air resistance makes a big difference to a falling feather but not to a falling cannonball.
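
Kay’s feather-and-cannonball example is easy to check numerically. The sketch below (Python, with rough illustrative masses and drag coefficients that I have made up) integrates a 50 m fall with and without quadratic air resistance:

```python
# Fall times from 50 m, with and without air resistance.
# Simple Euler integration; all parameters are illustrative guesses.

G = 9.81  # gravitational acceleration, m/s^2

def fall_time(mass_kg, drag_coeff, height_m=50.0, dt=1e-4):
    """Seconds to fall height_m, with drag force = drag_coeff * v**2."""
    y, v, t = height_m, 0.0, 0.0
    while y > 0:
        a = G - (drag_coeff / mass_kg) * v * v  # net downward acceleration
        v += a * dt
        y -= v * dt
        t += dt
    return t

print(f"vacuum model: {fall_time(1.0, 0.0):.2f} s")    # mass is irrelevant
print(f"feather:      {fall_time(0.005, 0.01):.2f} s") # light, draggy
print(f"cannonball:   {fall_time(5.0, 0.01):.2f} s")   # heavy, same drag
```

With these made-up numbers, the cannonball’s fall time barely moves from the frictionless prediction, while the feather takes several times longer: the excluded factor matters for one problem and not the other, which is exactly Kay’s point.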

Our use of mental models is so ubiquitous that we’re often not aware of using them at all. However, we can use the Australian Bagel Road Trip and the quote from Kay to look at three common mistakes that we make with models:

  1. Using the wrong scale: just as we need a map at the right scale to get from Bagel Nook to Glicks, our business mental models also need to be at the right scale.

    In her excellent book The Plugged-In Manager, Terri Griffith talks about the thought process a manager goes through in deciding whether to start using the cloud for some of their computing functions. She explains that to make this decision, you have to think about how the technology, your people, and the organisation’s processes interact.

    But it’s also important to have a good model of how cloud computing works. And this means having a model at the right scale. For most managers, a hugely detailed model that includes servers, packet-switching and communication protocols is unnecessary. That’s the wrong scale – too small. But you probably do need a model that includes issues like back-ups, security and mobile access.

    If you use a model that is the wrong scale, it will be very hard to make good decisions. That’s the first mistake to avoid.

  2. The map isn’t the territory: even if you have the map showing Bagel Nook, it’s still hard to find. You need to be on the ground to figure out that you have to go down that little laneway.

    Mistaking the map for the territory is a huge problem in business. Roger Martin addresses this in his book Fixing the Game: Bubbles, Crashes, and What Capitalism Can Learn from the NFL. Martin talks about the difference between the real market and the expectations market. In the real market, firms make and sell real goods and services, and their performance depends on how effectively they do this. The expectations market is the stock market – and here, a stock is a model of how the firm is expected to do.

    Steve Denning talks about the implications of mistaking the expectations market (map) for the real market (territory):

    “Maximizing shareholder value” turned out to be the disease of which it purported to be the cure. Between 1960 and 1980, CEO compensation per dollar of net income earned for the 365 biggest publicly traded American companies fell by 33 percent. CEOs earned more for their shareholders for steadily less and less relative compensation. By contrast, in the decade from 1980 to 1990, CEO compensation per dollar of net earnings produced doubled. From 1990 to 2000 it quadrupled.

    Meanwhile real performance was declining. From 1933 to 1976, real compound annual return on the S&P 500 was 7.5 percent. Since 1976, Martin writes, the total real return on the S&P 500 was 6.5 percent (compound annual). The situation is even starker if we look at the rate of return on assets, or the rate of return on invested capital, which according to a comprehensive study by Deloitte’s Center for the Edge are today only one quarter of what they were in 1965.

    In other words, mistaking the model for reality has destroyed shareholder value, the opposite of what was intended. We always have to be aware of the models we’re using, and ensure that we’re managing the reality, not the model.

  3. Using the wrong map: a lot of people contend that a significant cause of many of the recent stock market crashes has been the use of incorrect models. That’s the fundamental issue that Nassim Nicholas Taleb keeps trying to get people to acknowledge. His contention is that the market models in use have vastly underestimated the probability of large price fluctuations. Consequently, when these fluctuations do occur, things blow up.

    A post by John Kay addresses the problems with this, as does one by Mark Buchanan, and they’re both worth reading. The key point, though, is simple: if you use a model that isn’t accurate, you can’t make good decisions. The sketch after this list puts rough numbers on Taleb’s point.
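
To put rough numbers on that third mistake, here is a minimal sketch using SciPy. A Student-t distribution with 3 degrees of freedom stands in for a fat-tailed model of daily returns; that choice is my illustrative assumption, not Taleb’s, and the comparison uses the same raw threshold for both models (ignoring the t’s larger variance) purely to show tail shape:

```python
# How much more likely are extreme moves under a fat-tailed model
# than under a thin-tailed (normal) one? Illustrative only: the
# Student-t with df=3 is a stand-in, not a calibrated market model.

from scipy.stats import norm, t

for k in (3, 5, 10):
    p_thin = norm.sf(k)    # P(move beyond k) under the normal model
    p_fat = t.sf(k, df=3)  # P(move beyond k) under the fat-tailed model
    print(f"{k:>2}-sigma move: normal p = {p_thin:.1e}, "
          f"fat-tailed p = {p_fat:.1e}, ratio ~ {p_fat / p_thin:,.0f}x")
```

Under the normal model a 10-sigma day is effectively impossible; under the fat-tailed stand-in it is merely rare. If the territory is fat-tailed and your map is Gaussian, you get blindsided exactly when it matters most.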

Models are an important part of how we make sense of the world, but we often make mistakes in using them. To avoid these mistakes, make sure the model you use is at the right scale for the decision you’re making, manage the reality rather than the model built on top of it, and make your models as accurate as possible.

And if you know of any other good bagel places here in Australia, please let me know!



6 thoughts on “Three Mistakes We Make With Models”

  1. Really enjoyed reading this perspective on how easily models are mistaken for reality. Bookmarked it as a reminder :).

  2. This post also reminds me of a KM perspective on codification.

    From Dave Snowden:
    http://www.cognitive-edge.com/blogs/dave/2007/06/reporting_on_sin.php

    “…shifting from the tacit and explicit words to thinking about knowledge as ranging from knowledge that can only be acquired by experience, to that which can be codified and diffused rapidly in consequence. I talked about the role of narrative as a mediation and meaning making exercise between the two. To illustrate this I used the comparison between a taxi driver and a map user. The London taxi driver acquires knowledge through experience, but in consequence can get to a destination faster than the map user and is more resilient when things go wrong. However the map assumes shared context. I told the story of when I used a map in New York and came near to getting mugged, because of the assumptions of a shared knowledge context between map maker and map user. When I complained that the map did not say ‘Here be muggers and other strange beasts’, I was told ‘…but everyone knows that’, to which my response was ‘Well I don’t’. It’s one of the most common mistakes with information management, assuming shared context around the commonplace.”
