Noun 1. exigency – an unstable situation of extreme danger or difficulty;
I’ve had some jobs in which I’ve performed pretty well, and some where I haven’t been quite so good. Probably the worst job I’ve ever done was part of my portfolio when I was managing sales & marketing for a polytechnic in New Zealand. The specific job was new course evaluation.
We put in a modified stage-gate type process to evaluate potential new courses. It was my job to fill in the numbers. The one number that drove everything else was the expected number of students. If the expected number of students was high, we’d try to run the course. If not, we’d kill the proposal.
I developed a very elaborate model based on historical data. I knew how many enquiries we could generate from advertising, how many of those we could convert into applications, and how many enrolments we got per application. In fact, working out these numbers was one of the biggest innovations I delivered there, and the model was tremendously useful for estimating, around November, how many students in total we could expect at the start of the new school year the following February.
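To make the shape of that model concrete, here's a rough sketch of the funnel arithmetic in Python. The function and every rate in it are hypothetical, not our actual numbers; the point is simply that each stage of the pipeline gets scaled by a conversion rate observed in past years.

```python
# A sketch of the funnel model: enquiries -> applications -> enrolments.
# All of the rates and the spend figure below are made up for illustration.

def forecast_enrolments(advertising_spend: float,
                        enquiries_per_dollar: float = 0.05,
                        application_rate: float = 0.30,
                        enrolment_rate: float = 0.60) -> float:
    """Estimate enrolments from advertising spend via fixed historical conversion rates."""
    enquiries = advertising_spend * enquiries_per_dollar
    applications = enquiries * application_rate
    enrolments = applications * enrolment_rate
    return enrolments

# e.g. a (hypothetical) $40,000 November campaign
print(f"Expected enrolments in February: {forecast_enrolments(40_000):.0f}")
```

With a few years of history behind each rate, this kind of arithmetic worked well for our existing courses.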
However, the model was terrible at predicting new course enrolments. Why? In large part, because we’re really lousy at figuring out how something new will perform. We rejigged our new course approval process after I pointed out that we hadn’t approved a single new offering in over 6 months – our process was killing everything.
I was originally going to call this post The Perils of Prediction, but Greg Satell beat me to that title. Also, the specific problem that I’m talking about is really extrapolation. You should read all of Greg’s post, but here’s part of what he says:
The problem starts when smart people in nice suits and lab jackets proclaim that “the data says…” In truth, the data never says anything. We interpret it in one way or another and there are lots of ways to interpret it incorrectly.
Data is, after all, messy. It doesn’t spring forth whole, but must be collected in some way. We count, measure, survey, aggregate, slice and dice, picking up errors all the time. We need to make choices about which data we want to focus on and which fades into the background.
How do we deal with this? Usually by finding some numbers from the past and extrapolating them. However, there are a few problems with this approach, including:
- We tend to think in straight lines, but there aren’t any straight lines in business: that’s really the point being made by the xkcd cartoon. Taking a straight line and extrapolating it into the future almost never gives us the right answer.
- It’s really hard to tell what kind of curved line we’re on: this complicates things too. Even when we have historical data, it is nearly impossible to figure out what kind of engine is generating the output. Take a look at this data from an interesting post on climate change:
Is it likely that the data will progress in a straight line? Or will it level out at some point? Or will it increase exponentially? We don’t know. But when we’re predicting, it pays to consider what circumstances might lead to each of these outcomes. (There’s a small sketch below of how differently these three readings extrapolate.)
- Even when things are accelerating quickly, they tend to level out: innovations spread through an s-curve, and this is a very common pattern in business.
This is one of the issues with everyone talking about the singularity – it assumes that exponential growth will continue forever. It might, but usually exponential growth levels out, and then it looks like an s-curve.
- However, by far the biggest problem with extrapolation is that if we depend on it for prediction, we will never anticipate something new happening: extrapolation can only tell us that things in the future will be mostly like things in the past. Here’s how Greg puts it:
And that’s what most analysts miss. The future is hard to predict not just because of our cognitive biases or inexplicable natural events, but because we have the power to make our own future.
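To see how quickly the straight-line, exponential and s-curve readings of the same data diverge, here's a small Python sketch with invented numbers. The early observations are generated from an s-curve (a logistic) and then read back as a straight line, an exponential, and the s-curve itself; none of this is the climate data above, it's purely illustrative.

```python
import numpy as np

def logistic(t, ceiling=1000.0, rate=0.8, midpoint=10.0):
    """An s-curve: grows almost exponentially early on, then levels out at `ceiling`."""
    return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

t_seen = np.arange(0, 6)              # the handful of early periods we can observe
y_seen = logistic(t_seen)             # at this stage the s-curve looks exponential

# Reading 1: a straight line through the early data
linear_coeffs = np.polyfit(t_seen, y_seen, 1)

# Reading 2: an exponential (a straight line fitted in log space)
exp_coeffs = np.polyfit(t_seen, np.log(y_seen), 1)

t_future = 20.0
print("straight-line forecast:", round(float(np.polyval(linear_coeffs, t_future)), 1))
print("exponential forecast:  ", round(float(np.exp(np.polyval(exp_coeffs, t_future))), 1))
print("actual s-curve value:  ", round(float(logistic(t_future)), 1))
```

On this made-up series the straight line undershoots the eventual level, the exponential overshoots it by orders of magnitude, and only the s-curve settles near its ceiling, which is exactly the trap when all we have is the early, exponential-looking stretch of the data.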
The first new course that we approved at my polytechnic after we scrapped the stage-gate process was a program that offered free computer and internet lessons to people in the community, particularly targeted at older adults. And the number of enrolments we got went so far beyond anything we had ever seen before that it was almost impossible to believe.
None of my models could have predicted that. When we innovate, our job is to invent the future. The exigency of extrapolation is that if that is the tool we use to predict, we won’t be able to invent anything that doesn’t already exist. And what kind of innovation is that?
Uncertainty in innovation is as complex as an ecosystem. I still recall sitting in your class as you explained how poorly we deal with outliers. It helps me as I walk our project through from the center to manage the outcomes from 32 R&D centers. I combine Business Model Innovation and Outcome Driven Innovation with Stage-Gate.
Thanks for the comment Rizal! I hope that you’re doing well. I’m glad that you’re able to use some of the ideas from the classes. Outliers are another topic that I should probably write a bit more about.