Trying to predict the future is hard. And in some ways, it's impossible. But it's a valuable exercise. Because in "predicting" the future, we also increase the likelihood that we can shape it. 

Lately, I've been referring to myself as a "futurist". It's entirely tongue-in-cheek.

My most recent, world-changing, hot take: the songs "The Man" and "Cruel Summer" by Taylor Swift will be popular on the radio in the coming months. Let's see how those predictions pan out.

One of the reasons I joke about being a futurist is that I've spoken many times (and will continue to speak) about the follies of predicting the future. It has become a common theme on this blog.


The purpose of this article is to take a positive approach to predicting the future.

Because it's necessary to predict the future. Most of the important decisions we make in life involve predictions of a sort:

  • When you get married, you're making a guess about your compatibility with, and the suitability of, your partner for what will hopefully be a life-long relationship that spans many seasons of your life. You can only have one 50-year marriage, after all.
  • When you buy a house and take out a mortgage, you're making a prediction about your ability to continue to generate the income to repay the mortgage, and that the house won't be a major financial burden.
  • When you make major career decisions -- whether in terms of what field to work in, or whether to make a switch to a job within another team or organisation -- you're trying to determine whether it's going to be in your best interests.
  • When you go to a dinner party, you're (usually) doing it because you're predicting it'll be a good way to spend your time. (Or at the very least, it's an investment in marital harmony. Either way, you're predicting the consequences.)

Most of us use a prediction tool all the time. It's called a calendar. I can predict with pretty strong accuracy what I'll be doing at various times over the coming weeks, simply by looking at my calendar.

Most of my scepticism about making predictions relates to long-term predictions, and especially predictions that involve many variables and have second-, third-, fourth-, millionth-order effects. One variable impacts another, which impacts another, which impacts the behaviour of other people who are trying to make similar predictions, ad infinitum.

Let's put it this way: macroeconomics and stock picking aren't rocket science, and that's precisely what makes these activities so hard. Rocket science -- at least as I perceive it -- involves a (relatively) limited set of variables and nth-order effects. Macroeconomics and active investment involve an enormous number of nth-order effects, which interact deeply with one another.

If only they were rocket science.

On second thoughts...

This shouldn't stop us from at least trying to predict the future, however. Especially when the stakes are high.

At the very least, predicting the future is an interesting exercise.

So at some point in the near future I'm going to put my "futurist" hat on, and give some predictions, or working hypotheses, about the future. This article is a prologue to those predictions.

But first I want to talk about how I think about predicting the future.

I borrow a lot of my philosophies relating to prediction from the work of Philip Tetlock, author of Expert Political Judgment and Superforecasting (which Tetlock co-authored with Dan Gardner, author of another two books I love, Risk and Future Babble).

The message that many people -- myself included -- got out of Tetlock's first book, Expert Political Judgment, is that experts are terrible at making predictions. It's pretty much the origin of the "most forecasters are less accurate than chimpanzees throwing darts at a dartboard" metaphor.

Superforecasting paints a more positive light. In this book, Tetlock and Gardner still skewer the predictions of many pundits and talking heads. But they also talk about the characteristics of people who tend to actually be good at making predictions.

The appendix to Superforecasting includes a good list of commandments for making good predictions.

Below I'm going to discuss some of these commandments, while adding some of my own thoughts.

Think in terms of possible futures, and degrees of certainty

I don't believe in "the future", singular. I believe in "futures", plural. I don't think anything is inevitable, although everything feels inevitable in retrospect.

Instead of saying "this will definitely happen" or "that definitely won't happen", it's more useful to think in more probabilistic terms. As in, this is very likely to happen, but there's also a chance it won't.

One of my pet peeves is when people talk about something that may or may not happen with absolute certainty: "it's a certainty", or "there's not a hope in hell". That's the wrong approach!

(Except where you need to go "all or nothing" in a particular decision. But even then, the decision should be informed by your best assessments. Even then, there's a major risk in being prematurely confident rather than making a timely decision. And even then, you can still make all-or-nothing decisions with the humility that you could be wrong.)

It's easy to say something is 50/50. But the next time you feel this temptation, I challenge you to get more specific. Do you think each outcome is equally likely? Would you be prepared to change this to, say, 40/60? Or better still, 45/55? Or even more specifically, 43/57?

As Tetlock and Gardner say: "[Y]our uncertainty dial needs more than three settings [ie, certain, impossible, maybe]... The more degrees of uncertainty you can distinguish, the better a forecaster you are likely to be".

Take different perspectives

Whenever we're involved in a project or venture, we're inclined to focus on how this particular project or venture is different from anything that we or anyone else has done before. This is what Tetlock and Gardner refer to as the "inside view".

The reality, however, is that for most projects and tasks, there are usually projects and tasks that are somewhat analogous. They can inform our predictions about what will happen with this particular project. This is the "outside view" -- and is often more illuminating than the inside view.

In fact, subject matter experts often don't make the best predictions, because they focus too much on the inside view. It's often people who have access to subject matter experts (ie, relevant information), but are a little further removed, who are better positioned to make accurate predictions. This is because they don't over-weight idiosyncratic aspects of the specific matter in question.

Tetlock and Gardner use the metaphor of a "dragonfly eye, [where] one view meets another and another and another -- all of which must be synthesised into a single image".

Alternatively, an apt metaphor is contrasting the "fox" and the "hedgehog". Tetlock borrowed this from an essay by Isaiah Berlin. Basically: the hedgehog knows one big thing, and the fox knows lots of little things. Or: the hedgehog is someone who sees the world through one overarching lens -- and when the only tool you have is a hammer, everything looks like a nail.

Foxes tend to be much better at making predictions than hedgehogs.

Tetlock and Gardner talk about a fox with dragonfly eyes. I Googled to see if there was an image like that. There isn't. So use your imagination. Or enjoy this completely irrelevant (and irreverent) song, which ponders the age-old question: what does the fox say?

The better you can look at the world from different perspectives, the more accurate you're likely to be. You'll see the angles, the motivations, and the incentives, and you'll have appropriate humility about the validity of any single perspective -- including your own, and those of subject matter specialists, who can end up focusing too narrowly rather than stepping back and seeing their knowledge within a bigger picture.

We may not have much hope in determining second-, third-, and fourth-order effects. But this will give us a better chance.

Update your predictions based on new information

Some people think that being a "flip-flopper" is a major character flaw. Maybe it is, if you're flipping and flopping from one minute to the next, based on factors like who you're talking to in the moment.

But if you're not updating your beliefs in response to evidence, that's a major character flaw.

When it comes to predicting a fundamentally uncertain future, you should update your views and predictions as evidence comes to light. As Tetlock and Gardner say: "Belief updating is to good forecasting as brushing and flossing are to good dental hygiene".

Another way to think of it is that it's important to recalibrate predictions based on new data.
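To make this concrete, here's one way to recalibrate a belief when new evidence arrives: Bayes' rule. This is a minimal sketch, and all the numbers are invented purely for illustration.

```python
# Updating a prediction with new evidence, via Bayes' rule.
# All probabilities below are made up for illustration.

prior = 0.50             # initial belief that an event will happen
likelihood_if_true = 0.8   # P(seeing this evidence | event will happen)
likelihood_if_false = 0.4  # P(seeing this evidence | event won't happen)

# Bayes' rule: weigh the prior by how well each hypothesis explains the evidence.
posterior = (prior * likelihood_if_true) / (
    prior * likelihood_if_true + (1 - prior) * likelihood_if_false
)
print(f"updated belief: {posterior:.2f}")  # 0.67
```

The point isn't the arithmetic; it's the habit. Evidence that is twice as likely under one hypothesis should move your belief towards that hypothesis, but not all the way to certainty.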

For instance: at the start of this year, I entertained creating a chart where I shared my level of confidence that Donald Trump wouldn't be US President by the end of the year.

I thought the likelihood was actually quite high: it would have been close to 50%, especially after Mitt Romney's op-ed in the Washington Post on the 1st of January this year. Unfortunately, I think the likelihood is much lower now (and I put this largely down to Attorney-General William Barr and his misleading representations relating to the Mueller report; also, journalistic negligence and GOP cowardice).

If pushed, I'd currently put the probability of Trump no longer being US President by the end of the year as perhaps 15%, and only half of that is political -- the other half relates to health issues or some other wild card scenario.

(At the time of writing, I put the probability of Trump no longer being US President by the time of the next election as being around 35%: that might be hopeful, because I can't entertain the thought of Trump losing the election and remaining President between the period of him losing and the inauguration of the next President. Talk about risks of a Franz Ferdinand event. This figure is substantially higher than it was a few months ago, as it becomes clear that the House Judiciary Committee is finally taking serious steps towards impeachment.)

Break big predictions into small predictions

Most big predictions are really lots of small predictions rolled into one.

Big predictions are made up of lots of small predictions in much the same way that Voltron is made up of lots of lions.

Back to Trump: there are many reasons Donald Trump might not be US President at the end of the year. It could be impeachment or resignation in the face of impeachment (a la Nixon). But it could be for other reasons, such as health or death: he's not the healthiest 73-year-old on the block, and the stress of the Presidency (or more specifically the investigations spiraling around him) has to be taking its toll.

Another example: Rafael Nadal is on a bit of a roll at the moment. He's only one grand slam away from catching up to Roger Federer's tally of grand slams. If you're confident that Rafael Nadal will win the next French Open, what is the probability that he'll be healthy and able to play, as opposed to having an injury which will rule him out from competing in the first place?
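The two examples above can be sketched in a few lines of arithmetic. A prediction that depends on several things all going a certain way is the product of the smaller predictions; disjoint routes to the same outcome add up. The probabilities below are invented for illustration, not real forecasts.

```python
# Decomposing a big prediction into smaller predictions.
# All numbers are illustrative assumptions, not real forecasts.

# Chained predictions multiply: to win, Nadal must first be fit to play.
p_healthy = 0.85           # P(fit to play the next French Open)
p_win_if_healthy = 0.60    # P(wins | he plays)
p_win = p_healthy * p_win_if_healthy
print(f"P(win) = {p_win:.2f}")  # 0.51 -- lower than the conditional 0.60

# Disjoint routes to the same outcome add up: different reasons
# Trump might not be President at year's end.
p_political = 0.075   # impeachment or resignation
p_other = 0.075       # health or some other wild card
p_not_president = p_political + p_other
print(f"P(not President) = {p_not_president:.3f}")
```

Notice how the decomposition forces you to see that the headline number can never be higher than its weakest necessary link.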

Assess your predictions and learn from them

Once something happens, it can seem inevitable. It wasn't.

Whether our predictions are "right" or "wrong", it's valuable to do postmortems:

  • In what respects were you right, and what respects were you wrong?
  • Are there areas where your intuition consistently leads you astray?
  • Are there any areas where you tend to be right more often than you think, and should give better credence to your predictions?

In the words of Tetlock and Gardner: "Don't try to justify or excuse your failures. Own them! Conduct unflinching postmortems".

And: "don't forget to do postmortems on your successes too. Not all successes imply that your reasoning was right".

From time to time, I maintain a spreadsheet which I refer to as my "prediction chamber". I make a number of specific predictions and attach specific probabilities to each. Looking at these predictions holistically, I try to get a sense of my accuracy: if I'm saying various events are 70% likely, are roughly 70% of them occurring? Or am I being over-confident (only 50% occur) or under-confident (they occur 90% of the time)? I haven't done this in a while, but when I did, I observed a number of things that improved my accuracy, and it reinforced a spirit of humility and a willingness to reassess my views and predictions.
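A "prediction chamber" of this kind is easy to check mechanically: group your predictions by the probability you assigned, and compare each group's assigned probability with how often the events actually occurred. This is a minimal sketch, with an invented record of predictions standing in for a real spreadsheet.

```python
# A minimal calibration check, in the spirit of a "prediction chamber".
# Each record is (forecast probability, whether the event occurred).
# The records below are invented purely for illustration.
from collections import defaultdict

predictions = [
    (0.7, True), (0.7, True), (0.7, False), (0.7, True),
    (0.9, True), (0.9, True),
    (0.3, False), (0.3, True), (0.3, False),
]

# Group outcomes by the probability that was assigned to them.
buckets = defaultdict(list)
for p, occurred in predictions:
    buckets[p].append(occurred)

# Well-calibrated forecasts: the hit rate in each bucket is close to
# the probability assigned. Big gaps reveal over- or under-confidence.
for p in sorted(buckets):
    outcomes = buckets[p]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"forecast {p:.0%}: occurred {hit_rate:.0%} of the time "
          f"({len(outcomes)} predictions)")
```

With a real spreadsheet export in place of the toy list, this is the whole exercise: events forecast at 70% should come true roughly 70% of the time.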

Spend your time and energy on predictions that matter

Tetlock and Gardner explain that you need to triage -- "Focus on questions where your hard work is likely to pay off."

We only have so much time and energy. Making robust predictions is hard and time-consuming. So concentrate on areas -- and put your reputation at stake -- where it will matter.

If it's your job, it might be worth trying to guess whether (and by how much) the Reserve Bank will change the Official Cash Rate (OCR). But if it's not, is your prediction going to move the needle in terms of your long-term personal outcomes? I doubt it.

I'm also highly sceptical of anyone's ability to predict the prices of individual shares, or the share market in general. (If you've read this blog for any length of time, you could have, um, predicted this comment.)

What are your predictions about your profession, however? What social, technological, economic, environmental, and political changes are likely to impact your career risks, or create opportunities for you?

Here's another perspective: sometimes it can be valuable to make predictions that you don't want to happen... so you can take steps to make sure that prediction doesn't come to fruition. (This is the nature of the job for some lawyers: think about worst-case scenarios and help clients stop them from happening.)

In A Brief History of Tomorrow, Jonathan Margolis explains: "forecasting is far from a neutral activity". He follows: "That is really what futurology is about -- helping make choices, rather than saying this is the inevitable future."

At the end of the day, predicting the future isn't just about prediction. If you do it well, it can help you shape the future. 


Soon, I’ll be sharing some of my predictions for the future. Email me, and let me know some of yours!


Sonnie Bailey

Sonnie is an Authorised Financial Adviser (AFA) and former lawyer with experience in the financial services and trustee industries. Sonnie operates Fairhaven Wealth (www.fairhavenwealth.co.nz).
