Paul Saffo spoke to Stanford’s Media X conference on the art of predicting the future, specifically predicting which technology will come to dominate the next decade. Paul’s talk may at first seem contradictory in nature: demonstrating how to do it, while simultaneously showing it can’t be done. This article summarises the talk.
30 Year Cycle
Every 30-50 years a new science turns into a technology. With approximate dates:
- 1900: Chemistry
- 1930: Physics
- 1960: Electronics
- 2000: Biology
We are now on the cusp of a revolution from electronics to biology. The precise inflection point, the point of change, may not yet be clear.
Paul noted that Thomas Watson’s famous misquote, “I think there is a world market for maybe 5 computers”, was made in 1953, right on the cusp of the electronics revolution. Aside from the fact that he was talking about a specific machine, not all computers, the quote is a good example of how difficult it is to predict the future at such points of radical change.
Forecasting the Future
The goal is not to be right, but “to be wrong and rich”. It is easy to take the view that one cannot forecast. If you do attempt to forecast you will still mostly be wrong, but the very act of trying will increase your chance of success over those who do not try.
The further into the future you predict, the greater the level of uncertainty. The difficulty in forecasting is finding a balance between being too narrow and too broad. Forecasting might use wildcards; the “hard part” is to be wild enough.
Typically, forecasts for a new product or technology’s introduction are linear: usage of the technology is forecast to grow steadily with time.
Reality tends to follow an S-shaped curve: in the early stages, actual usage falls below the expectation generated by the linear forecast; usage then grows rapidly, rising above the prediction in the later stages. The result is that forecasters tend to over-estimate performance in the first part, and under-estimate it later on. Venture capitalists tend to have linear expectations, and so are disappointed in the early stages, while failing to see the later potential.
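The gap between the two forecasts can be seen in a minimal sketch. The numbers here are purely illustrative (a logistic curve standing in for the S-curve, a straight line for the naive forecast), not figures from the talk:

```python
import math

def logistic(t, cap=100.0, midpoint=5.0, rate=1.0):
    """S-curve adoption: slow start, rapid growth, then saturation."""
    return cap / (1.0 + math.exp(-rate * (t - midpoint)))

def linear(t, slope=10.0):
    """Naive linear forecast reaching the same scale."""
    return slope * t

for t in range(1, 10):
    forecast, actual = linear(t), logistic(t)
    verdict = "over-estimates" if forecast > actual else "under-estimates"
    print(f"year {t}: forecast {forecast:6.1f}, actual {actual:6.1f} -> forecast {verdict}")
```

Running this, the linear forecast over-estimates adoption in the early years and under-estimates it once the rapid-growth phase begins, matching the pattern Paul describes.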
Robots and Inflection Points
Paul Saffo used the example of DARPA’s annual competition for robot-driven cars. In the first year only a handful of competing robot drivers made it out of the starting gate. No car completed the challenge. The next year 22 out of 25 robots got further than the leader in the first race.
The example gives a quantifiable measure of how the technology is developing, year to year.
Spotting the inflection point, the place at which real, dramatic change starts to occur, can still be hard. Sometimes it can be spotted using data which has been ignored or hidden. Sometimes it is a case of looking for what does not fit. The anonymous quote, “history doesn’t repeat itself, but sometimes it rhymes”, is apt. Look back in time as far as you look forward.
The good news is that if you miss an indicator, you still have lots of time to spot another.
Paul contended that each of the last three decades had been characterised by a dramatic cheapening of a component technology, which in turn had led to the widespread use of a product:
- 1980s: Cheap processors led to the processing age. The result was widespread use of the PC.
- 1990s: Cheap communications lasers led to the access age. The result was the network infrastructure to support the World Wide Web.
- 2000s: Cheap sensors are leading to the interaction age. Applications are currently missing, but widespread use of robots appears to be the future.
Biology and Electronics
Electronics is building biology, and Paul expects that eventually biology will rebuild electronics: These technologies are far from isolated.
An example of electronics advancing biology can clearly be seen in the work on the human genome. A well-funded, government-backed project was beaten by a far smaller project. The smaller project was able to successfully deploy robots, with the result that the cost of the work dropped by a factor of 10 each year. The government project had been funded based on the cost of technology at the outset, and initially failed to respond adequately to the changing cost structure.
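The compounding effect of a tenfold annual cost drop is easy to see in a short sketch. The starting cost below is a hypothetical figure chosen for illustration, not one from the talk:

```python
def cost_after(years, initial_cost=1_000_000.0):
    """Cost of a unit of work if it falls by a factor of 10 per year.

    initial_cost is a hypothetical year-0 figure for illustration.
    """
    return initial_cost / 10 ** years

# A budget fixed at year-0 prices quickly becomes wildly wrong:
for year in range(5):
    print(f"year {year}: ${cost_after(year):,.0f} per unit of work")
```

After just four years the same work costs ten thousand times less, which is why a project funded at the outset’s cost structure struggles to respond.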
The creation of the first artificial genome in January 2008 may yet prove to be the inflection point.
Trust Instincts at Your Peril
“Assume you are wrong” (and forecast often)
Paul used the example of the sinking of a US naval fleet at Honda Point, on the west coast of the United States, on 8 September 1923. The fleet had been navigating using a forecasting technique called “dead reckoning”. The coastline had a (then) new technology available to assist navigation: radio direction finding, which allowed a bearing to be taken between a land station and the fleet.
The radio direction finding gave an unexpected result that did not match the forecast position. The lead ship in the fleet concluded that their position was more favourable than anticipated (closer to their destination), and turned sharply… straight into the rocks they had been trying to avoid. The 11th ship in the fleet did not trust the judgement of the lead ship, and when the fleet turned, it hedged its bets, slowing and waiting to see what happened. It was one of only 5 ships from the fleet not to run aground.
The moral of the tale: hedge your bets, but embrace uncertainty. Or as written once on a tipping jar:
“If you fear change, leave it in here.”
Divergence of the Species
The question was asked: will biotech lead to a further concentration of wealth? Yes. The electronics revolution had itself deepened inequality. Biotech raises a particularly ugly spectre which extends beyond wealth, to life itself. The wealthy would be likely to use their wealth to extend their lives. The ultimate outcome: species divergence. Currently the rich tend to benefit from better health care, and so live longer, but biotech is likely to create far more options.