Fourth Industrial Revolution
Welcome to the Fourth Industrial Revolution. But who will be the winners – and who will be the losers?
Monash Life | Thriving communities

In February 1966, Time magazine published an essay: The Futurists – Looking Toward AD 2000.
Some of its guesses were a little off, such as the prediction that “machines will be producing so much that everyone in the US will, in effect, be independently wealthy”. But it also predicted that “in automated industry, not only manual workers but also secretaries and most middle-level managers will have been replaced by computers.” Welcome to the fourth industrial revolution.
What does it mean for the humans working alongside those computers? According to Professor Simon Wilkie, Head of Monash Business School and former Chief Economic Policy Strategist at Microsoft Corporation, it’s short-term pain for long-term gain. Look back at the history of industrial revolutions, he says, and you’ll see that they don’t improve people’s lives immediately. “The first revolution made a few people very wealthy, but mostly people just moved from being poor and rural to poor and urban. It was the second industrial revolution, in the 1920s, that made indoor plumbing, electricity and telephones available to everyone – a democratisation of stuff.”
The third revolution was about the spread of computing power and access to information, and now it’s happening again. This fourth industrial revolution is about what Wilkie calls “ubiquitous ambient intelligence” – which everyone can now access. Sensors that generate huge amounts of data now cost mere cents. Cloud computing enables anyone to rent vast amounts of computing power, rather than having to buy 100,000 computers. And the deployment of fibre allows high-speed data transmission. “In the language of economics, it’s an abolition of ‘barriers to entry’ through the reduction of sunk costs,” says Wilkie.

And we have yet to see the full benefits, he argues. “Technology is just a tool. And that tool can be what economists call either a substitute or a complement to human skills. When a new technology is introduced, it’s often introduced as a substitute – a way to do something cheaper, faster or more reliably. A bulldozer replaces a hundred men with shovels, for example. This tends to drive down wages for the people whose jobs are replaced.”
But over time, people learn how to use technology in new ways: it becomes a complement to our skills. Wilkie points to the start of the 20th century, when 50 per cent of jobs in the US were farmer, farm labourer or domestic servant. By the end of the 20th century, just two per cent of people worked on farms – and everyone still got fed. “We invented a whole new bunch of stuff for people to do that was more valuable – and hopefully more fulfilling.”
As a result, he argues, our thinking about jobs needs to change.
“You shouldn’t be equipping people for a job. The nature of a job is probably transitory. It’s better to think of education as a lifelong journey: the skills you use today might not be that relevant a few years from now.”
That has led to a rethink of what and how to teach at the Business School. Our MBA programs, including the MBA and the Global Executive MBA (currently the top-rated MBA program in Australia and one of only two Australian programs in the FT100), have been reimagined around the digital transition and the future of work. They’re now less about acquired knowledge and more about mindset.
Being in the middle of this transition can seem scary, says Wilkie, and it will take time to settle. “But while it can look dark in the short term, in the long run we’ll find new things to do – and better ways to use our time. That human factor will become more and more important. Empathy, creativity, unique quirkiness – these too will become more valuable. The future is not about everybody working as a coder. The future is about being more focused on the things that make us human.”