For a long time, the personal computer business was the mainstay of the tech industry, and 3-year hardware upgrade cycles empowered new types of software applications. The driver was Moore's law: the observation that the number of transistors on a chip doubles roughly every two years, which for decades translated into steadily rising CPU clock speeds, roughly a 10x speedup about every 3 to 5 years.
Moore's law was a boon for most Western economies. It allowed a fairly predictable increase in worker productivity from the 1970s to around 2005. It then slowed from 2005 to 2007. Around 2007, CPU clock speeds generally peaked at around 2.4 to 3.4 GHz, chiefly because of power-density and heat-dissipation limits (the end of so-called Dennard scaling) rather than a hard stop in transistor counts. The typical chip speed today, 10 years later, is still around 3 GHz. If clock speeds had kept scaling at the old rate, we would have around 300 GHz chips today! So what happened instead?
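As a back-of-the-envelope check of that 300 GHz figure, here is a quick sketch of the projection, assuming clock speeds had kept improving 10x every 5 years after the 2007 plateau (the starting point of 3 GHz and the growth rate are the rough figures from the paragraph above, not a precise model):

```python
# Illustrative projection: what clock speeds would look like today if the
# pre-2007 scaling trend (roughly 10x every 5 years) had continued.
base_ghz = 3.0               # typical clock speed around 2007
years = 10                   # 2007 to roughly a decade later
growth = 10 ** (years / 5)   # 10x per 5 years -> 100x over 10 years
projected = base_ghz * growth
print(f"Projected clock speed: {projected:.0f} GHz")  # prints "Projected clock speed: 300 GHz"
```

The exponential compounding is the whole story here: two more 10x periods would have meant a 100x jump, which is why the gap between the trend line and reality is so dramatic.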
Well, the industry focused on miniaturization of devices: mobile. The mobile boom, largely driven by the iPhone and other smartphones, boosted productivity for mobile workers and allowed a new mobile app ecosystem to develop.
In addition, there was a shift to multi-core chips, and then to cluster computing and cloud computing. A cluster of servers, with many, many computing cores all working together, allowed the server room to become the new workstation computer, i.e. the cloud. That is another reason tech innovation from 2007 to now has focused so heavily on cloud computing: that is where an increase in raw computing power can still occur, which allows new types of software apps to be developed.
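The practical consequence of the multi-core shift is that software had to change: instead of waiting for one core to get faster, a program spreads independent work across several cores. A minimal sketch using Python's standard `multiprocessing` module (the worker function and pool size here are purely illustrative):

```python
# Sketch of the multi-core idea: divide independent work across processes,
# one per core, rather than relying on a single faster core.
from multiprocessing import Pool

def square(n):
    # A stand-in for any CPU-bound unit of work.
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:           # 4 worker processes
        results = pool.map(square, range(8))  # work split among the workers
    print(results)  # prints [0, 1, 4, 9, 16, 25, 36, 49]
```

The same divide-the-work pattern, scaled up from cores on one chip to racks of servers, is essentially what cluster and cloud computing do.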
The next phases of development may include faster connections to the cloud; watch the 5G wireless rollout, which could be a big deal.