Historical changes to how we consume technology

Over the years, technology has revolutionised our world and daily lives, and its significance is only set to grow. It is therefore worth looking back at how technology has evolved.

Understanding the driving factors behind innovation, and the evolution of how technology is consumed, can provide insights into what the future might hold.

In a little over 70 years, computers have gone from machines that filled an entire room to devices as small as seven inches, weighing less than a kilogram and running largely from the cloud.

This dramatic growth in technical capability has led to an equally dramatic change in how computer technology is consumed.

The original general-purpose computers largely acted as mathematical calculators. Today, significant computing power can be found in everything from our smartphones to our watches and even our fridges.

There have been several historical events that accelerated breakthroughs in technology, the most notable being World War II.

This period acted as a midwife to the birth of the modern electronic computer.

Unprecedented military demands for calculations and hefty wartime budgets spurred innovation.

Ultimately, technology played a greater role in the conduct of World War II than in any other war in history and had a critical role in its outcome.

Shortly after World War II, ENIAC, the first programmable, general-purpose electronic digital computer, was completed; it was designed specifically to compute artillery range tables for the military.

ENIAC was programmed via plugboards, and setting up a new program could take weeks, but once configured, the machine computed at speeds achievable only by electronic means.

Impact of the Space Race

The space race between the Soviet Union and the US began with the Soviets' launch of the first artificial satellite, Sputnik, on October 4, 1957. It paved the way for satellite navigation and went on to revolutionise global navigation, travel and communication.

Communication at the speed of light was the greatest innovation to be born of the race. In 1969, the Advanced Research Projects Agency transmitted the first message over its pioneering computer network, ARPANET. This was the inception of the internet.

Impact of the pandemic

2020 was a year dominated by the need to contain a pandemic. Nations scrambled to slow the spread of the virus, resulting in the first worldwide lockdowns.

Although the technology already existed, the pandemic drove 88 percent of organisations globally to require their employees to work from home.

Society already believed it lived in a digital age; the pandemic confirmed it. Without these technological advancements, many would not have been able to continue business as usual.

The abrupt shift to a remote workforce brought hope to global economies, but it didn't come without drawbacks.

Cybercrime rose to an all-time high. Organisations, governments and individuals all became targets.

The rapid shift to working from home produced ideal conditions for cybercriminals, as many organisations didn't have time to implement robust cybersecurity measures.

TechRepublic reported a 667 percent rise in phishing attacks in March 2020 alone, and an 800 percent rise in ransomware attacks, causing estimated global losses of approximately US$1 trillion.

These changes to the cybersecurity threat landscape have prompted rapid advances in threat detection and security measures, with the cybersecurity market forecast to reach US$170.4 billion by 2022, according to Gartner, as organisations strengthen their defences against cyber threats.

Tony Snow is chief executive and co-founder of Stratus Blue. He can be contacted at tony@stratusblue.co.nz.
