History Of The Cloud

Publication date: Jul 02, 2018

Last Published: Dec 13, 2022


In the 1960s, the concept of Cloud computing originated with the breakthrough of having multiple users “time-sharing” computing resources on mainframe computers. At the time, the cost of purchasing a mainframe computer was not feasible for most organizations, nor was it practical for each user to have their own processing power or storage capacity. Shared access through a dumb terminal to a central resource providing computing power on rented time therefore made the most economic sense, and it opened up access to computing power that had previously been available only to the largest organizations, such as the military and major research institutions.

In the 1970s, as full time-sharing solutions became increasingly available to organizations, the concept of the Virtual Machine (VM) was created. This allowed multiple distinct computing environments to reside on one physical machine, taking the shared-access mainframe to the next level.

In the 1980s, the time-sharing concept was overtaken by the emergence of the Personal Computer (PC), which made owning a computer much more affordable for individuals and households. This gave each user their own computing power and resources, eliminating the need for large mainframe systems.

In the 1990s, as PCs became widely available, organizations started deploying computers everywhere, and as IT operations grew in complexity, they recognized the need to manage and control their IT resources. Technology vendors and IT service providers responded to the growing demand for IT resources and the complex needs of end users by developing and deploying Servers. A boom in data centers soon followed these developments.

In the 2000s, after the boom of data centers and server infrastructure in the 90s, hardware Virtualization technology was introduced by VMware. Virtualization hides the physical features of the computing platform from the user, allowing multiple operating systems to simultaneously share the resources of a single hardware host server. By the mid-2000s, virtualization had led to mass consolidation of server infrastructure, allowing organizations to reduce costs and ultimately eliminate large on-premises data centers.

Simultaneously, as high-speed internet became more available and cost effective, Cloud providers such as Amazon Web Services (AWS) and Google launched their services. By the late 2000s, organizations were starting to migrate their systems to the Cloud to reduce costs and improve service reliability.

Virtualization technology was a key contributor to the proliferation of Cloud services, as it allowed providers to easily and economically offer Virtual Machine (VM) instances to customers by sharing pooled hardware resources on a pay-per-use model.

In the 2010s, Microsoft Azure launched, followed by Office 365 and various other Cloud services. These new providers offered a wide range of Cloud-based platforms and services. By the late 2010s, Cloud computing had come full circle to the “time-sharing” model of 50 years earlier. While technology and computing power have advanced exponentially over the last five decades, the concept of shared access to a central resource providing computing power, on a rented-time basis and in an economical fashion, remains core to Cloud computing today.

Written by: Payam Pourkhomami, President & CEO, OSIbeyond
