Let us assume that you are the CEO of a large organization. Naturally, it becomes your duty to ensure that every employee has adequate hardware and software to carry out their tasks. But buying and licensing software for every computer and every employee quickly becomes impractical: each new hire means another set of licensed products, so the cost per employee keeps adding up.



But there is an alternative approach that gives a better result. Instead of buying and installing software on each PC, you load a single application that lets each employee log in to a web-based service hosting all the programs that user needs. Everything runs on another company's remote servers, which provide everything from word processors and spreadsheets to database applications, email, and data-analysis programs. This technology is called "cloud computing," and it has brought a revolutionary change to the IT industry.

History of Cloud

In the early era of computing, the client-server architecture was popular, built around mainframes and terminal applications. Processing power and storage were very expensive at the time, so the mainframe pooled both kinds of resources and served them to simple client terminals. As mass storage capacity grew, file servers gained popularity for storing vast amounts of information.

By the 1990s, the great connecting concept - the Internet - finally had enough computers attached to it. Connected together, those machines formed a massive, shared pool of storage and processing power that no single organization or institution could afford on its own. From this came the concept of the "grid". The term 'grid' is often misinterpreted as a synonym for 'cloud computing', since both technologies are built from many connected computers. Grid computing uses application software to divide one large processing job across several thousand machines. But it has a drawback: if a single software node fails during processing, other pieces of the job that depend on it may fail as well. As a result, the grid-based approach never became especially fruitful.


Cloud computing, on the other hand, builds on the idea of the grid, with one key difference: it provides on-demand resource provisioning.

The first milestone of cloud technology was set by Salesforce.com in 1999, which pioneered the delivery of enterprise applications through a simple website. It paved the way for both specialist and mainstream software firms to deliver their applications over the internet.

The next development came in 2002 with Amazon Web Services (AWS), which provided a suite of cloud-oriented services including storage, computing power, and even human intelligence via Amazon Mechanical Turk. In 2006, Amazon launched EC2 (Elastic Compute Cloud), a commercial web service that lets small organizations and sole proprietors rent computers on which to run their own applications.

EC2 and S3 became the first widely accessible cloud infrastructure services.
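
To get a feel for what on-demand resource provisioning looks like in practice today, here is a minimal sketch using Python and the boto3 library to rent a single EC2 virtual machine and release it again. The region, AMI ID, and instance type are placeholder assumptions for illustration, not values from this article.

```python
# Minimal sketch: on-demand provisioning with Amazon EC2 via boto3.
# The region, AMI ID, and instance type below are placeholder assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request (i.e. "rent") a single virtual machine on demand.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    InstanceType="t2.micro",          # small, inexpensive instance type
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")

# When the machine is no longer needed, release it and stop paying for it.
ec2.terminate_instances(InstanceIds=[instance_id])
```

The point of the sketch is the rental model itself: compute is requested when needed and released when done, instead of being bought and installed up front.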

In 2009, as Web 2.0 matured, another significant milestone bore the name of Google: Google and others began offering browser-based applications through services such as Google Apps. Then came Microsoft's Azure, and both Microsoft and Google now deliver services in a way that is reliable and easy to consume.


