Long, long ago... before the rise of cloud computing,
if I wanted to create a website or any web app, I would go through the following process.
First, I would develop my web application on my local computer and make sure it worked well.
Choosing Hardware Requirements
Now I want to make my web app available to everyone, so I need to buy my own server for the production environment.
I am a developer. If I wanted to save money, I would take my own time to learn all the server configuration and related stuff. If you are lazy like me, you would hire an operations person to do that instead. Either way, you are going to spend time and money. And remember: if I configure the server wrong, the hardware can be damaged and become a useless piece of metal.
After everything, I would need to check every day that the website was still up. If it was not monitored, I wouldn't even know when the website went down, unless a customer noticed and told me.
Cost (According to nothingbutnet.com)
- Server Hardware - $11,000
- Server Software (Windows Server) - $1,700
- Backup and Software - $2,000
- Ancillary Server Equipment - $500
- Installation/Migration Costs - $4,000

Total Upfront - $19,200 (approx.)

- Maintenance and Monitoring - $400
- 8 Hours of Monthly Remote Support - $760
- Offsite Backup - $150

Total Monthly - $1,310 (approx.)

Variable Repair Costs

- Unplanned Repairs - $7,000
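Those numbers add up fast. As a back-of-the-envelope check, here is a small Python sketch that totals the figures above; the 3-year horizon and the single unplanned-repair event are my own illustrative assumptions, not part of the original cost estimate.

```python
# Rough total-cost-of-ownership sketch using the cost breakdown above.
# The 3-year horizon and one unplanned-repair event are assumptions.

upfront = {
    "Server Hardware": 11_000,
    "Server Software (Windows Server)": 1_700,
    "Backup and Software": 2_000,
    "Ancillary Server Equipment": 500,
    "Installation/Migration": 4_000,
}

monthly = {
    "Maintenance and Monitoring": 400,
    "8 Hours of Monthly Remote Support": 760,
    "Offsite Backup": 150,
}

def tco(years: int, unplanned_repairs: int = 7_000) -> int:
    """Total cost of owning one on-premise server for `years` years."""
    return (sum(upfront.values())
            + sum(monthly.values()) * 12 * years
            + unplanned_repairs)

print(tco(3))  # -> 73360: $19,200 upfront + $1,310/month x 36 + $7,000 repairs
```

Three years of running a single server already costs several times the upfront hardware price, and that is before hiring anyone or buying a second machine.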
Let's just say my application hits the market well. Now I need to spend lots and lots of money on people and hardware to scale the application.
Fact: In the last 15 years, Google went from 112 servers to a $5B-plus quarterly data center bill.
It's a looooooooooooonnnnnnng process, right....?
But in 1961, at MIT, John McCarthy, the American computer and cognitive scientist called the "Father of Artificial Intelligence", planted a seed. He said that computing could be sold like a utility, just like water or electricity.
Since Technology ∝ Time, it eventually happened.
The rest is history..... Tech giants like Amazon, Google, and Microsoft captured the market.