The question of whether an architecture should be centralized or decentralized is a recurring one in systems design, and it is also a subject of discussion in organizational and political theory. In most enterprises, and especially in IT departments, technical and organizational aspects coincide and influence each other. To understand the possible impact of cloud computing on the structure of organizations, we would like to figure out whether cloud computing actually follows a centralized or a decentralized architecture. Let's therefore start with a journey through history – the history of cloud computing is nothing less than the history of computers and computer networks – and their alternation between centralization and decentralization.
The First Computers in Use
The first computer in effective use (ENIAC) occupied the space of a large building (around 300 m2), was targeted at very specialized military applications, and had all of its instructions hardcoded and controlled by 6000 manually operated switches. The next computer (EDVAC) became more flexible, more controllable, and less error-prone through the use of punch cards. The next one (MIT's Whirlwind) was the first with a rudimentary user interface (a kind of radar screen). Until then, computers were large, extremely expensive, and specialized, and only one organization in the world could use them: the military.
Giant monolithic building blocks in a fixed location. The first computers were, without any doubt, centralized.
Continue [Part 2]