Cloud Computing – Centralized or Decentralized?
After this long excursion into history, let’s come back to our initial question: is cloud computing centralized or decentralized? Well, the answer is: both! Consider a simple web application: parts of it run decentralized in your browser (Ajax). The data may be stored in a single data center – centralized – but the database is replicated across different virtual machines – decentralized. The web application may make use of other services – decentralized – but provides its features via the same URL to thousands of users – centralized.
Does the question even matter?
Throughout the history of computing, the terms centralization and decentralization have been misused to reduce any form of technology-driven change to a simple fad (something similar happened in organisation theory). But to me the history of computers does not show an alternation between centralized and decentralized architectures; it shows that architectures have grown more complex and differentiated, fostering more specialization and abstraction, more dedicated software and hardware components, sophisticated layers, and specific solutions. Some components are centralized, others are decentralized – the distinction no longer makes sense for a system as a whole. Discussing the pros and cons of centralization can at best be done for individual components (like databases).
Cloud computing is the answer to the increasing demand – from enterprises as well as consumers – for ubiquitous information in a mobile world. It reflects new forms of communication and collaboration and goes far beyond any debate over the pros and cons of decentralized architectures.
This post is part of a series. Click on the links below to read the other parts.
[Part 1] The first computers
[Part 2] Closer to the users
[Part 3] Home computers and the Internet
[Part 4] Centralized or Decentralized?