Businesses rely no less on electricity than on IT. Yet corporations don’t need a “Chief Electricity Officer” and a staff of highly trained professionals to manage and integrate electricity into their businesses. Does the historical adoption of electricity offer a useful analogy for today’s innovations in cloud computing?
While the utility model offers some insights, we must go beyond this simple analogy to understand cloud computing’s real challenges and opportunities. In a CACM Viewpoint article, we (Erik Brynjolfsson from MIT Sloan, John Jordan from Penn State, and I) give six reasons why IT will not be like electricity, contrary to what Nicholas Carr claims in his book The Big Switch.
Technical Weaknesses of the Electricity Model
- The Limits of Scale
- Pace of Innovation
- Latency: Distance is not Dead
The Business Model of the Cloud Goes Beyond Electricity
- Complementarities and Co-Invention
- Lock-In & Interoperability – the flip side of innovation
- Security
Limits of Scale
The RDBMS – the middleware for every ERP system (and also for Salesforce CRM) – doesn’t scale; scalable storage with an API as rich as SQL is still an unsolved research problem. PIQL, from the RAD Lab at UC Berkeley, is a promising attempt to solve this problem with machine learning. PIQL offers the (SQL) query programmer predictable performance on distributed data storage, e.g., an upper bound on query response time.
Further, cloud providers do not yet offer SLAs (service level agreements) with predictable performance the way electricity providers do. Stress tests conducted by Sydney-based researchers revealed that the infrastructure-on-demand services offered by Amazon, Google, and Microsoft suffer from regular performance and availability issues. Performance rises and falls with load, and there are huge variations in the measured standard deviation across the different providers of cloud data storage; see the SIGMOD benchmarking paper by Donald Kossmann et al. The benchmarked cloud services (AppEngine, MS Azure, AWS RDS, etc.) did not scale linearly the way electricity would.
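The mean alone hides this variability; it is the spread of response times that the benchmarking studies flag. A minimal sketch of that kind of measurement, with made-up latency samples (the provider names and numbers are illustrative assumptions, not measured data):

```python
import statistics

# Hypothetical latency samples (ms) for two cloud storage providers.
# Similar means can hide very different variability under load.
provider_samples = {
    "provider_a": [102, 98, 310, 95, 101, 250, 99],   # occasional spikes
    "provider_b": [120, 118, 122, 119, 121, 123, 120],  # steady but slower
}

for name, samples in provider_samples.items():
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    print(f"{name}: mean={mean:.0f} ms, stdev={stdev:.0f} ms")
```

Provider A has the better median request, but its large standard deviation is exactly the kind of unpredictability an electricity-style SLA would rule out.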
Composition of SLAs – what is the overall performance guarantee if provider A offers performance X and provider B offers performance Y? – is yet another unsolved research question.
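To see why composition is hard, consider the naive arithmetic for two serially chained services (the numbers below are illustrative assumptions): availabilities multiply and mean latencies add, but percentile guarantees – the form real SLAs actually take – do not compose this simply, which is part of why the question remains open.

```python
# Naive composition of two serially chained providers (toy numbers).
# This works for availability and MEAN latency only; tail-latency
# (e.g., 99th percentile) SLAs do not compose by simple arithmetic.

def compose_serial(avail_a, avail_b, mean_lat_a, mean_lat_b):
    """Combined availability multiplies; combined mean latency adds."""
    return avail_a * avail_b, mean_lat_a + mean_lat_b

avail, latency = compose_serial(0.999, 0.995, 0.050, 0.120)
print(f"combined availability: {avail:.4f}")        # 0.9940
print(f"combined mean latency: {latency*1000:.0f} ms")  # 170 ms
```

Note that the composed system is weaker than either provider alone: chaining a "three nines" service behind a "two-and-a-half nines" service yields roughly 99.4% availability.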
Pace of Innovation
Innovation in electricity generation and distribution happens on the scale of decades or centuries. In contrast, Moore’s Law is measured in months: in 1976, the basic computational power of a $200 iPod would have cost one billion dollars. IT reinvents itself regularly. Cloud data centers are big investments that may be rendered obsolete after one or two cycles of Moore’s Law (18 to 24 months). A SaaS provider with hardware twice as fast could quickly push incumbent providers running last-generation hardware out of business. The reason we do not yet see this replacement process at large scale is that existing cloud providers are small compared to corporate IT, and cloud computing is still in its hyper-growth phase. For example, a typical ERP installation for a Fortune 500 company is bigger than all of Salesforce combined. Chevron’s IT is more than three times as big as, for example, AWS (Amazon Web Services). Google uses very cheap commodity hardware to mitigate the fast pace of innovation as well as possible. IDC predicts that by 2012 less than 10% of worldwide IT spending will go to cloud computing.
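As a sanity check, the iPod figure from the text lines up with the 18-to-24-month cycle (taking roughly 2006 as the comparison year is an assumption for illustration):

```python
import math

# Consistency check of the figures above: a $200 iPod's compute would
# have cost $1B in 1976. How many Moore's-Law doublings is that?
ratio = 1_000_000_000 / 200       # 5,000,000x price-performance gain
doublings = math.log2(ratio)      # ~22.3 doublings
years = 2006 - 1976               # assumed comparison span

months_per_doubling = 12 * years / doublings
print(f"{doublings:.1f} doublings over {years} years "
      f"= one every {months_per_doubling:.0f} months")
```

That works out to a doubling roughly every 16 months, in the same ballpark as the 18-to-24-month cycle cited above.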
Distance is not Dead
One of the few immutable laws of physics is the speed of light. As a result, latency remains a formidable challenge. In the network realm, the demand for nearly instantaneous execution of machine-to-machine stock trades has led financial-services firms to locate their data centers as physically close to the stock exchanges as possible. Typically, the data needed for FDA approval of a drug is not sent over the network; instead, the computers involved are trucked to the FDA.
For many classes of applications, performance, convenience, and security considerations will dictate that computing be local. Moving data centers away from their customers may save on electricity costs, but those savings are often outweighed by the costs of latency.
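A back-of-envelope calculation makes the physics concrete (the distances below are illustrative assumptions):

```python
# Lower bound on round-trip time imposed by the speed of light in
# optical fiber (~2/3 of c); real networks add routing and queuing
# delays on top of this floor.
C_FIBER_KM_S = 200_000  # km/s, approximate signal speed in fiber

def min_rtt_ms(distance_km):
    """Physics floor on round-trip time over fiber, in milliseconds."""
    return 2 * distance_km / C_FIBER_KM_S * 1000

print(min_rtt_ms(5))     # colocated near an exchange: ~0.05 ms
print(min_rtt_ms(4000))  # roughly coast-to-coast: ~40 ms
```

No amount of hardware spending can buy back those 40 ms, which is why high-frequency traders pay a premium for racks next door to the exchange rather than cheap capacity in a distant data center.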
Important as the technical differences are between electricity and cloud computing, the business model differences are even more profound.
Complementarities and Co-Invention
Like electricity, IT is a general-purpose technology. This means that critical benefits come from the co-inventions that the basic technology makes possible. It took 30 to 40 years for the full benefits of electricity to redound to America’s factories. Initially, assembly lines and production processes were not redesigned to take advantage of electricity: large central steam engines were simply replaced with large electric motors, which were then hooked up to the same old crankshafts and cogs. Only with the reinvention of the production process was the potential of electrification realized. Firms that simply replace corporate IT resources with cloud computing, while changing nothing else, are doomed to miss the full benefits of the new technology.
The opportunities, and risks, from IT-enabled business model innovation and organizational redesign are reshaping entire industries, as Erik Brynjolfsson and Adam Saunders point out in Wired for Innovation: How IT is Reshaping the Economy. For instance, Apple’s transition from a perpetual-license model to the pay-per-use iTunes store helped it quadruple revenues in four years. The tight integration between Apple’s ERP system and the billing engine handling some 10 million sales per day would have been difficult, if not impossible, in the cloud. Remember, in the cloud all companies would use the same ERP instance; extensibility and customization are very restricted for cloud apps.
Lock-In and Interoperability
Lock-in issues with electricity were addressed long ago by regulation of monopolies, and later by the legal separation of generation from transmission and the creation of market structures. Markets work because electrons are fungible. The rotary converter that enabled interconnection of different generating technologies in the 1890s has no analog for the customer of multiple cloud vendors, and won’t anytime soon. The essence of cloud computing’s cost advantage is statistical multiplexing, which is lost when moving data between cloud providers.
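Why statistical multiplexing is the cost advantage can be seen in a toy provisioning model (all numbers are illustrative assumptions): independent workloads pooled together need far less headroom than the same workloads provisioned separately, because the pooled standard deviation grows only as the square root of the number of workloads.

```python
import math

# Toy model: n independent workloads, each with mean load mu and
# standard deviation sigma, provisioned at mean + 3 sigma headroom.
def capacity(n, mu, sigma, pooled):
    """Capacity needed for n workloads, separately or pooled."""
    if pooled:
        # Pooled variance adds, so pooled stdev is sigma * sqrt(n).
        return n * mu + 3 * sigma * math.sqrt(n)
    return n * (mu + 3 * sigma)

n, mu, sigma = 100, 10.0, 4.0
separate = capacity(n, mu, sigma, pooled=False)  # 2200 units
shared = capacity(n, mu, sigma, pooled=True)     # 1120 units
print(f"pooling saves {100 * (1 - shared / separate):.0f}% capacity")
```

In this sketch pooling cuts the provisioned capacity nearly in half, and that saving evaporates for a customer who fragments their workload across providers to avoid lock-in.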
The security concerns with cloud computing have no electricity analog. No regulatory or law enforcement body will audit a company’s electrons, but processes related to customer data, trade secrets, and classified government information are all subject to stringent requirements and standards of auditability.
The typically shared and dynamic resources of cloud computing (CPU, networking, etc.) reduce the user’s control and pose severe new security issues not encountered by on-premise computing behind firewalls.
If the utility model were adequate, the challenges to cloud computing could be solved with electricity-like solutions — but they cannot. The reality is that cloud computing cannot achieve the plug-and-play simplicity of electricity, at least not as long as innovation, both within cloud computing itself and in the myriad applications and business models it enables, continues so rapidly.
The real strength of cloud computing is that it is a catalyst for more innovation. In fact, as cloud computing continues to become cheaper and more ubiquitous, the opportunities for combinatorial innovation will only grow. It is true that this inevitably requires more creativity and skill from IT and business executives. In the end, this is not something to be avoided; it should be welcomed and embraced.