Cloud computing services have been all over the news recently, and although products such as Apple's iCloud have drawn plenty of criticism, most will agree that many of the leading trends in the computing industry can be traced back to one model: computing delivered as a utility. The most compelling and thought-provoking comparison I have read on the subject is in Nicholas Carr's book The Big Switch, in which he walks us through the evolution of electricity from private generation to mass distribution.

In the nineteenth century, any business wishing to power its plant had to buy and run its own generator built around Edison's dynamos. Late in the century, Thomas Edison realised that electricity could be outsourced to power stations capable of supplying an entire block with life-giving power. Edison, however, never saw beyond this; he believed the profit lay in manufacturing and selling the components for those dynamos. It took a man named Samuel Insull to realise the potential of electrical power distributed as a utility. He discovered that power could be generated and distributed in bulk far more efficiently, and at a fraction of the cost of Edison's dynamos, and in a very short time he was powering the majority of the businesses in Chicago from one huge power station. Those companies had realised that by avoiding the purchase of expensive equipment they could cut their fixed costs and stop worrying about maintenance or obsolescence. Electricity as a utility had triumphed over the private power plant.

Today we can see the same process playing out in the computing industry. The personal computer era, driven largely by Microsoft, saw the rise of the client-server model of computing: through office-based data-centres, workers on personal computers can reach the network, files, printers and other services. Hugely popular as this model became, it is also massively inefficient. Much of the waste comes down to the lack of industry standards, with hardware and software companies competing by selling products that do not interact well with their rivals' versions. The result is machines dedicated to single purposes, such as databases or web-servers, so that in order to host a new application or service a company must buy yet another machine to dedicate to it. Furthermore, each of these computers must be provisioned for its peak demand, even if that peak is rarely reached. My most recent employer was suffering badly from this: a recent review estimated that under 40% of the total processing power was in use over 90% of the time (there is a rough sketch of this arithmetic at the end of this post). In short, client-server computing has thrown away the economical use of processing power that earlier generations of shared machines enjoyed.

The parallels with the electricity era are obvious: the self-contained data-centres that businesses build are just as wasteful and expensive as Edison's private dynamos. From this comparison it is easy to wonder why computing as a utility did not win out from the start. The answer is that it had always been held back by a lack of sufficient bandwidth, in much the same way that electrical power distribution was once limited by direct current.
Now that network speeds are sky-rocketing and bandwidth is getting cheaper, the power of computers can be delivered to users over great distances, ushering in the age of utility computing.

Against that description, it is difficult to see how Apple's new service fits the cloud computing model. With iCloud you get little more than online storage for your music; in order to listen, you are still required to download your tracks to a physical device. Essentially they are providing outsourced storage, yet hard drives are cheaper than they have ever been and storage is now the least expensive part of computing. Microsoft and other manufacturers have been following the same, seemingly unprofitable, business model.

I believe it is plain to see that the future lies in outsourcing the whole deal, and above all the processing element of a system. This is already happening, but we have clearly not crossed over to this model yet, and given current mind-sets I don't think we are quite ready to leave the client-server model behind. That said, I don't think it will be long before our home PCs consist of barely more than a monitor, reaching into the cloud for any service we require; we just need a large corporation to give the industry a shake. Google have already attempted proper cloud computing like this with their new OS, and it has been experimented with on a few other platforms. However, large companies such as Apple and Microsoft are doing nothing to facilitate this jump. Personally I see these companies as the Edisons of the digital world: they came up with a good idea, but they are unwilling to push it on to the next logical step, and their concern over the potential obsolescence of their current products (such as Windows or iTunes) is holding the industry back and costing us all money.
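As a footnote, here is a rough back-of-the-envelope sketch, in Python, of the utilisation argument made above. The three workloads and every number in them are invented purely for illustration; the point is simply that machines sized for their own peaks sit mostly idle, while a shared pool sized for the combined peak does not.

```python
# Hypothetical sketch: why dedicated servers sit idle compared with a shared pool.
# All workload numbers are made up for illustration only.

# Hourly CPU demand (as a fraction of one machine) for three services
# whose busy periods happen not to overlap.
web     = [0.9 if 9 <= h < 17 else 0.1 for h in range(24)]   # busy during office hours
reports = [0.8 if h < 6 else 0.05 for h in range(24)]        # overnight batch jobs
backups = [0.7 if 20 <= h < 23 else 0.05 for h in range(24)] # evening backups

services = {"web": web, "reports": reports, "backups": backups}

# Dedicated model: each service gets a machine sized for its own peak.
dedicated_capacity = sum(max(load) for load in services.values())
total_work = sum(sum(load) for load in services.values())
dedicated_util = total_work / (dedicated_capacity * 24)

# Pooled (utility) model: one shared pool sized for the combined peak.
combined = [sum(load[h] for load in services.values()) for h in range(24)]
pooled_capacity = max(combined)
pooled_util = total_work / (pooled_capacity * 24)

print(f"Dedicated machines: {dedicated_capacity:.1f} peak units, "
      f"{dedicated_util:.0%} average utilisation")
print(f"Shared pool:        {pooled_capacity:.1f} peak units, "
      f"{pooled_util:.0%} average utilisation")
```

On these invented numbers the dedicated machines average roughly 30% utilisation while the shared pool runs above 70%, which is essentially the bet that utility computing providers are making.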

