Cloud computing is an attractive proposition for any organisation not wanting to manage its data in-house, for several reasons:
Reduction in up-front capital investment in infrastructure.
New cloud-based servers can be up and running in minutes, rather than weeks.
Flexibility, as companies need only pay for the capacity they actually use.
Elimination of the cost of maintaining spare capacity for occasional spikes in demand.
Increased automation and process efficiency.
Added levels of service and technological expertise.
All this, plus the fact that carbon emissions can be drastically reduced.
The Carbon Disclosure Project (in a report produced by independent research firm Verdantix and funded by AT&T) predicts that cloud computing could save as much as 85.7 million metric tons of carbon emissions per year within a decade. In addition, businesses could save up to $12.3 billion (£7.55bn).
The study involved 11 global companies, representing several different sectors. The researchers used various financial and carbon-reduction models.
But the good news doesn’t stop there.
The CLEER View
A typical organisation has more servers than it needs, to cover backup, failures, and spikes in demand. The “green-ness” of any cloud option will depend on several factors:
How big is the data centre being replaced/removed?
How well-designed is the organisation’s data centre?
How does the cloud host’s cooling procedure compare with the organisation’s own facilities?
Does migration to the cloud leave a load of electronic waste to dispose of, or recycle?
Most reductions in carbon emissions occur because a cloud service has a much more efficient data centre than private facilities. Some cloud hosts are located in colder climates, which offer natural cooling benefits.
To give a clearer picture, Berkeley Lab has created the CLEER model, to help IT executives determine how much energy they could save by moving to a cloud service. The model provides an estimate; results depend on the size of the organisation, its business model, and other factors.
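The full CLEER model is far more detailed, but the shape of the estimate can be sketched with a back-of-the-envelope calculation. Every figure below – server counts, power draw, PUE (Power Usage Effectiveness) values, utilisation – is a hypothetical assumption for illustration, not an actual CLEER output.

```python
# Illustrative energy-savings estimate for a cloud migration.
# All numbers here are hypothetical assumptions, not CLEER results.

HOURS_PER_YEAR = 24 * 365

def annual_energy_kwh(servers: int, watts_per_server: float, pue: float) -> float:
    """Total facility energy: IT load scaled by PUE
    (PUE = total facility energy / IT equipment energy)."""
    it_load_kwh = servers * watts_per_server * HOURS_PER_YEAR / 1000
    return it_load_kwh * pue

# On-premises: 100 lightly utilised servers in an inefficient room (PUE 2.0).
on_prem = annual_energy_kwh(servers=100, watts_per_server=300, pue=2.0)

# Cloud: the same workload consolidated onto 20 well-utilised servers
# in an efficient data centre (PUE 1.2).
cloud = annual_energy_kwh(servers=20, watts_per_server=300, pue=1.2)

savings_pct = 100 * (on_prem - cloud) / on_prem
print(f"On-prem: {on_prem:,.0f} kWh/year")
print(f"Cloud:   {cloud:,.0f} kWh/year")
print(f"Savings: {savings_pct:.0f}%")
```

With these (made-up) inputs, consolidation plus a better-cooled facility cuts energy use by roughly 88% – which is why the two questions that dominate any real estimate are server utilisation and data-centre efficiency.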
Against the Machine
Virtual machines (VMs) are servers running in the cloud – bits of software that behave like a real machine. Until now, they’ve been the mainstay of the cloud computing services offered by Amazon, Google, Microsoft, and others. Virtual machines are also the basis of the (costly, inefficient) data centres operated by so many private companies.
There has to be a better way. And now, there is.
Zack Rosen (of online service and website publishing platform Pantheon) believes that we could cut carbon dioxide emissions by more than electric cars do, using a cloud technology built into the open source Linux operating system: containers.
A container is a means of encapsulating software – wrapping it in discrete packages, isolated from other software running on a computer’s OS. Using a container format that runs on many operating systems, you can easily move applications from machine to machine. In the world of cloud computing, where software is spread across hundreds (even thousands) of servers, this is mission-critical.
Containers also provide “resource isolation.” You can carefully control how much of a machine’s processing and memory resources get allocated to a particular container. In this way, you can squeeze many more applications onto the same machine.
True, you can do something similar with virtual machines. But that requires loading multiple VMs onto a server, each running its own operating system. With containers, you can achieve resource isolation with a single OS – which means less processing overhead, lower space requirements, and less money.
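To see why packing matters, here is a toy first-fit bin-packing sketch – my own illustration, not how any real cloud scheduler works. Each application is given a hypothetical CPU demand (as a percentage of one machine); with one VM per application you need one machine each, while packed containers share machines up to capacity.

```python
# Toy illustration of container packing versus one-VM-per-app.
# App loads are hypothetical percentages of a machine's CPU.

def machines_needed(app_loads, capacity=100):
    """First-fit bin packing: place each app on the first machine with room."""
    machines = []  # remaining free capacity per machine
    for load in app_loads:
        for i, free in enumerate(machines):
            if load <= free:
                machines[i] -= load
                break
        else:
            machines.append(capacity - load)  # provision a new machine
    return len(machines)

# Ten applications, each using 10-30% of a machine's CPU.
apps = [10, 30, 20, 10, 25, 15, 30, 20, 10, 20]

one_vm_per_app = len(apps)             # naive: a dedicated VM per application
containerised = machines_needed(apps)  # containers packed onto shared machines

print(f"One VM per app: {one_vm_per_app} machines")
print(f"Containerised:  {containerised} machines")
```

With these made-up loads, ten dedicated VMs collapse onto two well-packed machines – the kind of consolidation that makes the single-OS approach cheaper and greener.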
It’s Good Enough for Google
On the Linux OS, Google pioneered this kind of resource isolation, with a container tool called “cgroups.” Simply by using this across the globe in its online operations, the company believes it has saved the cost of building an entire data centre.
In recent months, a start-up company called Docker has made the technology easier to manage, reshaping containers so that companies and developers can more readily move them from machine to machine. Google has responded by offering to run Docker containers on its cloud services (Google Compute Engine and Google App Engine) – which could significantly expand their use.
Google’s cloud services run Docker containers sitting on virtual machines. The VMs are still required to ensure that services can run software from many different companies without letting data leak between them. Containers do have safeguards against this, but they have yet to provide the level of security you get with the more established virtual machine technology.
This is the main reason why no major cloud service has (yet) abandoned VMs in favour of containers. We’re still a long way from the completely “containerised” cloud.
In June 2014, Rackspace introduced a cloud service that works without VMs. However, each machine offered by the service only runs software from a single customer. That gets around the security issue, but can’t achieve the efficiency you’d get by carefully packing everyone’s software containers into one enormous cloud service.
A Step Closer to the Ideal?
Give it time.
Once the security quibble is solved (as it inevitably will be), and virtual machines are cut out of the equation entirely, even more computing power can be saved. And, with it, the planet. Kind of.