People, Planet, Clouds

Holly K Cummins
Jul 9, 2020 · 9 min read

The world is changing. The cloud gives us dazzling computational possibilities, and … potentially uses a lot of energy. As climate change accelerates, where do we, as engineers, fit in? Are we part of the problem or part of the solution? How do we balance the needs of people against the needs of the planet? Or can they be aligned?

The Situation

In 2019, back when plane travel was still a thing, I was on a plane from Berlin to London, reading Lean Enterprise. I recommend that to everyone — the reading part, not necessarily the plane part. Near the beginning of the book, there’s a section about how continuous integration is practiced in one large tech company. Between twenty and sixty code changes are submitted a minute. My first thought was “that’s the way to do CI/CD right!”. My second thought was “uh … how much energy do all those builds use?” If each change gets its own pipeline, that’s 40,000 builds a day. That’s a lot, by anyone’s standards.
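
To put a rough shape on “a lot”, here’s a back-of-the-envelope sketch. The build duration and the power draw of a build agent below are my own guesses, purely for illustration; the only number taken from the book is the rate of code changes.

```java
// Back-of-the-envelope only: the 10-minute build and ~200 W build agent are
// assumptions for illustration, not measurements.
public class BuildEnergySketch {
    public static void main(String[] args) {
        double changesPerMinute = 30;                     // middle of "twenty to sixty"
        double buildsPerDay = changesPerMinute * 60 * 24; // roughly 43,000 pipelines a day
        double buildHours = 10.0 / 60.0;                  // assume a 10-minute pipeline
        double buildAgentKw = 0.2;                        // assume a ~200 W build agent
        double kwhPerDay = buildsPerDay * buildHours * buildAgentKw;
        System.out.printf("%,.0f builds/day is roughly %,.0f kWh/day%n",
                buildsPerDay, kwhPerDay);
    }
}
```

Plug in your own numbers; with any plausible assumptions, the answer is not a rounding error.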

Why does it matter? Well, there’s a bit of climate trouble brewing. In the UK, where I live, temperature records continue to crumble; 2018 was the hottest summer in England, until 2019, which was hotter still. July 2019 was the hottest month on record for the planet.

The world is now about one degree Celsius hotter than in the “pre-industrial” period. Although one degree doesn’t feel like a huge shift, the impact is noticeable. Coastal cities are flooding more often, and these floods are potentially ruinous. Some island nations, such as the Marshall Islands, may find their whole country underwater.

The Marshall Islands. Photo: Department of Foreign Affairs and Trade

Wildfires are getting more common and more severe. Hurricanes are getting more destructive, droughts are getting drier. Species are going extinct. So far, so bleak, but what does it have to do with us?

Computation and Climate

Our industry, in general, measures its success by doing more. More users, more cores, more RAM and, if you’re a supercomputer, more petaflops. But all that more comes with a cost. There are a lot of computers in the world, and they use a lot of electricity. So much, in fact, that 1% of the world’s electricity is used by data centres. Once you add in the rest of the digital ecosystem, the computers we program use around 2% of the world’s annual energy consumption, comparable to air travel.

What To Do About It

Awareness

So what should we be doing? Although it’s “just talk”, talking to our friends and colleagues about climate change is an important first step. I don’t necessarily advocate becoming a climate bore at dinner parties, but this isn’t a subject on which we should be silent. Climate change is too big a problem for individual action alone; governments need to drive broad change with big policies. Making this happen — or even getting it onto government agendas — needs voters and consumers to be talking about climate change.

This spring, the IBM Garage (including me!) worked with a startup called The Climate Service, who use sophisticated science to quantify the financial impacts of climate change. The intention is to allow businesses to understand how the changes in the physical world will affect their assets and investments over the next century.

Analysing climate risk supports better decision-making, and is a regulatory requirement for publicly traded firms in some markets. Even though the Garage team were climate-aware, we were still surprised by some of the implications of climate change: “Is that 2030 flood risk graph for Tokyo correct?!” “Oh. Wow. It really is that big.” I do believe that seeing the concrete implications of current trends has the power to create positive change.

If you’re a full-stack developer or a designer or a data scientist or someone else who uses computers for communication, even if it’s not your day job, you have skills you can bring to the conversation — you can visualise things. Although it’s not directly climate-related, I loved the glowy earth earthquake visualisation glitch, and I’m keen to see other data sets mapped in a similar way. It’s open-source, so anyone can riff and remix.

Understanding

Improving public understanding of climate change is one area where everyone can help, but our profession can also help advance the scientific understanding. Data access is one of the tougher problems in climate science, and the cloud democratizes access to data. (Remember that data science is 80% finding and cleaning data, and 20% complaining about cleaning the data.)

Once the scientists get the data in their hands, there are models to define and run. Models, by their nature, can always be improved, either with higher granularity, or new policy simulations, or clever-er math.

This is complex stuff, and there’s an opportunity to find new modeling techniques to handle rare events. There are also new frontiers in AI and data management. Machine learning can help synthesize different datasets, streaming and event-driven technologies can ingest data in near real-time, and human-centred design can ease user access to data.

Toleration

Computer folks also have a role to play in helping mitigate the impact of climate disasters. Initiatives like Call For Code give a structured opportunity for small groups to contribute solutions that help communities become more resilient. For example, in 2018 the winning idea was solar-powered mesh devices that could set up a network where there was none. At a larger scale, projects like the GRAF weather forecasting system put information in the hands of farmers, so they can get advance warning of unusual rainfalls or hot spells and avoid crop loss.

Waste reduction

Even in our everyday work activities, we all have a climate impact. We can influence how much energy — and what kind of energy — our companies use. Although data centres are a significant energy consumer, that doesn’t mean we should start running all our workloads on servers under our home office desk, or on our employer’s premises. A computer in a data centre will generally run more efficiently than its cousin in a server room, particularly if it’s a hyperscale data centre. Some data centres are going even further in their push for efficiency. Skipping fans and immersing chips directly in coolant seems weird, but it’s seriously efficient. The industry is also experimenting with submersible data centres and mid-river ‘self-powered’ data centres (the river provides hydro-electric power plus water-cooling, and solar panels make up the rest).

All of this is good news for those of us who love the cloud. Because of the energy economies of scale, it’s a good place to run workloads. It’s also usually cheaper and easier. This is what Project Drawdown calls a no-regret solution.

That doesn’t mean we can run any old workload without regrets. Some workloads are inefficient-by-design. At the time of writing, a single bitcoin transaction uses the same amount of energy as 41,837 hours of watching YouTube. Because of its algorithmic reliance on “proof of work”, bitcoin’s annual energy consumption is equivalent to that of Greece.

Other workloads are accidentally inefficient. Performance improvements like caching and streamlined algorithms can reduce energy usage and also tighten feedback cycles. For example, if you’re using Java, consider Quarkus. It uses about a tenth of the memory of a traditional cloud-native Java stack, and it starts a staggering two hundred times faster. That’s delightful for developers, but it also means a single cloud server can run more Quarkus instances, and starting and stopping a Quarkus app uses far less energy. (If your workload is one of the ones Quarkus is suitable for, this is another no-regret solution).
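
For a flavour of what that looks like in practice, here’s a minimal sketch of a Quarkus REST endpoint, assuming the quarkus-resteasy extension is on the classpath. The package, class, and path names are just illustrative.

```java
package org.example;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// A minimal Quarkus REST endpoint. The code is ordinary JAX-RS; the memory
// and start-up savings come from Quarkus doing its framework wiring at build
// time (and, optionally, compiling to a native executable).
@Path("/hello")
public class GreetingResource {

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String hello() {
        return "hello";
    }
}
```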

There are two cautions when it comes to efficiency as a solution to climate change. The first is the highway problem. Often, increased capacity just leads to greater usage (“induced demand”). When they widen roads to help traffic, infrastructure planners usually imagine they’re building an empty six-lane highway.

In reality, soon there’s just a crowded six-lane road instead of a crowded two-lane road. Not an environmental win.

We see the same phenomenon with our home broadband, and to some extent with data centres. Low power chips and solid-state drives have improved data centre efficiency by as much as 80%. Other gains have come from better-planned ventilation. One 2005 study found only 40% of cooled air in a data centre got to the computers it was intended for. The rest was just … wasted. That’s fixed now, and modern data centres are far more efficient than their predecessors. Nonetheless, ballooning demand means that data centre energy usage is still increasing year-on-year.

So what’s the second efficiency caution? There’s a risk of focussing on micro-optimisations rather than meaningful change.

My work requires some travel — or at least, we all thought it did, before the lockdown. Where possible I take the bus or train — or, in the complicated cases, bus and train — to get to and from the airport. In some cities public transport is a breeze, but sometimes it’s far more confusing, error-prone, and time-consuming than a car would have been. A few times, when I’ve been walking alone, I’ve been followed, which is very scary. What with the personal inconvenience and the fear of being murdered, I do feel a bit proud and like I’m doing a Good Thing with my transport choices. What I forget (it’s probably obvious to other people) is that it doesn’t much matter what I do at the beginning and end of a trip, if the middle is a plane journey. That doesn’t mean I should give up on public transport and switch to taxis, but it does mean I mustn’t fool myself that saving one taxi ride somehow ‘makes up for’ my flight.

Jeff Atwood calls this micro-optimisation theatre, and it’s a concept which should be familiar to developers. It’s not that making things better is bad, it’s that we mustn’t let making small things better distract us from making big things better. The advice for energy efficiency is the same as for performance optimisation in general: measure, don’t guess. Be data-driven, prioritise the biggest wins, and focus on measuring outcomes rather than measuring effort.
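
As a small, concrete illustration of “measure, don’t guess” in Java, here’s a sketch of a microbenchmark using JMH (the Java Microbenchmark Harness). The class, method, and workload are invented for illustration; the point is to measure the suspect code path before and after a change, rather than assuming the change helped.

```java
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@State(Scope.Benchmark)
public class SuspectCodeBenchmark {

    private String payload;

    @Setup
    public void setUp() {
        // Build a representative input once, outside the measured region.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10_000; i++) {
            sb.append(i).append(',');
        }
        payload = sb.toString();
    }

    @Benchmark
    public int suspectCodePath() {
        // Stand-in workload: replace with the code you believe is slow or
        // energy-hungry, and compare runs before and after your "optimisation".
        return payload.split(",").length;
    }
}
```

The same discipline applies at larger scales: measure the energy or cost of the whole system, not just the corner you happened to optimise.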

When we look at the big picture, individual and team efficiencies may all just be micro-optimisations. The big decarbonisation levers will need big pushes, which means institutional will and investment. We may also need to adjust how we define progress. Instead of doing more with less, it may be we need to do less with less — as a society.

IBM Garage is built for moving faster, working smarter, and innovating in a way that lets you disrupt disruption.

Learn more at www.ibm.com/garage
