Asaf Ezra is the Co-Founder and CEO of Granulate, a provider of real-time, autonomous computing workload optimization solutions.
Among the few silver linings to the Covid-19 pandemic was a 6% decline in energy-related carbon dioxide emissions in 2020, as the spread of the novel coronavirus led to more remote work, less commuting and a plunge in air travel.
But with emissions already returning to pre-pandemic levels, climate change will still have potentially devastating ramifications, absent a concerted push to achieve a sustained reduction in CO2 levels. Between the United States’ decision to rejoin the Paris Agreement and major automakers’ moves toward all-electric vehicles, there have been encouraging developments on this front in recent weeks.
Positive as these steps are, much more can be done to mitigate climate challenges, and all industries must do their part. Despite the longstanding focus on the automotive, manufacturing and agricultural industries, there's no pathway to a cleaner planet without addressing the sizable energy consumption of the data centers powering our increasingly digitized economy.
The New Oil?
By now, it’s a cliché that data is the new oil — the high-tech fuel of 21st-century economic progress. Unlike oil, we often think of data as having no physical form and, therefore, no environmental impact. This couldn’t be more mistaken.
According to the International Energy Agency, data centers consume approximately 200 terawatt-hours (TWh) of electricity annually, or nearly 1% of global electricity demand, contributing to 0.3% of all global CO2 emissions.
With big data exploding and computing needs swiftly growing, these figures are only expected to rise without proactive steps to reduce data centers’ energy consumption.
The desire for data-driven analytics to assist in decision-making, coupled with the accessibility of computing power, be it on-premises or in the cloud, has sent demand for servers skyrocketing. Tempting as it may be to point the finger at big tech, the truth is that users and enterprises of all sizes have had a hand in the increase in data centers' workloads.
How can businesses large and small do their part to drive down energy consumption without sacrificing the computing power needed to support innovation and deliver goods and services as promised? Building on-site renewable energy sources and partnering with green vendors are praiseworthy, indeed necessary, steps that reflect the serious investment tech companies are making to tackle the problem. Amazon recently became the world's largest corporate purchaser of renewable energy, adding new wind and solar projects every year, and other tech giants such as Google and Microsoft are not far behind. Creating clean sources of energy is critical, but no less important is rethinking how computing resources are allocated.
Underutilized Computing, Wasted Resources
Given how rapidly the demand for computing power has increased in recent years — with 2020 setting records as we become even more reliant on the internet in our personal and professional lives — it’s hardly surprising that the transition to a data center-driven economy has been replete with operational inefficiencies.
Indeed, a recent survey our company conducted of senior IT professionals at 100 companies spending nearly $1 million annually on cloud computing found that at more than half of these companies, CPU utilization sits between just 20% and 40%. These underutilized, partially idle servers continue to consume substantial amounts of energy, imposing unnecessary costs on businesses and contributing to tens if not hundreds of millions of tons of CO2 emissions.
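To see why low utilization translates into wasted energy, consider a rough back-of-envelope calculation. The sketch below uses illustrative assumptions (a 500-watt peak server that draws roughly half its peak power even when idle, and power scaling linearly with CPU load); the specific numbers are hypothetical, not drawn from the survey above.

```python
# Back-of-envelope estimate of energy wasted by underutilized servers.
# All constants are illustrative assumptions, not measured figures.

IDLE_FRACTION = 0.5   # assumed: an idle server draws ~50% of its peak power
PEAK_WATTS = 500      # assumed: peak draw of a typical rack server, in watts
HOURS_PER_YEAR = 24 * 365

def annual_kwh(utilization: float) -> float:
    """Approximate a server's yearly energy use, assuming power scales
    linearly from idle draw (0% CPU) up to peak draw (100% CPU)."""
    watts = PEAK_WATTS * (IDLE_FRACTION + (1 - IDLE_FRACTION) * utilization)
    return watts * HOURS_PER_YEAR / 1000  # convert watt-hours to kWh

# Three servers at 30% utilization vs. one server at 90% utilization
# handling the same aggregate workload:
low_util_kwh = 3 * annual_kwh(0.30)
consolidated_kwh = 1 * annual_kwh(0.90)
savings_kwh = low_util_kwh - consolidated_kwh

print(f"Three servers at 30%: {low_util_kwh:,.0f} kWh/year")
print(f"One server at 90%:    {consolidated_kwh:,.0f} kWh/year")
print(f"Potential savings:    {savings_kwh:,.0f} kWh/year")
```

Under these assumptions, consolidating the same work onto fewer, busier machines cuts annual energy use roughly in half, because the fixed idle draw is paid once instead of three times.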
Two solutions that companies have been implementing to address this problem are improved cooling systems and updated servers. Microsoft, for example, has demonstrated substantial reductions in cooling energy by submerging server racks in specially designed fluids, and its well-publicized Project Natick is testing the feasibility of underwater data centers "powered by offshore renewable energy."
New processors for data centers also play a key role, increasing computing power without increasing energy consumption, and the ongoing “arms race” between chipmakers bodes well for the continued improvement of the energy-computing ratio.
But software also has a role to play: AI-powered software can help companies better manage their infrastructure and maximize the utilization of their CPUs, delivering considerable energy savings in the process.
This efficient, eco-friendly method of improving CPU performance and utilization can also help enterprises keep up with the ever-rising demand for data processing — and, in fact, reduce the number of processors they need to perform the same amount of computing work or more. Optimized performance can also smooth the energy-heavy spikes that accompany increased traffic.
And lest we see 2020’s soaring demand as an exclusively pandemic-era phenomenon, it’s worth bearing in mind that in many cases — like with the rise of remote work and e-commerce — Covid-19 hasn’t started new trends; rather, it has dramatically accelerated pre-existing ones. The more digitized our economy becomes, the more we’ll rely on data centers to support it.
Fortunately, computing power and profits need not come at the planet’s expense.