With heat waves hitting three continents and global temperatures reaching record highs this summer, perhaps we need to pay more attention to “the cloud” and the environmental impact of computing.
The burning of fossil fuels such as coal, oil and gas is the largest contributor to climate change, but while that fact is often associated exclusively with cars, air travel and factory emissions, did you know that digital technology accounts for around 4% of global carbon emissions? Or that its energy consumption grows by around 9% annually?
It’s easy to imagine emissions from a factory or a car, but much less intuitive when it comes to someone developing software on their laptop. It’s a potentially huge blind spot for reducing emissions, and the public sector needs to hold the tech industry more accountable for the emissions produced by the infrastructure used to deploy software.
The environmental impact of computing on our planet
That developer’s laptop doesn’t work in a vacuum. It, and the thousands of servers in the thousands of data centers that make up the cloud, are also big producers of carbon emissions, exceeding the emissions of 22.2 million flights a year. Globally, the UK ranks third among countries in the number of data centers it hosts, with 456 in 2022.
While the UK is generating more electricity than ever from renewable sources, with 40% of its electricity coming from renewables in 2022, the government has admitted that its current net zero strategy “will fail to sufficiently reduce greenhouse gas emissions,” according to the Financial Times.
When these emissions are out of sight and out of mind, it is easy to ignore their impact. However, when we look at how energy is consumed in data centers, it is clear that this impact is significant.
According to Yale research in 2018, the cloud’s carbon footprint was estimated at more than 2% of global electricity production, and more recent data suggests it is now 3%. To put this in perspective, a single data center can consume as much electricity as 50,000 homes. Multiply that by the 456 data centers in the UK, or the 8,000 worldwide, and the magnitude of the impact becomes staggering.
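As a rough sketch, the figures quoted above can be multiplied out. All numbers are the article’s own estimates, and real data centers vary enormously in size, so this is back-of-envelope arithmetic, not a measurement:

```python
# Back-of-envelope arithmetic using the estimates quoted in the text.
HOMES_PER_DATA_CENTER = 50_000   # electricity use of one large data center
UK_DATA_CENTERS = 456            # UK count in 2022
WORLD_DATA_CENTERS = 8_000       # rough global count

uk_home_equivalents = UK_DATA_CENTERS * HOMES_PER_DATA_CENTER
world_home_equivalents = WORLD_DATA_CENTERS * HOMES_PER_DATA_CENTER

print(f"UK data centers ~ electricity of {uk_home_equivalents:,} homes")
print(f"Worldwide      ~ electricity of {world_home_equivalents:,} homes")
```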
Surprisingly, about 88% of that electricity is not even used for computation; it is used to keep the cloud running 24/7 through cooling and redundant security systems.
In short, “out of sight, out of mind” is not a viable strategy for meeting the UK Government’s ambitious net zero targets. Under the Paris Climate Agreement, the UK has committed to reducing emissions by 68% by 2030, just seven years away. That target is intended as a stepping stone to climate neutrality by 2050, i.e. cutting greenhouse gas emissions by 100% compared with 1990 levels, in line with the European Climate Law.
We need every tool at our disposal to reduce emissions, and an unlikely candidate worth considering is cloud-native container orchestration technology, specifically Kubernetes.
Cloud native can reduce environmental impact
Kubernetes, named after the Greek word for “helmsman” or “pilot,” is an open source container orchestration system. Like a ship’s captain, it takes charge of managing, or orchestrating, containerized cloud applications, ensuring they always have the right compute, networking, storage, and configuration.
It enables organizations to launch, terminate, update, and scale applications with better resilience, governance, security, and visibility, and at lower operational cost.
Viewed from an environmental perspective, however, Kubernetes has a less obvious advantage: it automatically and intelligently scales compute up or down based on what is actually needed, avoiding wasted resources. Capacity is never left idle, and never over-provisioned.
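As a rough illustration, the scaling rule documented for Kubernetes’ Horizontal Pod Autoscaler can be sketched in a few lines. This simplified model ignores tolerances, stabilization windows, and pod readiness, and the function name is mine:

```python
import math

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float,
                     min_replicas: int = 1,
                     max_replicas: int = 10) -> int:
    """Simplified HPA rule: scale replicas in proportion to observed load.

    desired = ceil(current * current_metric / target_metric),
    clamped to [min_replicas, max_replicas].
    """
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(min_replicas, min(max_replicas, desired))

# CPU at 100% against a 50% target: double the replicas.
print(desired_replicas(4, current_metric=100, target_metric=50))  # 8
# CPU at 10% against a 50% target: shrink to a single replica.
print(desired_replicas(5, current_metric=10, target_metric=50))   # 1
```

The key environmental point is the second call: when load drops, the replica count drops with it, instead of idle capacity burning electricity around the clock.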
AI and ML: the technology of the future and our changing environment
Given the current AI boom, this is significant, because AI/ML workloads are extremely demanding on hardware. And as mentioned above, a large share of the electricity consumed in data centers goes to cooling that hardware, with up to 40% spent on cooling alone. As AI grows in popularity, that number will only get worse.
Kubernetes could play an important role in alleviating this because, while it was not specifically designed as a “sustainability tool,” it is designed to manage hardware and avoid redundancy. It can optimize and streamline those AI workloads, or any workload, at scale.
Even something as basic as a software testing environment has a smaller impact when run on Kubernetes. The old “non-containerized” VM model forced companies to leave a test environment running all the time, even when no testing was happening. In contrast, Kubernetes lets you scale a test environment up and down as needed. This matters because more compute time spent developing or testing equals a larger carbon footprint.
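To make that concrete, here is a hedged back-of-envelope comparison of an always-on test environment against one that only runs during working-hours testing. The hours and the power draw are illustrative assumptions, not measurements:

```python
# Hypothetical numbers for illustration only.
HOURS_PER_WEEK = 24 * 7          # always-on VM test environment
TESTING_HOURS_PER_WEEK = 40      # environment only up while tests run
AVG_POWER_KW = 0.5               # assumed average draw of the environment

always_on_kwh = HOURS_PER_WEEK * AVG_POWER_KW
on_demand_kwh = TESTING_HOURS_PER_WEEK * AVG_POWER_KW
savings = 1 - on_demand_kwh / always_on_kwh

print(f"Always-on: {always_on_kwh:.0f} kWh/week")
print(f"On-demand: {on_demand_kwh:.0f} kWh/week")
print(f"Reduction: {savings:.0%}")
```

Under these assumptions the on-demand environment uses roughly a quarter of the energy; the exact figure matters less than the shape of the argument, which is that idle hours dominate.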
It matters “when” you adopt cloud native technology
It’s no secret that one of the reasons the UK public sector has struggled to digitally transform is the slow pace at which new technologies are adopted.
This is not just a problem of efficiency and innovation: when you adopt Kubernetes also affects your environmental impact. In my recent KubeCon presentation, “Reduce Your Environmental Impact,” my colleague Zinnia Gibson and I demonstrated this by asking the audience to take part in a little thought experiment:
Imagine two large-scale companies in the game development space with the same product. Company A uses Kubernetes, but Company B, which intends to use it in the future, is still using virtual machines.
Thanks to its autoscaling capabilities, the company using Kubernetes automatically consumes fewer data center resources by the very nature of its infrastructure design, while the alternative leaves far too much room for poorly managed resources at scale.
It’s also worth considering that any public sector organization that launches an app and sees its user base increase dramatically in number will inevitably need to scale that app. If you haven’t yet integrated Kubernetes into your workload, you now have to worry about training and infrastructure. More time and energy spent on computing means more emissions.
The fight for sustainability will not be resolved with a single solution
Of course, let’s be clear: none of this means that Kubernetes is “the only thing” that is going to defeat climate change. As one of my favorite voices in sustainability, Shelbi Orme, says: “You can’t do all the good the world needs, but the world needs all the good you can do.”
The fight for sustainability will not be won by a single solution, person or industry; we all have a role to play. The technology industry’s potential to make a significant impact cannot be ignored, and the public sector must think about computing in the same way. Kubernetes is just one potentially effective way to start clamping down on unnecessary computing inefficiency and, in turn, help reduce computing’s environmental impact and carbon emissions.
The best part is that public sector IT teams can now use free upstream projects to cut development time, costs, and environmental impact. There are many options, but here are a few favorites:
- GreenFrame: An open source tool that measures and reduces your website’s CO2 emissions by detecting carbon leaks.
- Prometheus: Not explicitly designed to track emissions, but it can help surface the metrics needed to spot when resources are being wasted.
- Microsoft Emissions Impact Dashboard: A tool explicitly focused on showing carbon emissions for Azure cloud usage.
Any responsible organization will pursue energy efficiency and renewables to reduce its carbon footprint (making offices more environmentally friendly, carbon sequestration, and so on). But with the climate outlook so dire, we have an obligation to use every tool we can. Kubernetes should be part of that mix.
This article was written and provided by Mary Karroqe, Software Engineer at D2iQ.