Data centers and energy – Where’s the problem, where’s the solution?

In 2019, market researcher Gartner predicted that 85% of large enterprises would close their traditional data centers by 2025. It does not look like it will go quite that far, but the question is still being weighed seriously: energy and water consumption, new cooling technologies, uptime and backup are all factors that play into the trade-off between internal and external computing. (Frans Godden)

Data centers are an old phenomenon: their cradle stood in the 1940s, when the U.S. Army had ENIAC built, the first general-purpose electronic computer, developed during WWII primarily for artillery calculations and later used in early nuclear weapons work. Via mainframes, microcomputers and servers, we eventually arrived at today’s data centers, which run both on-premise (within company walls) and in the cloud and have become a crucial element of our modern society.

Part of the shift from internal to external has to do with energy consumption. According to think tank Energy Innovation, servers and cooling together account for nearly 86% of a data center’s electricity consumption, followed by storage (11%) and network (3%). It is ironic that ICT is often put forward as a way to save energy while its own electricity consumption has kept growing over the past few decades. Why? Think of data centers as the heart of the Internet – they process, store and communicate the data behind the services we use every day, from social media to scientific calculations. And they do so with an array of devices that all consume energy – servers, hard drives, network equipment and so on. Each of those devices also generates heat, so a data center must provide cooling as well, which in turn consumes energy.

Virtualizing

In the meantime, many companies have made the move to virtual servers: instead of a mass of physical machines that each require space and power, they run virtual servers that behave like physical ones but are in fact hosted together on a single physical machine via virtualization software, which can greatly reduce energy costs. Great gains can also still be made in storage; not infrequently, data centers hold 20 or more copies of the same data.
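
To get a feel for the kind of savings involved, here is a rough back-of-the-envelope sketch in Python; the server counts and wattages are purely illustrative assumptions, not figures from the article.

```python
# Rough estimate of the energy saved by consolidating lightly loaded
# physical servers onto fewer virtualization hosts.
# All figures below are illustrative assumptions, not measured values.

HOURS_PER_YEAR = 24 * 365

def annual_kwh(num_servers: int, avg_watts_per_server: float) -> float:
    """Annual electricity use of a group of servers, in kWh."""
    return num_servers * avg_watts_per_server * HOURS_PER_YEAR / 1000

# Before: 40 physical servers, each idling most of the day at ~200 W.
before = annual_kwh(num_servers=40, avg_watts_per_server=200)

# After: the same workloads packed as VMs onto 5 well-utilized hosts
# drawing ~450 W each (more per box, but far fewer boxes).
after = annual_kwh(num_servers=5, avg_watts_per_server=450)

print(f"Before consolidation: {before:,.0f} kWh/year")
print(f"After consolidation:  {after:,.0f} kWh/year")
print(f"Saving:               {1 - after / before:.0%}")
```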

Yet for some time we have seen a different trend: companies are getting rid of their own servers and renting capacity on cloud servers, which is often cheaper and easier because management is outsourced to specialists in those multi-tenant colocation centers. The economies of scale of such large data centers also allow them to be more energy efficient, especially in the case of hyperscale data centers. The latter are not welcome everywhere because of their size: just last year, Meta, the parent company of Facebook, had to abandon plans to build such a mega data center in Zeewolde, in the Netherlands, after local objections to the project’s power-hungry operations.

Submerge

One of the biggest problems for larger data centers is cooling, especially now that global warming is driving temperatures ever higher. Increasingly, cooling with air (air conditioning) or water appears to be no longer sufficient – and it consumes a lot of energy in its own right. And so liquid cooling, and especially liquid immersion cooling, is gaining ground. This involves completely submerging a server in a bath of a special non-conductive thermal fluid that can absorb up to 1,500 times more heat than air, without any cooling water or moving parts such as fans.
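
That “1,500 times” figure is essentially a statement about volumetric heat capacity: how much heat a cubic metre of the coolant can soak up per degree of warming. A small illustrative calculation, using rough textbook-style values assumed here (not taken from the article):

```python
# Why a liquid bath can absorb so much more heat than air:
# compare volumetric heat capacity (energy stored per m^3 per kelvin).
# The fluid properties below are rough, assumed values.

def volumetric_heat_capacity(density_kg_m3: float, specific_heat_kj_kg_k: float) -> float:
    """Heat stored per cubic metre per degree of temperature rise, in kJ/(m^3*K)."""
    return density_kg_m3 * specific_heat_kj_kg_k

air = volumetric_heat_capacity(density_kg_m3=1.2, specific_heat_kj_kg_k=1.0)
# A generic dielectric immersion fluid, roughly mineral-oil-like:
fluid = volumetric_heat_capacity(density_kg_m3=900, specific_heat_kj_kg_k=1.9)

print(f"Air:   {air:7.1f} kJ per m^3 per K")
print(f"Fluid: {fluid:7.1f} kJ per m^3 per K")
print(f"Ratio: about {fluid / air:,.0f}x")
```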

Another major energy guzzler in data centers also seems to be quietly on its way out: the diesel generator. As soon as the power supply to a data center fails, a diesel generator springs into action to bridge the interruption and keep all the computer equipment running. But it has quite a few drawbacks: it produces exhaust fumes, makes a lot of noise, consumes a fair amount of fuel and takes up a lot of space. So there are increasing efforts to replace generators with batteries and, in the near future, hydrogen fuel cells.

Better servers

Of course, new technological advances also make for more efficient equipment, in this case servers and networking gear. Research by the Uptime Institute found that 40% of the servers installed in data centers are more than five years old and consume 66% of the energy while doing only 7% of the work. The EPA, the US Environmental Protection Agency, has awarded the Energy Star label to energy-efficient equipment since 1992, and more and more companies let that standard guide their purchases of computer equipment.
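
Taken at face value, those Uptime Institute percentages imply a striking efficiency gap, as this small calculation shows; it simply divides the article’s shares of energy by shares of work.

```python
# How much less efficient are the old servers, taking the article's
# figures at face value: servers older than 5 years draw 66% of the
# energy but deliver only 7% of the work.

old_energy, old_work = 0.66, 0.07
new_energy, new_work = 1 - old_energy, 1 - old_work   # 34% and 93%

old_energy_per_work = old_energy / old_work   # ~9.4 units of energy per unit of work
new_energy_per_work = new_energy / new_work   # ~0.37

print(f"Old servers: {old_energy_per_work:.2f} energy per unit of work")
print(f"New servers: {new_energy_per_work:.2f} energy per unit of work")
print(f"Old servers are roughly {old_energy_per_work / new_energy_per_work:.0f}x less efficient")
```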

More and more data centers are also switching to renewable energy. Cloud providers in particular are keen on wind and solar power, either generated in-house or purchased through contracts with third-party providers. In late 2020, the European Commission even devoted a report to the theme, “Energy-efficient Cloud Computing Technologies and Policies for an Eco-friendly Cloud Market”. Among other things, it called for more efficient cooling systems, reuse of waste heat (for example to heat residential areas), virtualization software to make better use of servers, renewable energy, and building data centers in regions with colder climates.

What about the Internet of Things?

Looking at IoT applications in the energy sector, it is clear that sensors are the key element. They can be integrated at every stage of energy production, distribution and consumption to monitor and optimize crucial data in real time: sensors for temperature, but also for humidity, light or motion. IoT technology can also be used for real-time monitoring of energy consumption, so that production can be closely matched to actual demand. This is already widely used in smart cities, where IoT devices collect information from sensors and meters, analyze it and use that data to improve infrastructure and services: not only energy supply, but also transportation, waste management, air and water quality, and so on. Market researcher Statista expects investment in smart city infrastructure to exceed $100 billion by 2025.
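
Purely as an illustration of that idea (simulated meter readings and invented district names, no real smart-city API), a grid operator could aggregate consumption per district in near real time and compare it with current production:

```python
import random
from collections import defaultdict

# Toy illustration of real-time matching: aggregate simulated smart-meter
# readings per district and compare total demand with current production.
# All numbers and district names are invented for the example.

def read_meters() -> list[tuple[str, float]]:
    """Pretend to poll smart meters; returns (district, kW) pairs."""
    districts = ["north", "south", "harbour"]
    return [(d, random.uniform(800, 1200)) for d in districts for _ in range(5)]

def aggregate(readings: list[tuple[str, float]]) -> dict[str, float]:
    totals: dict[str, float] = defaultdict(float)
    for district, kw in readings:
        totals[district] += kw
    return dict(totals)

production_kw = 16000.0                      # assumed current output
demand = aggregate(read_meters())
total_demand = sum(demand.values())

for district, kw in demand.items():
    print(f"{district:>8}: {kw:8.0f} kW")
print(f"total demand {total_demand:.0f} kW vs production {production_kw:.0f} kW")
print("ramp up" if total_demand > production_kw else "ramp down / store surplus")
```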

And then there is the energy consumed by the IoT devices themselves. They often run on batteries, because they sit in places where no connection to the power grid is possible, sometimes for months or years at a time. When the battery runs low, they signal that it needs to be replaced. But some of these devices used in smart buildings and cities still consume far too much energy today, so the industry is busy looking for ways to reduce that consumption, for example by not powering sensors continuously but only switching them on when a measurement is needed, such as a room temperature reading.
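
That switch-it-on-only-when-needed approach is usually called duty cycling. The sketch below is a minimal simulated illustration in Python; the read_temperature() call and the timing values are invented stand-ins for a real sensor driver.

```python
import random
import time

# Minimal duty-cycling sketch: wake up briefly to take a reading, then
# switch the sensor off and sleep, instead of powering it continuously.

MEASURE_INTERVAL_S = 2.0     # on a real device this is often minutes or hours

def read_temperature() -> float:
    """Pretend sensor read; a real driver would power the sensor up and down here."""
    return 20.0 + random.uniform(-0.5, 0.5)

def duty_cycle(n_samples: int) -> None:
    for _ in range(n_samples):
        woke_at = time.monotonic()
        value = read_temperature()                 # sensor powered only during this call
        print(f"temperature: {value:.2f} C")
        awake_for = time.monotonic() - woke_at
        time.sleep(max(0.0, MEASURE_INTERVAL_S - awake_for))   # sensor and radio idle in between

if __name__ == "__main__":
    duty_cycle(n_samples=3)
```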

Then there are the zombies

No, this is not a joke: in almost every data center there are piles of zombies. A zombie or comatose server is a server that is still running in a data center but in fact no longer has any function and thus merely consumes energy and generates heat.

And it is not a new phenomenon: research by the Anthesis Group and Stanford revealed back in 2015 that some 30% of all physical servers were comatose. An expensive affair, because not only do they gobble up energy, they also give off heat, carry software licenses that still have to be paid for, and pose major security risks because they are no longer maintained and updated. Sometimes the zombies are initially kept as a backup, just in case, but after that they are usually simply forgotten.

According to Hewlett Packard Enterprise, more than 20% of the equipment in many data centers is sitting idle and might as well be switched off to save energy, cooling and space. Data Center Infrastructure Management (DCIM) software that can map all servers and detect and shut down zombies is therefore anything but a luxury.
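
Real DCIM suites work from monitoring agents and asset inventories; the toy sketch below only illustrates the underlying idea, flagging servers whose CPU and network activity have stayed near zero for weeks. The metrics and thresholds are invented for the example.

```python
from dataclasses import dataclass

# Toy zombie-server detector: flag machines that have shown almost no
# CPU or network activity over an observation window.

@dataclass
class ServerStats:
    name: str
    avg_cpu_percent: float      # average CPU utilization over the window
    avg_net_kbps: float         # average network traffic over the window
    days_observed: int

CPU_THRESHOLD = 2.0     # percent
NET_THRESHOLD = 5.0     # kbit/s
MIN_DAYS = 30

def is_probable_zombie(s: ServerStats) -> bool:
    return (s.days_observed >= MIN_DAYS
            and s.avg_cpu_percent < CPU_THRESHOLD
            and s.avg_net_kbps < NET_THRESHOLD)

fleet = [
    ServerStats("app-01", avg_cpu_percent=35.0, avg_net_kbps=1200.0, days_observed=60),
    ServerStats("legacy-07", avg_cpu_percent=0.4, avg_net_kbps=0.8, days_observed=90),
    ServerStats("backup-03", avg_cpu_percent=1.1, avg_net_kbps=2.5, days_observed=45),
]

for server in fleet:
    if is_probable_zombie(server):
        print(f"{server.name}: candidate for shutdown (verify ownership first!)")
```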

 

Source: TrendsTop
