In a gray building in the middle of the campus on Leobener Straße lies the technical heart of the University of Bremen: the Green-IT Housing Center. Every lecture video on Stud.IP and every file saved on Seafile resides somewhere on the servers in this nondescript building.
Inside, large computer clusters calculate the latest models for climate researchers, whilst two meters away, the most recent student data is being transferred from the campus management system to identity management and e-learning.
How the Green-IT Housing Center Works
“If you don’t hear from us, everything is running smoothly,” says Kevin Dehmlow, shortly after typing the security code into the combination lock. He has been working as an IT specialist for the Housing Center since 2016 and explains with stoic composure how all the technology works.
In the building right next to the Power Supply Center, there are two data centers that operate largely independently of each other: sections A and B. Each room has its own cooling and power supply. If one of the two rooms fails due to a malfunction or maintenance work, many important university IT services, such as email or administrative IT, can continue to operate without users noticing the disruption.
In 2014, sections A and B each launched with two aisles and 48 racks. Today, the rooms are fully built out, with eight aisles in each room and a total of 96 racks. Racks are the cabinets in which the servers are installed, and each institute and department has a very different requirement profile for them.
Glacier research, for instance, requires extremely large amounts of storage space, while the calculation of AI models usually relies on several high-end graphics cards. Within a rack sit small slide-in units that Kevin Dehmlow affectionately calls pizza boxes. In these, the individual IT components are wired and connected to the university network. A rack can hold a total of 47 such pizza boxes.
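Using only the figures quoted in the article (96 racks per fully built-out room, 47 pizza-box slots per rack), a quick back-of-the-envelope sketch gives the theoretical slot capacity; actual occupancy will of course differ.

```python
# Capacity figures taken from the article; real-world occupancy differs.
RACKS_PER_ROOM = 96   # fully built-out sections A and B
SLOTS_PER_RACK = 47   # "pizza box" slide-in units per rack

def total_slots(rooms: int = 2) -> int:
    """Maximum number of slide-in units across the given number of rooms."""
    return rooms * RACKS_PER_ROOM * SLOTS_PER_RACK

print(total_slots(1))  # one room:   4512 slots
print(total_slots())   # both rooms: 9024 slots
```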
The Cooling System Explained
Operating this technology generates a great deal of heat, so a sophisticated cooling system is in place. The cabinets housing the servers stand in cold aisles, into which refrigeration machines blow cooling air. The computers draw in the cold-aisle air, which is about 21 degrees Celsius, and blow it back, heated, into the warm aisle behind them, where a dry, desert-like climate prevails as a result; temperatures there can reach up to 32 degrees.
In the cold seasons, cooling is provided by the outside air: a large cooling system is installed on the roof of the building. Only in the summer months does the building have to rely on an absorption refrigeration system in the Power Supply Center, which generates cooling from waste heat from the city’s waste incineration plant. Overall, cooling, which accounts for a considerable proportion of data-center energy consumption, is thus very cost- and CO2-efficient.
With the creation of the Housing Center, the University of Bremen's information technology has been brought together in one place to save money and energy. The advantage of centralization is that there is no need to install a separate cooling system and a costly power supply in each building, which cuts both acquisition and operating costs considerably.
Things rarely go wrong in the Green-IT Housing Center, the IT specialist assures us. Occasionally, though, the cooling system has stopped working without warning. In such cases, the system's green lights turn yellow as temperatures rise rapidly, and so does the stress level of the center's employees. But then all it takes is a call to the Power Supply Center, and the problem is quickly fixed. As a normal user of the University of Bremen's web services, you hardly ever notice any of this.
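A minimal, purely hypothetical sketch of how such a traffic-light status could be derived from a temperature reading. The thresholds are illustrative assumptions loosely based on the figures above (cold aisle around 21 °C, warm aisle up to 32 °C), not the center's actual monitoring configuration.

```python
# Hypothetical traffic-light status check; thresholds are illustrative
# assumptions, not the Housing Center's real configuration.
def aisle_status(temp_c: float, warn_above: float = 32.0,
                 critical_above: float = 38.0) -> str:
    """Map a warm-aisle temperature reading to a status colour."""
    if temp_c > critical_above:
        return "red"
    if temp_c > warn_above:
        return "yellow"
    return "green"

print(aisle_status(28.5))  # normal operation -> green
print(aisle_status(34.0))  # cooling fault   -> yellow
```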
The Green-IT Housing Center is available to all institutions of the university free of charge. For more information, visit: www.uni-bremen.de/zfn/weitere-it-dienste/serverhousing-webhosting.