Machine Learning and Artificial Intelligence Begin to Shape Data Center Infrastructure: IHS Markit


Artificial intelligence (AI) and machine learning (ML) are touching many aspects of the data center, bringing efficiency gains, increased reliability and automation to data center physical infrastructure, according to new research from IHS Markit. On top of that, these technologies will allow for significantly improved remote management of distributed data center footprints.

Simply defined, AI is applying traits of human intelligence to computers. ML is a subset of AI, in which inputs are mapped to outputs to derive meaningful patterns. AI is reaching far and wide, from manufacturing floors to supply chain management, and even to the operation of data centers themselves. Many enterprises are still planning and piloting different AI applications to understand how the technology can transform their businesses, but data centers are already providing an early use case of a successful AI application.

However, while AI and ML will revolutionize how data centers are operated, and could even enable fully autonomous facilities, a human presence will remain critical to data center operations over the next decade.

“Artificial intelligence and machine learning are touching many aspects of the data center,” says Devan Adams, senior analyst on the IHS Markit Cloud and Data Center Research Practice. “They are bringing efficiency gains, increased reliability and automation to data center physical infrastructure. On top of that, it will allow for significantly improved remote management of distributed data center footprints.”

Google Case Study

Starting in 2015, Google began applying ML in its data centers to help them operate more efficiently. “Google’s DeepMind researchers and its DC team began by taking historical data collected by thousands of sensors within its DCs -- including temperature, power, water pump speeds, and set-points,” Adams said.

Google then analyzed the data within deep neural networks to focus on lowering Power Usage Effectiveness (PUE), the ratio of total building energy usage to IT energy usage. “In the end, they achieved up to 40 percent energy reduction for cooling and 15 percent reduction in overall PUE, the lowest the site had ever experienced,” Adams said. “Google plans to roll out its system and share more details about it in an official publication so other DC and industrial system operators can benefit from its results.”
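The arithmetic behind these figures is straightforward. The sketch below walks through a PUE calculation with purely hypothetical load numbers (not Google's actual data), showing how a 40 percent cut in cooling energy flows through to a lower facility-wide PUE:

```python
# PUE = total facility energy / IT equipment energy.
# Hypothetical facility: 1,000 kW of IT load, 250 kW of cooling,
# 100 kW of other overhead (lighting, distribution losses, etc.).
it_load_kw = 1000.0
cooling_kw = 250.0
other_overhead_kw = 100.0

pue_before = (it_load_kw + cooling_kw + other_overhead_kw) / it_load_kw
print(f"PUE before: {pue_before:.2f}")  # 1.35

# Apply a 40% reduction to cooling energy, as in Google's reported result.
cooling_after_kw = cooling_kw * (1 - 0.40)
pue_after = (it_load_kw + cooling_after_kw + other_overhead_kw) / it_load_kw
print(f"PUE after:  {pue_after:.2f}")   # 1.25
```

Because IT load sits in both the numerator and the denominator, a PUE of 1.0 would mean every watt entering the building reaches IT equipment; the closer a site gets to 1.0, the less it spends on overhead like cooling.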

This Google use case is a unique and relatively mature example of ML in data center operations. Google’s existing cloud infrastructure, access to large amounts of data and significant in-house expertise allowed Google to become an early adopter of ML.

For enterprises and co-location data center operators who lack those advantages, deploying ML in their data centers may seem like a daunting task, with significant cost and knowledge barriers to overcome. However, data center infrastructure suppliers are stepping in to bring ML-integrated cooling, power, and remote management capabilities to these data centers.

Data Center Cooling

Cooling has become the primary starting point for applying ML to data center infrastructure, because cooling consumes around 25 percent of the power a data center uses. Improving cooling efficiency therefore translates into serious savings, but this isn’t an easy task: data centers are dynamic environments, with changing IT loads, fluctuating internal and external temperatures, variable fan and pump speeds, and differing sensor locations.

Power distribution and backup power (UPS) systems have limited AI in them today. Their firmware can make basic decisions based on a sensor’s input and pre-programmed, desired outputs, but they are not designed to learn from changing inputs and outputs the way ML-integrated cooling systems now are. For UPSs, integrating ML has a different end goal than it does for cooling: ML-integrated UPSs will focus on preventing downtime by predicting failures and enabling preventative maintenance, either self-performed or by alerting engineers to a specific problem.
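The fixed, pre-programmed decision logic described above can be sketched in a few lines. Everything here is hypothetical (the function, thresholds, and values are illustrative, not any vendor's firmware); the point is that the mapping from sensor input to action is hard-coded and never changes based on observed data, which is exactly what distinguishes it from an ML-driven system:

```python
# Hypothetical, hard-coded thresholds of the kind a UPS firmware
# might ship with. They are set once and never learned or adjusted.
BATTERY_TEMP_WARN_C = 40.0   # raise an alert above this reading
BATTERY_TEMP_TRIP_C = 55.0   # shut down above this reading

def ups_battery_action(temp_c: float) -> str:
    """Map a battery temperature reading to a pre-programmed action."""
    if temp_c >= BATTERY_TEMP_TRIP_C:
        return "shutdown"
    if temp_c >= BATTERY_TEMP_WARN_C:
        return "alert"
    return "ok"

print(ups_battery_action(25.0))  # ok
print(ups_battery_action(45.0))  # alert
print(ups_battery_action(60.0))  # shutdown
```

An ML-integrated UPS, by contrast, would look at trends across historical sensor data to predict a failure before any fixed threshold is crossed, rather than reacting only when a set-point is exceeded.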

Lithium-ion batteries continue to see growing adoption in data centers. As prices for lithium-ion batteries decrease and concerns over safety issues ease, the argument for continuing to use VRLA (valve-regulated lead-acid) batteries is becoming more difficult to make. As lithium-ion adoption grows in data center applications, new possibilities open up: the chemistry allows for more frequent charge cycles, healthy operation below full charge, no coup de fouet effect and no need for a controlled battery-room environment.

(For more information, please visit https://ihsmarkit.com).