Artificial Intelligence has become such an accepted part of our vernacular that these days it's known simply by its initials. That is proof positive of AI's promise for impacting myriad facets of our lives, from autonomous vehicles to smart assistants. Datacenters are being influenced as well, as AI-fueled transformations promise big changes (read: improvements) in how existing and future facilities are designed and function. Joel Davis, SVP of Sales and Marketing at Virtual Power Systems, recently published the following blog post to help designers understand the complex impact that artificial intelligence (AI) is having on datacenters:
But before getting too amped on AI, heed the words of Rhonda Ascierto, research VP at Uptime Institute, who wrote in a recent article that "AI strategy for data center management and operation requires more than just data and some very smart humans."
Datacenter designers, operators and owners should think of ways to utilize AI from the start instead of as an afterthought or add-on. Capturing operational data is one thing, but Ascierto asserts that to harvest a bumper crop from AI requires first identifying and then focusing on specific use cases and fully understanding the data that is poised to influence AI outcomes.
While you don't need to become an AI expert, Uptime Institute suggests datacenter leaders gain at least a fundamental understanding of the depth and breadth of the AI being utilized. This builds the necessary foundation for determining what type and how much data is required, and how best to leverage AI to produce actionable business insights. Without this basic understanding, it's nearly impossible to validate AI's results or gain the confidence needed to drive effective business decisions.
To gain tangible “today” insights into AI’s promise, look to Google, which already is using AI to optimize datacenter efficiency. Most notably, AI is helping regulate datacenter cooling systems in real time on auto-pilot mode.
Also expect AI to have increasing impact on power in the datacenter. It’s no secret that virtually every facility could benefit from reducing power consumption, halting the standard practice of overprovisioning and boosting overall efficiency. The introduction of Software Defined Power (SDP) helps bring the benefits of AI and machine learning into focus.
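To make the SDP idea concrete, here is a minimal sketch of what "power as a software-managed resource" can mean: a facility power budget distributed across racks by priority, with lower-priority racks capped first when demand exceeds supply. The `Rack` class, the field names, and the `allocate_power` function are all hypothetical illustrations, not VPS's actual API.

```python
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    priority: int      # higher = more critical workload
    demand_kw: float   # current requested power draw

def allocate_power(racks, budget_kw):
    """Distribute a facility power budget across racks by priority.
    Higher-priority racks are granted their full demand first;
    whatever remains is handed down the line, so lower-priority
    racks absorb the shortfall when total demand exceeds supply."""
    caps = {}
    remaining = budget_kw
    for rack in sorted(racks, key=lambda r: -r.priority):
        grant = min(rack.demand_kw, remaining)
        caps[rack.name] = grant
        remaining -= grant
    return caps
```

A real software-defined power layer would enforce such caps through smart PDUs and server-level power limits, but the policy decision itself is just software, which is the point.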
While AI helps recognize, define and solve operational problems as well as identify areas for potential improvements, machine learning kicks in by taking that knowledge and applying it to future scenarios to ensure smoother outcomes down the road. For example, VPS’ Intelligent Control of Energy (ICE) continuously analyzes power consumption, managing and orchestrating all smart devices.
ICE’s predictive analytics then steadily optimizes power availability and utilization for improved operations and cost savings. Since almost 50% of all datacenters are constrained by power density at the rack and typically 60% of datacenters’ power capacity is not utilized, there is plenty of room for improvement when it comes to workload orchestration.
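The back-of-envelope math behind those statistics is worth spelling out: if a facility provisions 1 MW but typically draws only 40% of it, 600 kW sits stranded, and power-aware orchestration can place new workloads into that headroom. The sketch below illustrates the idea with two hypothetical helpers (the function and parameter names are ours, not part of ICE).

```python
def stranded_capacity_kw(provisioned_kw, utilization):
    """Capacity provisioned but not drawn -- the 'stranded' power
    that orchestration can reclaim for additional workloads."""
    return provisioned_kw * (1.0 - utilization)

def place_workload(rack_headroom_kw, workload_kw):
    """Pick the rack with the most remaining power headroom that can
    still fit the workload; return None if no rack has room."""
    candidates = {rack: headroom
                  for rack, headroom in rack_headroom_kw.items()
                  if headroom >= workload_kw}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)
```

With the 60% figure cited above, `stranded_capacity_kw(1000, 0.4)` yields 600 kW of reclaimable headroom in a 1 MW facility.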
Over time, ICE's machine learning capabilities make possible heightened levels of power awareness. Releasing stranded capacity permits extracting additional power from existing infrastructure, a welcome benefit in enterprise and colocation facilities alike, and this is accomplished without negative SLA impact.
Other promising AI applications for datacenters include:
- predicting and reducing failures, outages and security threats;
- gaining insight into asset patterns;
- maintaining proper redundancy levels; and
- improving forecasting for resources and budget planning.
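The failure-prediction item on the list above often starts with something far simpler than deep learning: flagging telemetry readings that deviate sharply from recent history. The rolling z-score sketch below is a generic illustration of that idea, not any vendor's actual model; the function name and thresholds are assumptions.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag indices whose reading deviates more than `threshold`
    standard deviations from the trailing window's mean -- a crude
    stand-in for the failure-prediction models described above."""
    flags = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags
```

A power spike or temperature excursion shows up as a flagged index, giving operators lead time before a fault trips a breaker or an SLA.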
By combining and harnessing the powers of AI and machine learning with innovations like SDP and next-generation improvements in automation and control, the fully realized vision of the Software Defined Data Center comes even closer to reality.