Maintaining Low-Latency and Sustainability at the Network Edge – Data Center Frontier

There are several ways for companies expanding their data center portfolio at the network edge to comply with sustainability standards while maintaining operational efficiency at a low cost. (Source: Schneider Electric)

In this edition of Voices of the Industry, Steven Carlini and Andres Vasquez of Schneider Electric explore building data centers at the network edge, while simultaneously meeting sustainability goals.


Steven Carlini, Vice President of Innovation and Data Center Energy Management Business Unit, Schneider Electric


Andres Vasquez, Telco Segment Director, Cloud & Service Providers, Schneider Electric

Today’s applications require massive amounts of data and ultra-low latency. From streaming to self-driving cars, dozens of sectors need new resources, and the data center industry sits squarely in the path of that demand: local edge data centers must be scaled out into a variety of unfamiliar settings. This expansion into new environments could jeopardize long-term sustainability goals as unique challenges in power and cooling, energy efficiency, management, and maintenance spring up. Companies will need to make sustainability a top priority as they build out new data centers, equipping them with the necessary resources without relying on energy-intensive equipment that puts the environment at risk. Fortunately, there are several ways for companies expanding their data center portfolio at the network edge to comply with sustainability standards while maintaining operational efficiency at a low cost.

Data centers at the network edge

The ‘network edge’ is a phrase that has been used in many ways over the years but can be defined as “a location where a local edge data center interfaces with the Internet to support data-intensive and ultra-low latency applications.” The network edge is a vital element in meeting the performance expectations of contemporary applications. Building data centers at the network edge reduces the latency between central core cloud or regional edge data centers and edge devices. Moving forward, service providers will be expected to deploy three types of distributed network edge data centers to support the cloudification and convergence of telco cloud and IT cloud architectures:

  • Legacy telco data centers house the functions and controls of traditional telecom network architecture, such as base stations.
  • Distributed cloud data centers serve as an extension of the IT cloud, owned and operated by colocation providers or hyperscalers.
  • Multi-access edge computing (MEC)/edge data centers are owned by service providers and operate both IT cloud services and telco functions and controls at the network edge.

While all three types of data centers will continue to grow, legacy telco data centers and distributed cloud data centers will ultimately give way to MEC/edge data centers over the next 10 years as service providers work to support current and future edge applications. However, the complicated nature of this transition will lead to unique challenges that come with deploying at the edge of the network.

Edge data center scaling challenges

To keep up with the evolving needs of modern applications, it is imperative that companies make the transition towards deploying distributed network edge data centers at scale. However, there are key challenges that companies may face when making this change. Some of these issues have to do with existing infrastructure and resources: diverse equipment making it difficult to quickly execute widespread deployments, limited space to build or expand, harsh outdoor environments that increase the risk of damage to IT equipment, a lack of on-site staff for operations and maintenance, and the vulnerability of physical infrastructure equipment to cyberattacks. Aside from these challenges, there is also the inherently complicated nature of measuring, minimizing, and reporting the environmental impact of scaling out distributed network edge data centers. Addressing these roadblocks will clear the way for companies to move to the network edge and deliver the high performance that edge applications require.

Green data center deployment: Best practices

To transform the way computing and storage resources are handled and sidestep issues with scaling, companies can implement a multitude of best practices that will set them up for success. These will not only help with unique challenges but will prevent downtime, improve the overall reliability of the data center, and reduce energy usage:

  1. Using monitorable power gear with a lithium-ion battery backup: Uninterruptible power supplies (UPSs) give data center operators the ability to monitor their power while delivering consistent, conditioned power. Pairing a UPS with lithium-ion batteries, which last longer and are better for the environment than traditional lead-acid alternatives, is helpful for power security and reducing energy usage.
  2. Choosing air or liquid cooling depending on density, environment, and space constraints: The correct cooling system will protect equipment and minimize downtime. Using a ruggedized IT enclosure can be useful in harsh environments where facility air conditioning is not an option.
  3. Selecting effective IT enclosures designed for distributed network edge data centers: IT enclosures built for addressing environmental risks, space constraints, and physical security can keep many issues at bay.
  4. Deploying sustainable prefabricated modular solutions: These are pre-engineered data centers with preassembled systems tested in a factory environment. These solutions significantly shorten deployment timelines and improve the predictability of performance and cost, making it much easier for companies to expand their data center portfolio while cutting down on waste.
  5. Implementing a proper cybersecurity strategy: Creating a plan that incorporates maintenance, device selection criteria, secure network design, and effective device configuration will prevent cyberattacks that can lead to downtime.
  6. Making an investment in monitoring and management software: Remote or centralized monitoring of IT equipment is an important practice. Data center infrastructure management (DCIM) software can simplify this task, providing visibility that will lead to higher efficiency and energy savings.
  7. Interacting with an ecosystem of partners: Dealing with all the challenges that come with scaling at the network edge can become less complicated when companies rely on partnerships that offer key advantages.
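One concrete way DCIM-style monitoring translates into energy savings is by tracking efficiency metrics such as Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power. As a minimal sketch (the site readings below are hypothetical, not from any particular deployment):

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# A value close to 1.0 means nearly all power delivered to the site reaches
# the IT load; the remainder goes to cooling, power conversion losses, etc.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Compute PUE from instantaneous power readings in kilowatts."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical small edge site: 52 kW total draw, 40 kW of it for IT load.
print(round(pue(52.0, 40.0), 2))  # 1.3
```

Trending this figure per site across a distributed edge fleet makes it clear which locations are wasting energy on overhead and where cooling or power-train upgrades would pay off.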

There is no shortage of advice when it comes to handling the complexities involved with building data centers at the network edge, but with the proper guidance and tools, companies can upgrade their data center performance while simultaneously meeting sustainability goals. Over the next decade, distributed local edge data centers at the network edge will multiply to power the cloudification and convergence of telco cloud and IT cloud. The challenges that stem from remote management, power and cooling, environmental impact, and cybersecurity can be circumvented using best practices that minimize greenhouse gas emissions and help maintain high performance standards.

Steven Carlini is Vice President of Innovation and Data Center Energy Management Business Unit and Andres Vasquez is Telco Segment Director, Cloud & Service Providers, both of Schneider Electric.  

This UrIoTNews article is syndicated from Google News.