Enterprises are producing ever larger amounts of data, and as they deploy more IoT, edge, and 5G devices, the volume is only likely to increase.
This puts more pressure on storage capacity, which can become an inhibiting factor in digital transformation. We spoke to Dr Tong Zhang, co-founder and chief scientist at ScaleFlux, to learn more about the importance of storage to enterprise IT plans and infrastructure.
BN: What is fueling the present data growth in enterprise IT?
TZ: Data growth is exploding. Data creation is expected to reach a staggering 180 zettabytes by 2025, which is 118.8 zettabytes more than in 2020. As the volume of data we produce increases exponentially, so do the demands on data storage. The race to gain a market advantage is driving companies to deploy more devices across more data centers and edge locations, and this is exacerbated by distributed technologies like 5G, which generate enormous amounts of data that must be processed and stored.
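To put those figures in perspective, the interview's numbers imply roughly 61.2 zettabytes created in 2020 (180 minus 118.8) rising to 180 zettabytes in 2025. A quick back-of-the-envelope calculation (the figures come from the interview; the growth rate is derived, not quoted) shows what compound annual growth rate that implies:

```python
# Implied compound annual growth rate (CAGR) from the figures above.
# 180 ZB projected for 2025; 180 - 118.8 = 61.2 ZB created in 2020.
zb_2020 = 180.0 - 118.8   # zettabytes created in 2020
zb_2025 = 180.0           # projected zettabytes created in 2025
years = 2025 - 2020

cagr = (zb_2025 / zb_2020) ** (1 / years) - 1
print(f"Implied annual data growth: {cagr:.1%}")  # roughly 24% per year
```

In other words, sustaining the projection means data creation growing by nearly a quarter every year, which is the pressure on storage the rest of the interview describes.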
BN: How do the cloud and edge computing affect this growth?
TZ: This increase is being fueled by several factors. First, the number of base stations (the fixed transceivers that serve as the main communication point for mobile technologies like 5G, and eventually 6G) is expected to boom as those technologies become more popular. Additionally, as the number of self-driving cars increases, so does the need for local storage and processing power in the cars themselves. As of mid-2021, there were over 1,400 self-driving cars in the US; however, one report projects that there could be as many as 33 million autonomous vehicles (AVs) on the road by 2040. Other technologies like augmented reality (AR) and virtual reality (VR), plus the digitization of manufacturing (what's known as Industry 4.0), are further driving this trend toward ever more data.
Needless to say, it will be impossible to store all of this data in the cloud. As the demand to access and process data anytime, anywhere grows, backbone network bandwidth and the processing capacity of centralized data centers become severe bottlenecks.
BN: Edge computing has really caught on in certain industries, such as telecoms. What does this mean for 5G/6G wireless systems?
TZ: According to a report by Allied Market Research, the 5G base station market is expected to exceed $190 billion by 2030. Keeping up with this massive growth will require a completely different IT infrastructure paradigm, one that moves computation and data storage much closer to data sources and end users than today's centralized data centers do. The answer is edge computing.
Edge computing offers a significant reduction in data traffic over the backbone network and much better quality of service (QoS); both are natural effects of moving computation and storage closer to data sources and end users. Some industries have already caught on and are beginning to reap these benefits. For example, telecom companies have started to deploy edge computing infrastructure in order to steadily improve QoS for their customers as 5G/6G wireless communication systems grow. And rightly so: edge computing is essentially the only option for telecom companies to reduce data round-trip latency. This will also become a critical imperative for other emerging applications, such as AVs.
BN: As a relatively new tech vector, the cost of edge computing might be seen as an inhibitor to its adoption. How can organizations balance its costs with the desired results?
TZ: The success of edge computing largely depends on its cost effectiveness. As semiconductor technology scaling reaches its limits, heterogeneous and domain-specific computing are becoming critical to maintaining the cost effectiveness of IT infrastructure. Another key component of future heterogeneous computing paradigms is computational storage, which will serve as an essential building block for cost-effective edge computing infrastructure. Edge computing and the cloud will complement each other to form the foundation of future pervasive IT infrastructure: in the coming years, we can expect continued growth of the cloud in the form of centralized data centers, alongside the rapid expansion of edge computing.
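One commonly cited computational-storage function is offloading work such as data compression from the host CPU into the drive itself, so less data crosses the bus and lands on flash. The sketch below is purely illustrative of that host-visible effect (the function is hypothetical, not ScaleFlux's actual API; real computational storage drives perform this transparently in hardware):

```python
import zlib

def store_compressed(block: bytes) -> bytes:
    """Illustrative stand-in for compression a computational storage
    drive would perform transparently before data reaches flash."""
    return zlib.compress(block, level=6)

# Repetitive edge data (logs, sensor telemetry) compresses very well,
# which is what makes offloaded compression attractive for edge cost.
raw = b"sensor_reading=23.5C\n" * 1000
stored = store_compressed(raw)
print(f"{len(raw)} B written as {len(stored)} B on media")
```

The point of doing this inside the drive rather than on the host is exactly the cost argument above: the host spends no CPU cycles on compression, yet the effective capacity (and thus cost per stored byte) improves.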
BN: What are the future implications of a successful edge computing infrastructure?
TZ: The implications of edge computing are far-reaching and will be felt by businesses and consumers alike. The pervasiveness and performance of edge computing will directly determine the QoS experienced by everyday consumers, whether that's video upload/download speed or query response time. The backbone network and centralized data centers are no longer sufficient to meet the new demands of data accessibility and processing. By adopting edge computing, enterprises and end users can expect possibilities like never before.
BN: Where does data storage come into play, both locally and at the edge?
TZ: Of course, as data is collected it must be stored, but the attributes of that storage quickly become important considerations. The operating envelopes at the edge, where one might find a mobile 5G station, are very different from what is available in a data center or large cloud provider. Power, heat, density, resiliency, and serviceability all vary across distributed infrastructure. You are also faced with the problem that, as you collect data, it can be very difficult to determine its value until some level of processing has been done to separate the high-value data from the garbage. Until you have established that value, all data must be treated as highly important, which greatly increases the total cost of storage while also impacting the efficiency of the entire infrastructure.