One piece of (not always popular) advice I give industrial internet of things customers is to resist being swept up in the hype-wave of edge computing. Research firm IDC has forecast that in 2019, 40% of the data created by the internet of things (IoT) will be stored, processed, analyzed and acted on "close to or at the edge." I recommend caution.
Edge, like many initially overestimated developments before it, is not a digital transformation panacea. Used properly, edge has its place: enabling optimization when an organization knows exactly what specific problem needs to be addressed in a specific place. Used indiscriminately, edge exposes industrial internet of things (IIoT) customers to what I call "premature optimization": micro-solutions that rob enterprises of the macro, business-critical information the industrial internet of things is designed to deliver.
What’s more, as Operational Technology (OT) and Information Technology (IT) converge in Industry 4.0 — and sometimes tussle with one another inside organizations — companies can be tempted to spend unnecessarily on OT edge infrastructure that already exists right next door in IT.
What, above all, is the point of OT/IT convergence and the industrial internet of things? All that effort, innovation and expense is aimed at one thing: bringing the physical and digital worlds together to make smarter, optimized business decisions. Automation, AI, machine learning, sensors, analytics, cloud, blockchain, 5G (buzzword upon buzzword) and, yes, edge technologies are interconnected to make things faster, more valuable, more customized and higher quality, at lower cost and higher profit. What the IIoT delivers is better business answers.
Now consider edge, which simply means solving problems (applying analytics and algorithms) where they occur: onsite, at the local level, right on the factory floor. Why go through the expense of sending data to the cloud for better answers when those answers can be found right there in the factory? That reasoning holds in one case: if the problem is uniquely understood and isolated to that factory, machine or process, then processing data on the edge can make a lot of sense.
But what if organizations haven’t identified precisely what problems they’re trying to solve because most companies are early in their digitalization journey and, let’s face it, there are a lot of problems to solve? More importantly, what if the same problems are hindering a company’s — or an industry’s — processes on a global basis?
Solving the problem locally on the edge, on the micro level, is premature optimization, in which organizations rob themselves without knowing it. By keeping the solution local, edge deprives the entire ecosystem of the benefits of network-effect solutions. Better to send streams of important data to the cloud, where mighty layers of analytics, AI and related technologies with a global view can deliver macro-level, universal answers and much broader, more valuable optimization.
Especially in these early days of digital transformation, why self-limit with edge and run the risk of throwing potentially crucial data away? With edge, an organization may solve one local problem but fail to send important data to the cloud, which is vastly better equipped to use that data to solve multiple bigger problems across the network.
Use the IT data center as a private cloud to connect OT edge to the public cloud.
Sure, you say, it would be great if we could send all the data our connected machines generate to the cloud for a global view, universal answers and solid-gold network-effect optimization. But that's simply not feasible: data volumes are exploding as sensors proliferate and broadcast at ever-shorter intervals. And do we really want to start building OT infrastructure to connect to the cloud? Who's got the money for that?
I think it is eminently feasible. Here are my answers to these problems:
1. Data can be compressed in the IT data center before being sent to the cloud, reducing the data stream by as much as a third without loss of data quality and saving the expense of extra storage. There are also existing and emerging solutions in the data center designed to accommodate the increasing flow of information generated by more sensors and the arrival of 5G.
2. It may not be prudent to build OT data centers at the edge of the network. Enterprise IT data centers already exist and are being affordably configured to handle the tasks required (including running IIoT analytics and algorithms remotely). From what I've seen working with customers worldwide, the budget and skill sets required to do that can be found more readily in IT. OT teams have their own optimization challenges to grapple with; there's no need to load unnecessary IT responsibilities onto their full plates. And given the average age of OT equipment (10 years or more) and its limited CPU power, there is little case for building new OT infrastructure to connect the edge to the cloud.
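To make the first point concrete, here is a minimal sketch, in Python, of lossless compression applied to a batch of entirely hypothetical sensor telemetry before it leaves the IT data center. The round trip shows no loss of data quality, and on repetitive telemetry a deflate-style codec often shrinks the stream well beyond the one-third figure cited above:

```python
import json
import random
import zlib

# Hypothetical batch of sensor telemetry, standing in for the data an
# IT data center would receive from the OT edge before forwarding it on.
random.seed(42)
readings = [
    {"sensor_id": f"press-{i % 8}",
     "temp_c": round(20 + random.random() * 5, 2),
     "vibration_mm_s": round(random.random() * 2, 3)}
    for i in range(1000)
]

raw = json.dumps(readings).encode("utf-8")

# Lossless compression: zlib here, but any deflate/gzip-style codec works.
compressed = zlib.compress(raw, level=9)

print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")

# No loss of data quality: the round trip restores the exact payload.
assert zlib.decompress(compressed) == raw
```

The repeated field names and similar values typical of machine telemetry are exactly what dictionary-based codecs exploit, which is why compressing in the data center before the uplink is cheap insurance against storage and bandwidth costs.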
Sending as much problem-solving data as possible to the cloud, rather than isolating it at the edge, is not only feasible, it's strategic. IT data centers that have already been built, and are in the process of upgrading for the challenges of Industry 4.0, are ideally suited to provide the compute power and storage needed for the middle step: a private cloud between the edge and the public cloud.
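The three-tier flow described above can be sketched as a toy pipeline (all function, site and field names here are hypothetical): edge sites forward everything they produce, the IT data center acts as the private-cloud middle step consolidating the streams, and the public cloud computes the macro, fleet-wide answer no single site could see on its own:

```python
from statistics import mean

def edge_emit(site: str, temps_c: list[float]) -> list[dict]:
    """Readings produced on one factory floor; nothing is discarded locally."""
    return [{"site": site, "temp_c": t} for t in temps_c]

def private_cloud_forward(batches: list[list[dict]]) -> list[dict]:
    """IT data center: consolidate streams from many sites into one uplink."""
    return [reading for batch in batches for reading in batch]

def public_cloud_analyze(stream: list[dict]) -> dict:
    """Public cloud: a network-effect answer across the whole fleet."""
    return {
        "fleet_mean_temp_c": round(mean(r["temp_c"] for r in stream), 2),
        "sites_seen": sorted({r["site"] for r in stream}),
    }

stream = private_cloud_forward([
    edge_emit("plant-a", [21.0, 22.0]),
    edge_emit("plant-b", [25.0, 24.0]),
])
answer = public_cloud_analyze(stream)
print(answer)
# → {'fleet_mean_temp_c': 23.0, 'sites_seen': ['plant-a', 'plant-b']}
```

The point of the sketch is the shape, not the arithmetic: had either plant solved its temperature question purely at the edge, the fleet-wide view computed in the last step would never exist.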
The public cloud is best equipped to provide the business optimization answers digitalization promises. Accordingly, relying too heavily on edge this early in the game could be detrimental to your business: the local answers edge develops and isolates could have been the universal, network-wide optimizations needed all over the world, yet they are never discovered. Avoid edge's premature optimization and use IT to achieve full optimization.