Edge-computing benefits for AI crystallizing – Urgent Communications

Interest in edge computing continues to build, as does confusion surrounding the architecture. The situation is similar with artificial intelligence, and the prospect of moving AI to the edge might sound like a recipe for even more confusion.

Performing artificial intelligence at the edge is often “just theory quoted in articles,” said Martin Davis, managing partner at DUNELM Associates.

Still, the concept of edge AI is increasingly hard for industrial and enterprise organizations to ignore. Resource-intensive operations such as deep learning and computer vision have traditionally taken place in centralized computing environments. But the growing availability of high-performance networking and computing hardware opens up the possibility of shifting that activity from a “centralized cloud architecture to the [edge],” as consultant Chaitan Sharma wrote. “It won’t happen overnight, but it is inevitable.” Gartner predicts that three-fourths of enterprise data will be processed at the edge by 2025, while Grand View Research expects the edge computing market to expand at an annual rate of 54% through 2025.

At the Edge of Industry

The question of where exactly edge computing takes place is not always clear. The Open Glossary of Edge Computing defines the architecture as the “delivery of computing capabilities to the logical extremes of a network.” Located outside of traditional data centers and the cloud, the edge is concentrated at the “last mile” of the network, as close as possible to the things and people producing data or information.


Given the difficulty of using cloud computing in environments such as factories or mines, the industrial sector is a good candidate for edge computing architecture. A factory, for instance, might require very high network reliability (99.9999% uptime) and millisecond-level latency, and it might place constraints on sending data off-premises. Given such limitations, most factories have traditionally deployed physical cabling and proprietary wired protocols from industrial vendors. The result is a “fragmented technology environment,” which technologies like edge computing could help unify, according to the Ovum Market Radar: CSPs’ Industrial IoT Strategies and Propositions.

An edge computing architecture that operates without the cloud is not to be confused with local compute scenarios in which all data is processed on individual devices. While such on-board computing can support critical decision-making in real time, the device hardware is costly, according to Harald Remmert, senior director of research and innovation at Digi International. Additionally, the ability of such local compute configurations to support operations such as machine learning is often limited.

Conversely, an AI-enabled edge computing system in a factory could contextualize data from multiple machines to detect and ultimately predict problems that cause downtime. “Performing machine learning inference at the edge is an enabler for the scale of applications, even when low latencies are not required,” concluded Gal Ben-Haim, head of architecture at Augury, a company creating machine learning technology for the process industry.
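To make the idea concrete, the kind of edge-side analysis described above can be sketched in a few lines. This is a minimal, hypothetical illustration — the machine names, sensor values, and z-score rule are assumptions for the example, not Augury's actual method — showing how a gateway aggregating readings from several machines might flag one whose latest measurement deviates sharply from its own historical baseline:

```python
# Hypothetical sketch: an edge gateway flags machines whose newest sensor
# reading deviates sharply from that machine's own historical baseline.
# Machine names, values, and the z-score threshold are illustrative only.
from statistics import mean, stdev

def flag_downtime_risks(history, threshold=3.0):
    """Return machines whose latest reading is more than `threshold`
    standard deviations from the mean of their earlier readings."""
    flagged = []
    for machine, readings in history.items():
        baseline, latest = readings[:-1], readings[-1]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(latest - mu) / sigma > threshold:
            flagged.append(machine)
    return flagged

# Example: press_1's vibration reading spikes while press_2 stays steady.
readings = {
    "press_1": [1.0, 1.1, 0.9, 1.0, 5.0],
    "press_2": [1.0, 1.0, 1.1, 0.9, 1.0],
}
print(flag_downtime_risks(readings))  # -> ['press_1']
```

A production system would use trained models rather than a simple z-score, but the structure is the same: data from many machines is contextualized in one place at the edge, close to where it is produced, without a round trip to the cloud.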

That doesn’t mean deploying machine learning at the edge is necessarily easy, however. It “requires more mature machine learning models and new ways to manage their deployments,” Ben-Haim said.

To read the full version of this article, visit IoT World Today.

This UrIoTNews article is syndicated from Google News