Recent News

On the edge of successful OT and IT convergence – Control Engineering Website

06 June 2022

Jash Bansidhar discusses the role of edge computing in helping ensure the success of OT and IT convergence.

Historically, operational technology (OT) and information technology (IT) have fulfilled very separate roles at manufacturing facilities. Today these two entities are coming together due to the inherent efficiencies and competitive gain this strategy provides. 

The challenge is to ensure that convergence takes place in an optimised, cost-effective way, without major disruption to ongoing operations. This is where edge computing plays a key role, especially with the emergence of artificial intelligence (AI) and machine learning (ML).

By taking advantage of AI software algorithms, it is possible to process OT information and make experience-based decisions without human intervention. Historically, AI software has needed high levels of computing power, but the emergence of edge inference platforms – which use the results of AI training algorithms within a much more constrained environment – has enabled AI to extend to the factory floor.

Once a manufacturer has the possibility to accelerate its decisions, it can improve both productivity and efficiency. Moving AI further into the automation processes also helps to address the challenges posed by an ageing workforce that is not being replenished with new talent. Leveraging new technology and algorithms allows manufacturers to do more with the same number of people.

Digital inspector

By way of example, consider the inspection stage of a typical manufacturing process. In the production environment there are people with 30-40 years of manufacturing experience, which they use to spot defects, anomalies and other quality issues. However, as these people retire, without a new generation of inspectors to replace them, how can the manufacturer retain and leverage their knowledge?

AI and ML provide a solution by digitalising that knowledge via a training process similar to that given to a new apprentice inspector. Once trained, the AI model can be deployed to edge inference devices which can then perform the role of defect and quality assessment, feeding back information to the training servers to further tune and optimise the AI models. 

The ‘digital inspector’ provided by the edge inference platform retains all of the aggregated knowledge of experienced operators in perpetuity and applies it to the inspection process, normally with a higher degree of accuracy than manual inspection is able to provide. At the same time, the human inspector’s role is elevated to one of analysing the resulting data and identifying areas for process improvement. 

Using the latest AI and ML techniques, coupled with appropriate training-server and edge inference hardware, the digital inspector can be realised without the need for data scientists or complex programming. Instead, the system learns what constitutes acceptable and non-acceptable samples over a brief training period. The resulting models can be deployed concurrently to multiple sites and production lines, further leveraging the aggregated experience of the inspection teams from multiple locations. Running the system constantly results in ongoing feedback to the training server, refining the edge inference with ever-increasing accuracy and leading ultimately to a near-perfect decision-making process.
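The train-on-labelled-samples, deploy-to-the-edge, feed-back-for-retraining loop described above can be sketched in a few lines. This is a toy illustration, not any vendor's API: a real digital inspector would train a deep vision model on camera images, whereas here a simple nearest-centroid classifier on numeric feature vectors stands in, so the shape of the loop itself stays visible. The `DigitalInspector` class and all its names are assumptions made for this sketch.

```python
# Toy sketch of the digital-inspector loop: train on labelled samples,
# run inference at the edge, collect reviewed samples for retraining.
# Class and method names are illustrative assumptions, not a product API.

from statistics import mean

class DigitalInspector:
    """Classifies a sample as acceptable or defective by comparing its
    squared distance to the centroid of each labelled class."""

    def __init__(self):
        self.centroids = {}   # label -> centroid vector (training-server state)
        self.feedback = []    # samples fed back from the edge for retraining

    def train(self, labelled_samples):
        """Training-server side: aggregate labelled feature vectors
        (e.g. digitalised inspector judgements) into per-class centroids."""
        by_label = {}
        for features, label in labelled_samples:
            by_label.setdefault(label, []).append(features)
        self.centroids = {
            label: [mean(dim) for dim in zip(*rows)]
            for label, rows in by_label.items()
        }

    def infer(self, features):
        """Edge side: nearest-centroid decision, no human in the loop."""
        def sq_dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(self.centroids, key=lambda lbl: sq_dist(self.centroids[lbl]))

    def report(self, features, true_label):
        """Feedback path: the edge returns human-reviewed samples so the
        training server can refine the model over time."""
        self.feedback.append((features, true_label))

inspector = DigitalInspector()
inspector.train([
    ([0.1, 0.2], "acceptable"),
    ([0.2, 0.1], "acceptable"),
    ([0.9, 0.8], "defective"),
])
print(inspector.infer([0.15, 0.15]))  # prints "acceptable"
```

Retraining on `self.feedback` plus the original samples would close the loop the article describes, with accuracy improving as more reviewed samples accumulate.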

So, how can traditional manufacturers take advantage of this technology and become ‘smart’, particularly in Europe, where many production facilities are ‘brownfield’ sites? Often, this is much easier than manufacturers think. Of course, it is necessary to have network connectivity to the inspection location, and cameras are needed to gather the visual inspection data, but these elements are normally already in place if the OT investments are up to date. Thereafter, the edge inference device is normally a relatively simple ‘bolt-on’ addition, providing both image capture and the AI inference algorithms which form the defect detection system. The training server – which constantly refines the model based upon data fed back from the edge – is normally located in the IT server room, but can be installed anywhere with access to the network to which the edge inference devices are connected.
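To make the edge-to-server feedback path concrete, the sketch below packages one inference result as a message an edge device might send back to the training server. The field names, device identifiers and file path are all assumptions for illustration; a real deployment would follow whatever schema its training server defines.

```python
# Hypothetical edge-to-server feedback message for one inspection result.
# Field names and identifiers are illustrative assumptions only.

import json
from datetime import datetime, timezone

def build_inspection_report(device_id, line_id, verdict, confidence, image_ref):
    """Package one edge inference result for return to the training server."""
    return json.dumps({
        "device_id": device_id,      # which edge inference device
        "line_id": line_id,          # which production line
        "verdict": verdict,          # "acceptable" or "defective"
        "confidence": confidence,    # model confidence, 0..1
        "image_ref": image_ref,      # pointer to the captured camera frame
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

payload = build_inspection_report(
    "edge-07", "line-3", "defective", 0.97, "frames/000123.png"
)
```

The training server would log such messages alongside the referenced frames, and periodically retrain and redeploy the model – the refinement cycle the article describes.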

In many cases, the model also uses information acquired from other sources, but the standards-based, open nature of these systems makes it possible to implement layers of IoT technology, combining edge apps, communications protocols and data acquisition capabilities, thereby digitalising brownfield sites and bringing them into the new era of manufacturing.

When Europe first began discussing the Industry 4.0 revolution, the immediate question was how to create a non-proprietary, ‘open’ solution to avoid vendor lock-in. The initial step was to create technical compatibility and integrate it into an open and transparent environment where further innovation could take place. 

Around six years ago, when organisations mooted the idea of IoT implementation in a production environment, the main discussion centred on how manufacturers could connect their devices, machines and equipment. How could they increase their level of analytics compared with previous solutions, which were based on SCADA or MES applications? And how was it possible to enable more applications in the OT field to be sure of an attractive return on investment (ROI)? As time has passed, this has led to a situation where industrial IoT is based increasingly on open PC technology. Whilst this has been new for the OT industries, it is something with which the IT world has been familiar for decades, understanding the value in leveraging common architectures, communication standards, management tools and application frameworks. As developments in open communications such as OPC-UA become more widely adopted, and new technologies such as time-sensitive networking (TSN) and 5G overcome the need for dedicated real-time devices, the case for using open PC-based technologies becomes overwhelming.

Less complexity

In turn, this adoption of open architectures means that today, the concept of creating a technical layer on top of a brownfield manufacturing site is much less complex than it would have been even two or three years ago. This is because there are many stacks that integrate proprietary OT devices and systems into the IoT domain. Communications protocols and physical interfaces have become more standardised, and in contrast to older proprietary OT devices, the open technology platforms used to build systems are becoming increasingly commoditised, so investment becomes easier.

The technical part historically represented the value of the business case, but today companies no longer discuss technical compatibility because it’s already there. Instead, they talk about how to use IoT technology to optimise end-to-end workflows rather than stand-alone processes; how to leverage the intellectual capital and experience of their workforce, using technology to enrich roles and add further organisational value; and, increasingly, how to leverage the information that can be captured and virtualised at the edge, across the whole enterprise.

The key to success is in exploiting the open nature of the latest industrial IoT technologies to unburden both the integrators implementing business applications and those implementing OT applications at the edge.

In summary, today there is less complexity in implementing IoT applications and providing a layer on top of existing brownfield manufacturing sites. Today, it’s all about how to provide integration with business applications. 

Jash Bansidhar is managing director at Advantech Europe.


This UrIoTNews article is syndicated from Google News.
