With the emergence of Industry 4.0 and IIoT, the complex legacy systems landscape and outmoded data management platforms of different industries are being taken online.
Big industrial enterprises buy and sell assets, and the need for integration - connecting and disconnecting assets - is becoming increasingly important.
There are many advantages to running large-scale operations and streamlining processes through digitalization, some of which are:
- Exploiting economies of scale to make operations more efficient and profitable
- Enabling on-demand production
- Reducing downtime through predictive maintenance
- Making data-driven decisions that help operations run more smoothly at all levels
- Simplifying reporting to different levels of the organization
Silos are still the rule, not the exception, for legacy systems
The norm in building OT (operational technology) and IT systems for industry has been for different vendors to tailor the OT and IT layers to a specific factory or plant, using different technologies and various standards. These systems, created by a particular vendor, have usually had little to no compatibility with other systems, thereby creating vendor lock-in.
Vendor lock-in means the customer has little chance to change their mind about which systems to use, or even to replace just a part of the legacy systems with better-working alternatives.
This makes for rigid systems with very little flexibility in a technological landscape that is changing and developing at an ever-increasing speed.
Also, it presents a problem when an enterprise consisting of many different factories or plants wants to integrate new assets into its operations.
There are also security considerations when the need arises to access the production environment remotely. Crossing firewalls and opening ports raises questions about how to control access rights and keep data encrypted.
What if there was a different way to organize and access your data, letting your data flow freely with no vendor lock-in?
The liberation of your data can be as transformative as you dreamt it to be when you first started out on the digitalization journey.
At Prediktor, we believe that this is the way to the future, but many challenges must be overcome.
The problem of digital disconnect
When you are dealing with digital disconnect, data from your OT layers are hard to retrieve, even harder to understand, and difficult or impossible to reach from the IT layer of the enterprise.
Part of the problem is data existing in silos, and part is a lack of context. Interpreting and understanding the data becomes a time-consuming, often impossible task, leading to a digital disconnect where information in the form of data cannot be used to optimize operations.
The cost of running a siloed data system
With different vendors using different standards for programming legacy systems and with one single industrial plant often using several vendors, data from the systems in question become difficult to access and bring from the OT to the IT layer of an operation.
The result is a massive need for human resources to retrieve and interpret data streams. At the same time, to take advantage of the digital revolution happening across industries worldwide, there is a growing need to connect these systems online and gain access to data across assets built on very different technologies.
Integration hell seems to be unavoidable
It is my opinion, backed by plenty of evidence, that vendor lock-in and siloed data lead to what I like to call integration hell. This is an expensive hell, and an extremely time-consuming one.
Conventional IT approaches to connecting the conglomerate of legacy systems to the IT and operational levels of an organization have proven difficult at best.
At its worst, one ends up in integration hell, where every little arrow in the illustration below represents an integration project that needs to be maintained whenever changes are made on any level of the organization.
This integration hell results in high-risk projects that last for years and never reach their goal, as integrating every asset becomes an endless project. Every time something changes, whether at the OT or the IT level of the organization, every integration of every asset has to be adjusted accordingly. Needless to say, this is highly costly and manpower-intensive.
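The cost argument can be made concrete with a back-of-the-envelope sketch. With point-to-point integrations, every OT system must be wired to every IT system, so the number of integrations grows multiplicatively; with a shared standardized layer, each system integrates once. The system counts below are invented for illustration.

```python
# Back-of-the-envelope comparison of integration effort:
# point-to-point wiring versus one shared, standardized layer.

def point_to_point_links(ot_systems: int, it_systems: int) -> int:
    """Every OT system integrated directly with every IT system."""
    return ot_systems * it_systems

def standardized_links(ot_systems: int, it_systems: int) -> int:
    """Each system integrates once, against a common standard layer."""
    return ot_systems + it_systems

# A modest enterprise: 12 OT systems across its plants, 5 IT systems.
print(point_to_point_links(12, 5))  # 60 integrations to build and maintain
print(standardized_links(12, 5))    # 17 integrations
```

Every change at either level then touches one adapter instead of a whole row or column of bespoke integrations.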
What does the contextualization of data do for integration?
The solution to the problems of siloed data, diverse legacy systems, and integration hell, I believe, is to contextualize data and standardize the communication protocols in your systems.
Through contextualization, your data are always described in the same way: similar things at the same level are given the same names and appear the same every time at the surface of your IT systems, regardless of assets being added or removed, and regardless of what legacy systems sit at the bottom.
OPC UA is a key in both contextualization and security
OPC UA is a standard that has been adopted as a core feature of the Industry 4.0 initiative coming out of Germany. It is an open communication standard that allows you to build a future-proof architecture.
OPC UA is also secure by design, using elements such as reverse connectivity (so that only an outgoing port needs to be opened from the server), encryption (e.g., using the Basic256Sha256 security policy), and access control (both application-level and user-level security) to keep data in transit as safe as possible.
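The reverse connectivity idea is worth illustrating, since it inverts the usual direction of connection. The sketch below mimics only that direction-of-connection principle with plain TCP sockets, not the OPC UA protocol itself: the server behind the plant firewall dials out to the waiting client, so no inbound port has to be opened on the OT side.

```python
# Direction-of-connection sketch behind "reverse connectivity", using
# plain TCP sockets (NOT the OPC UA protocol): the OT-side server makes
# the outgoing connection, then serves data over it.
import socket
import threading

received = []

def it_side_accept(listener: socket.socket) -> None:
    """IT-side client: passively waits for the OT server to call out."""
    conn, _ = listener.accept()  # the OT side initiates the TCP session
    with conn:
        received.append(conn.recv(1024).decode())

# The IT side opens a listening port (outside the plant firewall).
listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]
accept_thread = threading.Thread(target=it_side_accept, args=(listener,))
accept_thread.start()

# The OT side only needs an *outgoing* connection through its firewall,
# over which it then serves a data point.
with socket.create_connection(("127.0.0.1", port)) as sock:
    sock.sendall(b"Plant/Line1/Temperature=71.5")

accept_thread.join()
listener.close()
print(received[0])  # the reading arrived with no inbound OT port opened
```

In real OPC UA deployments the reverse-connect handshake, encryption, and access control are handled by the stack itself; this sketch only shows why the firewall rules become simpler.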
The future is near
When creating an OPC UA structure on top of legacy systems, you get as close to a “plug and play” solution as possible given the diverse standards of different vendors' legacy systems. This lets you integrate with ease: integration projects are done only once. The facility is mapped up and then made available via OPC UA, giving you access to data streams and information at all levels of your operations.
The data streams become easy to use through dashboards that let you dig into the data you need and make informed decisions at all levels of operations. And because OPC UA is an open standard, you are free to choose which vendor to use and are geared for the future with no vendor lock-in, as any vendor with access to and knowledge of the standard can program against and integrate with your systems.
- A system where data is contextualized and described in the same way, regardless of legacy systems.
- OPC UA framework
- As close to “plug and play” as it gets
- Map up a new facility and connect it to the OPC UA-based system
- This builds a future-proof architecture that allows you to shop around freely for new technologies and new assets
- You can choose which vendor to use, as you create a system that anybody using the same standard can program against and integrate with
- Open standard means “anyone” can access and understand how the system has been set up.
- OPC UA is “secure by design.”