
Introduction

There is a lot of talk around Artificial Intelligence (AI) right now, and this trend will not subside any time soon.  You may have seen me write about the concept of an Inverse Historian (Historian-1) in the past as a means to manage the various digital signals generated in manufacturing and put them to use at scale.  The concept of an operational historian is a simple one: it fills a basic need engineers have to understand the waveforms of signals and their relationships to each other.  These digital signals occur at a speed far too fast for people to consume directly and over periods of time too vast for people to use without help.  Here is a good article that covers the current situation of “Manufacturing Operations and Data Management” as we see it in manufacturing and sets the stage for the Inverse Historian (Historian-1) paradigm.

The purpose here is not to explain what a historian is, but rather the value it provides, and then to re-think how to fill the same need and deliver the same capabilities while leveraging the new world of technology that has recently arisen.  Digital signals alone will not provide enough value to impact business decisions or guide the behavior of employees.  For that to happen, they must be tied across the multiple business processes (and supporting systems) they relate to, in a way that does not involve a lot of point-to-point connections for each application.  The key point here is that not all control data is relevant for the business, just as not all business data is relevant for control systems.  However, there is data that is relevant for both, and the ability to map this data together over long periods of time is where the real value is.  This is where a data management strategy will benefit an organization the most, along with its aspirations to scale the use of AI to bring meaningful impact to the business.

UNS compared to OONS

There is a lot of talk about Unified Namespace (UNS) and what it brings to the table, and this approach of a linear namespace that maps to operations data is not a new one.  This article published by the HiveMQ team is a good reference for the methodology.  In fact, this same approach could be accomplished with SAP MII using the Plant Information Catalog to remotely manage Plant Connectivity many years ago (The Inverse Historian v1.0 and Streaming Data in the Industrial Space).  I have also had a few discussions on UNS with folks in the industry, including a recent one with Walker Reynolds, published here on YouTube, that I frequently hear back about.

For SAP Digital Manufacturing (SAP’s MES), the difference now is that technology has shifted on both sides of the architecture.  The application layer has moved into a multi-tenant cloud where it is managed in real time, and the connectivity layer has moved from RESTful to MQTT (via an MQ Broker) for the upward feeds.  There is also a different approach that moves away from a linear namespace in favor of an Object Oriented Namespace (OONS).  This may at first seem minor, but it has a big impact on the scale of applications and the way they are supported on the backend.  It also impacts the maintainability of the data modeling layer, as there are no longer deep tree structures with lots of jumbled-up data to accommodate the different systems (Operations, Maintenance, Environmental, Safety, Labor, etc.).  You may already be familiar with ISA-95, as this industry-standard operating model has been around a long time.  There are lots of details on it, and the article “Beyond the Pyramid…” is very relevant to this discussion, as it explains how there are interlocking references that are not represented by a linear hierarchy.  Rather, it points out a concept that has existed in SAP ERP for a long time: business object models for various processes that also relate to each other.  This same concept has been inherited by SAP Digital Manufacturing, where multiple objects reside and provide context to the processes and systems they interact with (application & control).
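To make the distinction concrete, here is a minimal, purely illustrative sketch (the names and structure are hypothetical, not SAP’s actual data model): a linear namespace encodes all context in one deep topic path, while an object-oriented namespace models work centers and equipment as objects that reference each other, so a re-assignment is a single reference change rather than a rename of every path.

```python
from dataclasses import dataclass, field
from typing import List

# Linear (UNS-style) namespace: all context is baked into one deep topic path.
linear_topic = "acme/plant1/packaging/line3/filler7/pressure"

# Object-oriented namespace: context lives on objects that reference each other.
@dataclass
class Equipment:
    equipment_id: str
    functional_location: str
    tags: List[str] = field(default_factory=list)

@dataclass
class WorkCenter:
    work_center_id: str
    cost_center: str
    equipment: List[Equipment] = field(default_factory=list)

filler = Equipment("FILLER-7", "PLANT1/PACK/LINE3", tags=["pressure", "mass_flow"])
line3 = WorkCenter("LINE-3", "CC-4711", equipment=[filler])

# Re-assigning the asset is one reference change; its identity and tag
# definitions (and therefore its history) stay intact.
line4 = WorkCenter("LINE-4", "CC-4712", equipment=[filler])
line3.equipment.remove(filler)
```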

Data Management as a Service

The new system has brought about tremendous value in the way automation data can be scaled to merge with IT/MES data, and this novel approach sets a very high standard for operations data management.  The design-time management of the various “digital signal” types (Event, Continuous, and Meter), and the application of process context to them in line with the readings, ensures that their data values and context stay relevant to the business application layer over time.

For example, if I have an equipment state tag and an ambient temperature reading in a work center, and a pressure & mass flow reading inside an asset, how can I ensure that quality, operations, and maintenance all have visibility to this without three separate application connections?  Now re-assign the work center to a new one and add another, similar piece of equipment in parallel.  With a traditional approach of independent historian, MES, and ERP environments, this would very quickly break the references to the previous setup and make contiguous reporting very difficult.  By embedding the context into the data feed directly, we now have a timeseries history not only of the data values but also of the context as the plants evolve over time.  Now that the values and their context are free-flowing and aligned with ERP/MES and the OT/IoT mappings, we have a stable and scalable way to manage data at design time and move data into an MQ Broker at runtime.  From here the endpoints are endless, but we (SAP) do have some unique value in putting that information into SAP Datasphere.
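As a minimal sketch of what “embedding the context into the data feed” can look like, the snippet below builds one contextualized reading.  The field names and values are hypothetical illustrations, not SAP’s actual payload schema:

```python
import json
import time

# One "Continuous" reading with its business context carried directly in the
# payload, rather than resolved later through a separate lookup table.
reading = {
    "timestamp": time.time(),
    "signal_type": "Continuous",          # Event | Continuous | Meter
    "tag": "pressure",
    "value": 4.82,
    "unit": "bar",
    "context": {
        "equipment_id": "FILLER-7",
        "functional_location": "PLANT1/PACK/LINE3",
        "work_center": "LINE-3",
        "cost_center": "CC-4711",
    },
}

# At runtime a document like this would be published to a topic on the MQ
# Broker; if the equipment is later re-assigned, only the context fields
# change, while earlier readings keep the context that was valid at the time.
print(json.dumps(reading, indent=2))
```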

Architecture

The key to this approach is like ‘open jaw’ travel, where you fly into one city and fly out of another.  The inverse historian approach matches this in that there is a design-time management channel for managing context and the logic for when certain events & streams are active.  Then there is a runtime side where the rules are applied and data flows from the controls to an SAP MQ Broker in a managed Kubernetes container in the plant.  This way, local applications and central ones subscribed to the various feeds can tap into the live operations data; see the diagram below:


Figure 1: Design time and run time setup of the Inverse Historian
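On the runtime side, a local or central application simply subscribes to the feeds it needs.  The sketch below is a hedged illustration only, assuming an MQTT broker reachable on the plant network, hypothetical topic names, and the open-source paho-mqtt client; it is not SAP’s actual interface:

```python
import json

import paho.mqtt.client as mqtt

# Called for every contextualized message the plant-side broker delivers.
def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    ctx = reading.get("context", {})
    print(f"{msg.topic}: {reading['tag']}={reading['value']} "
          f"(equipment={ctx.get('equipment_id')}, work_center={ctx.get('work_center')})")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)   # paho-mqtt >= 2.0
client.on_message = on_message
client.connect("broker.plant.local", 1883)               # hypothetical broker address
client.subscribe("plant1/contextualized/#", qos=1)       # hypothetical topic hierarchy
client.loop_forever()
```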

This approach now has the added value of being able to analyze data as it arrives, without having to refer to a lookup table in line with the data feeds, and readings can be related to each other at the point of ingestion.  Referring to the previous example, the work center, functional location, equipment ID, and cost center for each of the feeds mentioned earlier are now included with the control readings.  Any correlation of values in the dataset can be done easily (a short sketch of this follows Figure 2), and any events from SAP Digital Manufacturing are also readily available to map to the manufacturing process state and raise awareness with the relevant operations personnel.  This approach is outlined here and opens the door for innovation not only for SAP but also for our customer & partner ecosystem:


Figure 2: Overview of how Inverse Historian impacts operations
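As a hedged illustration of the analysis step described above, the sketch below assumes the contextualized readings have already landed in a table (the column names and values are hypothetical).  It correlates pressure against mass flow per equipment and surfaces readings that coincide with a DOWN equipment state, with no separate lookup table required:

```python
import pandas as pd

# Hypothetical contextualized readings, joined with their context at ingestion.
df = pd.DataFrame(
    {
        "timestamp": pd.to_datetime(
            ["2024-03-01 08:00", "2024-03-01 08:01", "2024-03-01 08:02",
             "2024-03-01 08:03", "2024-03-01 08:04", "2024-03-01 08:05"]
        ),
        "equipment_id": ["FILLER-7"] * 6,
        "work_center": ["LINE-3"] * 3 + ["LINE-4"] * 3,   # the context change is part of the history
        "pressure": [4.8, 4.9, 5.1, 4.7, 4.6, 4.5],
        "mass_flow": [120.0, 122.5, 126.0, 118.0, 116.5, 114.0],
        "equipment_state": ["RUNNING", "RUNNING", "RUNNING", "RUNNING", "DOWN", "DOWN"],
    }
)

# Correlate the two continuous signals per equipment.
print(df.groupby("equipment_id")[["pressure", "mass_flow"]].corr())

# Flag readings that coincide with a DOWN state so operations can be alerted.
print(df[df["equipment_state"] == "DOWN"])
```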

This methodology enables the partnerships you will see on the SAP Store, along with partner- and customer-led innovation, to address a wide range of needs across a diverse set of manufacturing processes.  Topics that are very niche for a single customer yet common across multiple industries can now be addressed with a common data management layer, and it’s all provided as a service.

Let me know how you plan to use it in your environment!