Part 2: How to build an Integration Architecture for the Intelligent Enterprise
A Bridge has a special meaning for any Architecture…
We say that Enterprise/Solution Architecture (in general) is bridging the gap between business and technology, but Integration Architecture (in particular) is also bridging a gap – it is bridging the functional & technical gap between different Systems and Services…
In my previous article How to build an Integration Architecture for the Intelligent Enterprise, I talked about the Methodology and Integration Services. In this article, I will go through some practical examples – how to put all of this to use.
And of course, let me place my usual disclaimer – all this is just my view. I am only sharing a Solution Concept.
We start with Integration Styles and Use Case Patterns from SAP ISA-M
When building the overall Integration Architecture on the Enterprise level, needless to say, it is mandatory to have a holistic view of all integrations. There are many EA tools (and other tools in general) which can be used for documenting Integration Architecture – we may create an integration “map” and its “visual representation” using e.g. LeanIX, Ardoq, or (more-or-less) any other EA tool of choice.
Regardless of the EA tool(s) of choice, we can also create a simple Integration Matrix – as a “one pager” catalogue:
- Listing all Integration Services;
- Linking Integration Services with corresponding Systems;
- Grouping and Mapping per Integration Styles and Use Case Patterns;
- Indicating some basic “info” – in/out-bound etc.
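To make the idea concrete, here is a minimal sketch of such an Integration Matrix as structured data, with a small lookup helper. All service names, systems, styles and URLs below are illustrative assumptions, not taken from any real landscape:

```python
# Minimal sketch of an Integration Matrix "one pager" as structured data.
# All entries are illustrative assumptions.

INTEGRATION_MATRIX = [
    {
        "service": "Sales Order Replicate",
        "master_system": "S/4HANA",
        "client_systems": ["B2B Portal"],
        "style": "Process Integration",           # SAP ISA-M Integration Style
        "use_case_pattern": "A2A",
        "direction": "outbound",                  # seen from the Master System
        "docs_url": "https://example.org/wiki/sales-order-replicate",
    },
    {
        "service": "BP Customer Replicate",
        "master_system": "SAP MDG",
        "client_systems": ["S/4HANA", "SAP Ariba"],
        "style": "Data Integration",
        "use_case_pattern": "Master Data Synchronization",
        "direction": "outbound",
        "docs_url": "https://example.org/wiki/bp-customer-replicate",
    },
]

def services_for_system(matrix, system):
    """List all Integration Services in which a given System participates."""
    return [
        row["service"]
        for row in matrix
        if system == row["master_system"] or system in row["client_systems"]
    ]
```

A helper like `services_for_system` already answers the most common question such a catalogue gets: “which integrations does System X participate in?”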
I like putting all this in a table… Optionally, each Integration Service name can be a “clickable” hyperlink directing to the web page where the (functional & technical) details are captured – of course, assuming we are using web-based tools for building and documenting Integration Architecture.
Now, Integration Services are linked with corresponding Systems and Interfaces, all of which have their own characteristics. E.g.
- System has its Business Objects etc.;
- Integration Service has its Integration Patterns, Integration Function, Data Integration Scope, Integration Execution, and defines the Master System and Client System(s) etc.;
- Within Interface, we define Application Interface (e.g. SOAP, OData, IDoc, RFM/BAPI etc.), Data Formats (e.g. XML, JSON etc.), Application Protocol (e.g. HTTPS, RFC etc.), Method (POST/GET/PATCH/DELETE and PUSH/PULL) etc.;
- … and Interface has associated Field Mapping, defining mappings and transformations in the Request & Response Messages;
Both, functional & technical details, would be captured in Integration Service documentation.
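As a small illustration of the Field Mapping part, here is a sketch of a source-to-target mapping with per-field transformations applied to one Request Message payload. All field names are made up for the example:

```python
# Sketch of a Field Mapping for one Interface: source-to-target field
# names plus optional per-field transformations. Field names are
# illustrative assumptions, not a real API contract.

FIELD_MAPPING = {
    "SalesOrderID": "order_number",
    "SoldToParty":  "customer_id",
    "NetAmount":    "net_amount",
}

TRANSFORMATIONS = {
    # e.g. the Client System expects the amount as a number, not a string
    "net_amount": float,
}

def map_payload(source, mapping=FIELD_MAPPING, transforms=TRANSFORMATIONS):
    """Apply field mapping and transformations to one message payload."""
    target = {}
    for src_field, tgt_field in mapping.items():
        if src_field in source:
            value = source[src_field]
            transform = transforms.get(tgt_field)
            target[tgt_field] = transform(value) if transform else value
    return target
```

The same mapping table can of course be maintained as configuration rather than code – the point is only that mapping and transformation are declared in one place per Interface.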
Scope of integration
Let’s be clear on one thing – integration is much more than configuration of the Middleware…
In ideal circumstances, we would activate e.g. a standard OData service in S/4HANA, activate a standard service on the Client System(s) and deploy a pre-prepared Integration Package in SAP CPI – and all works like a charm… While this ideal scenario might be okay in some cases (especially when working with Public Cloud solutions), in real life it is usually not so simple…
There are four main components in designing & building an integration, which we must consider:
- Understanding the “context” – one Integration Service may be related with other Integration Service(s) – e.g. an event may (or may not) trigger additional GET message;
- Designing & building of the service endpoint (API, or event API, or…) in the Master System;
- Designing & building of the service endpoint(s) in the Client System(s);
- Designing & building of the integration flows in the Middleware.
Understanding the “context” is the key to making decisions about the specific integration flavor and technology to be used. This “context” tells us what the functional and non-functional requirements are, so we can decide if we need e.g. Sync OData integration, or if Async SOAP will do just fine… Or, are we supposed to PUSH or PULL data… Or, is this integration based on a specific trigger (event?), or is it executed by a scheduled job… Or, how is this particular integration flow integrated with the overall business process… The “context”, together with some Integration Architecture principles (within our Organization), will guide us in making many other choices as well…
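The “context” questions above can be imagined as a (greatly simplified) decision helper. The rules below are my own toy assumptions, not a complete decision tree:

```python
# Toy decision helper reflecting the "context" questions: does the caller
# need an immediate answer, may it block, how much data moves per call?
# The rules are simplified assumptions for illustration only.

def suggest_execution(blocking_caller_ok: bool,
                      needs_immediate_response: bool,
                      data_scope: str) -> str:
    """Suggest Sync vs. Async Integration Execution from context answers."""
    if needs_immediate_response and data_scope == "single_record":
        return "Sync"
    if data_scope in ("partial_dataset", "full_dataset"):
        return "Async"   # batches and mass data should not block a session
    return "Sync" if blocking_caller_ok else "Async"
```

In real life the decision also weighs principles, platform constraints and licensing, so treat this as a conversation starter, not an oracle.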
Do we try to use standard APIs and integration flows at any cost?
Remember, we build for business needs; APIs should not dictate the integration (please see my previous article).
If we design & build a scalable and reusable Integration Architecture, we will ordinarily prefer all API endpoints to be “as standard as possible” and build transformations within integration flows – we would try to avoid overly customized API endpoints which fit the needs of only one or two integration scenarios (as we might wish to re-use some of them in other “future” scenarios)… Sounds good, but do we always apply this approach?
No, not necessarily…
Let’s imagine the following situation: an Integration Service is usually built on top of a specific Business Object in the Master System; however, the technical representation of this Business Object in the Client System may be very different… This would imply the Client System actually has several “standard” endpoints which must be accessed by the Middleware – following the specific sequence and logic of the Client System.
E.g. a Sales Order in S/4HANA has its Header data and may have one or more associated Items & Partners. As far as S/4HANA is concerned, all these elements are part of the same Business Object, and they are exposed “jointly” within various Integration Services – e.g. Sales Order Replicate, Sales Order (A2X) etc.
The Client System, however, might store Header, Item and Partner associations in separate technical tables, and it might not have a single endpoint for all of those tables.
So, what do we do?
- We can create an integration flow in the Middleware (e.g. SAP CPI) implementing the specific logic and calling the individual endpoints – this, however, contradicts some SOA principles; and it may also be quite complex to implement an “update” Integration Function – where records in the reference tables may be added, removed or physically updated (e.g. updating a Sales Order may update Header data, but could also add/remove/update Items);
- Or we can build a custom API in the Client System, with a single endpoint – the logic would be implemented where the “tables” reside…
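The first option can be sketched as a small orchestration function: the Middleware splits one Sales Order Business Object across the Client System’s per-table endpoints, following the Client System’s sequence. The endpoint calls are injected as plain callables here, and all names are illustrative:

```python
# Sketch of middleware-side orchestration: one Sales Order Business
# Object is split across the Client System's Header/Item/Partner
# endpoints in sequence. Endpoint callables are injected so the
# orchestration stays testable; all names are illustrative assumptions.

def replicate_sales_order(order, post_header, post_item, post_partner):
    """Call per-table endpoints in the order the Client System expects."""
    results = [post_header(order["header"])]       # Header must go first
    for item in order.get("items", []):
        results.append(post_item(item))
    for partner in order.get("partners", []):
        results.append(post_partner(partner))
    return results
```

The branching this function would need for the “update” case (added vs. removed vs. changed Items) is exactly where this option becomes complex, as noted above.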
In some cases, building everything in the Middleware is the easier and better choice – and please note, Sales Order is a simple example; let’s think about e.g. BP Customer and all related reference assignments and attributes…
In other cases, building a custom API in Master System or Client System(s) would be a faster and easier approach – and still would enable reusability.
Actually, in specific use cases there are certain constraints “forcing” us to go in one or the other direction.
E.g. in Microsoft Azure (either Power Platform apps or “real” Azure cloud-native apps) there are limitations on how many API calls can be executed in a specific period of time – and this is not uncommon when building solutions in the cloud. Some limitations are related to licenses, and some are related to service protection or throttling.
Obviously, if we have an integration-intensive scenario – where many payloads are exchanged with a Power Platform app or Azure app, and each requires several API endpoint calls – building this integration flow in SAP CPI might not be a viable option. In this case, the preferred option would be to build a custom API within the Power Platform app or Azure app – using Azure integration capabilities – and expose only a single endpoint to SAP CPI.
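When such limits apply, the calling side typically retries with exponential backoff. A minimal sketch, assuming the platform signals throttling with HTTP 429 (a common convention, not a statement about any specific Azure service):

```python
# Retry a rate-limited call with exponential backoff. Assumes the target
# platform signals throttling with HTTP status 429; the call itself is
# injected as a callable returning (status, body).

import time

def call_with_backoff(call, max_retries=3, base_delay=1.0, sleep=time.sleep):
    """Keep retrying while throttled, doubling the wait each attempt."""
    for attempt in range(max_retries + 1):
        status, body = call()
        if status != 429:            # not throttled: success or real error
            return status, body
        if attempt < max_retries:
            sleep(base_delay * (2 ** attempt))   # 1s, 2s, 4s, ...
    return status, body              # still throttled after all retries
```

Injecting `sleep` keeps the helper testable; in a real adapter you would also honor a `Retry-After` header when the platform provides one.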
This was only one example…
A bit more on SAP ISA-M (again)…
Speaking of endpoint services and integration flows, let’s say the Master System is S/4HANA; we have both SAP and non-SAP Client Systems, and we will use SAP CPI as the Middleware.
SAP does provide many out-of-the-box standard APIs and Integration Packages in SAP API Business Hub, for both SAP and non-SAP products – and this is where we should start.
In many cases, it might be needed to extend standard CDS OData services or DRF SOAP services (or IDocs etc.), and SAP provides extensibility for various integration options – designing API extensibility (when needed) would be the next step.
But in some cases, it might happen that we need to build a custom API – standard is preferred whenever possible (if you ask me), but if it doesn’t suffice (even with extensibility), we should not run away from building a custom API…
As part of SAP ISA-M, SAP Integration Suite provides various accelerators and a unified overview of integrations. Besides Configure and Settings, we can:
- Request best-fit (proposal) Integration Service – e.g. based on the questionnaires;
- Analyze Integration Areas, Integration Policies, Application Overview, Integration Technologies Overview – e.g. per Domain, Styles, Key Characteristics; also cross referenced per Integration Use Cases and much more…
Request can give us clear guidelines on whether we can use standard APIs, with or without extensibility, or whether a custom approach is a better fit.
What about other non-SAP systems?
Well, many Vendors do provide standard APIs (or events, or webhooks etc.) with their products. Many also provide their own lightweight Middleware for their own API configuration. Some even provide full-blown solutions for integration with various SAP products…
The principle here would be similar to SAP products – let’s try with what is available as standard; if standard is not sufficient, then extend it; and if the previous options are not feasible, then let’s build a custom API within that platform.
If we are building our core integration capability around SAP CPI, I would recommend exploring the available integration packages – even if there is no integration package for the specific non-SAP product/service we want to integrate, there might be an integration package for similar products (or line of business) which could help us build a new integration flow for our needs.
Integration Patterns through examples
Okay, let me provide some examples of various Integration Patterns.
I will start with a few basic scenarios.
Data Migration – e.g. through an extraction program or report, data is exported from the Legacy ERP System into the new S/4HANA Master System. This process is usually very specific and a one-time effort. It is usually executed semi-automatically or manually.
Broadcast – e.g. S/4HANA is broadcasting Sales Area Replicate to multiple subscribed Systems. S/4HANA is exposing this Integration Service via its Data Replication Framework (DRF), and it is not expecting any response from the subscribed Systems. There is one data flow from S/4HANA to SAP CPI. In SAP CPI there are different flows (multicast of the message) for each subscribed System. In SAP CPI we can also filter and transform messages or its formats per need, as well – e.g. only some Sales Areas are sent to System A etc.
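The Broadcast pattern with per-subscriber filtering in the Middleware can be sketched like this; subscriber names and filter rules are illustrative assumptions:

```python
# Sketch of the Broadcast pattern: one inbound Sales Area message is
# multicast to subscribed Systems, with a per-subscriber filter applied
# in the middleware. Subscribers and filters are illustrative.

SUBSCRIBERS = {
    "System A": lambda msg: msg["sales_org"] in ("1000", "2000"),
    "System B": lambda msg: True,   # System B receives every Sales Area
}

def broadcast(message, subscribers=SUBSCRIBERS):
    """Return the Systems the message is forwarded to (fire & forget)."""
    return [name for name, keep in subscribers.items() if keep(message)]
```

No response is collected from any subscriber, matching the “no response expected” nature of the pattern.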
One-way synchronization – e.g. the B2B Portal takes a Sales Order; the message is sent to S/4HANA via SAP CPI; S/4HANA processes the message and responds back to the B2B Portal, via SAP CPI. In SAP CPI we may perform some filtering and transformations, although in this case (when making a Sync call) it is preferred not to put too many transformations in the Middleware (to avoid any additional latency). Technically, SAP CPI is usually just a pass-through…
Why using SAP CPI for pass-through?
We may still benefit from the unified view of all Integration Services “in one place” – e.g. monitoring, additional orchestration (multiple Systems taking orders, or multiple S/4HANA instances covering different regions) etc.
Two-way synchronization – e.g. mutual synchronization of BP Vendor/Supplier data between S/4HANA and SAP Ariba. Using standard SOAP (DRF in S/4HANA), both Systems replicate changes to the other System, via SAP CPI; the other System processes each message and returns a confirmation message (success or error), again using standard SOAP, via SAP CPI. In SAP CPI we may perform some filtering and transformations.
Data Correlation – e.g. two Sales Agents create the same Account in two different Systems for Direct and Indirect Sales – those Accounts will be synchronized; but if only one Sales Agent creates an Account in the System covering Direct Sales, this Account will not be synchronized to the other System covering Indirect Sales (it will not be visible to the other Sales Agent);
Data Aggregation – e.g. ETL of Account Receivables from S/4HANA into SAP BW.
“Real life” scenarios are usually more complex – combining multiple Integration Patterns.
Let’s observe BP Customer integration using SAP MDG and SAP MDI. We can have the following scenario (as an example):
- SAP MDG and SAP MDI have Two-way synchronization; using PUSH SOAP in between – basically SAP MDG is both Provider and Consumer toward SAP MDI;
- S/4HANA is consuming data from SAP MDI and this is One-way synchronization; using PUSH SOAP in between – S/4HANA is only Consumer from SAP MDI (this is PUSH, the arrow on the diagram shows Request Message with data from SAP CPI, and Response Message toward SAP CPI);
- System A and SAP MDI have Two-way synchronization via SAP CPI; using PUSH OData in between – System A is also both Provider and Consumer toward SAP MDI;
- System B is consuming data from SAP MDI via SAP CPI and this is One-way synchronization; using PULL OData, where System B pulls data from SAP CPI endpoint – System B is only Consumer from SAP MDI (this is PULL, the arrow on the diagram shows Request Message toward SAP CPI, and Response Message with data from SAP CPI)
- SAP ECC (e.g. legacy, but still in use) is getting replicated data from SAP MDI via SAP CPI and this is Data Replication; using PUSH IDoc in between – SAP ECC is only Consumer from SAP MDI;
- … and there could be other connected Systems as well;
Demystifying integration terminology
At the end, let me demystify some of the most commonly used integration terminology.
Sync vs. Async
As already explained in the previous article, Sync Integration Execution is single-threaded – only one operation will run at a time:
- the Requestor (System sending the Request Message) will wait for the response from the Responder (System sending the Response Message);
- the session for the Requestor is “frozen” until the response is received;
Sync would be usually used for the following Integration Patterns:
- One-way synchronization;
- Two-way synchronization;
Sync can be used for all Integration Functions:
- CRUD (Create-Read-Update-Delete) – e.g. POST/GET/PATCH/DELETE;
And, as far as Data Integration Scope goes, we will ordinarily use Sync for Single record integration or, in some cases, for Partial dataset integration.
The Sync approach makes the most sense either when we want to encapsulate the transaction (e.g. POS payment, eCommerce etc.), or when we need to ensure consistency of the data transmission between Business Systems (as part of a unique/integrated business process flow) – because it is (near-)real time, and because it provides the response as part of the integration flow (e.g. success or error).
An example could be POS payment, where Customer invokes credit card payment at the counter, and awaits response (confirmation/rejection). Customer needs to await successful completion of the transaction before taking his/her goods from the counter. Customer & his/her goods are “blocked” until the transaction is completed.
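In code, the defining property of Sync is simply that the requestor blocks inside the call and knows the outcome before doing anything else. The payment responder below is a stand-in for illustration, not a real payment API:

```python
# Sketch of Sync execution: the Requestor blocks until the Responder
# returns, and the outcome is known within the same call. The responder
# is injected as a callable; it stands in for a real payment service.

def authorize_payment(amount, responder):
    """Single-threaded Sync call: nothing else happens until we get an answer."""
    response = responder({"amount": amount})   # session "frozen" here
    return response["status"] == "approved"
```

The Customer at the counter is, in effect, the “frozen session”: goods change hands only after `authorize_payment` returns.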
Async Integration Execution is multi-threaded – multiple operations can run in parallel:
- the Requestor (System sending the Request Message) does not wait for the response; the Responder (System sending the Response Message) will send the response “at some point of time” (*);
- there is no session freezing for the Requestor – it can work on other operations;
*) in the Data Migration and Broadcast Integration Patterns, the Requestor will not expect any response.
Async approach, in fact, can be used for all Integration Patterns:
- Data Migration – but through specific tools and processes; very often this is an only partially automated process;
- Broadcast – where the Requestor will “fire & forget” the message “payload” and will not expect any response – usually this is a message PUSH to the subscribed System(s);
- One-way synchronization;
- Two-way synchronization;
Async would be usually used for the following Integration Functions:
- CUD (Create-Update-Delete) – e.g. PUSH message “payload”;
- R (Read) – e.g. by invoking PULL from the Responder (to return the “payload” with the Response Message);
As for Data Integration Scope, we can use Async for integrating:
- Full dataset object – e.g. all data instances of the Business Object as a part of the Data Migration (full load);
- Partial dataset – e.g. range of the Business Object as a part of the Aggregation (delta load ETL for DWH);
- Single record – one (or only a few) instance(s) of the Business Object.
The Async approach makes the most sense either when we want to send/receive a larger set of data, or when we need to send/receive data without blocking the sender while awaiting the response, or when delivery time/confirmation is not critical/relevant for the ongoing operation – because it can be run as a batch job (particularly suitable for large data volumes), and because the response is not needed for the sender to continue working on other operations.
As an example, Customer pays an invoice (for goods or services) via bank transfer (clearing) from his/her account to Seller’s account. Payment is processed “at some point of time” (e.g. next business day). Seller receives the payment and clears the invoice. Customer will receive a confirmation of the payment in the bank statement – or in some cases also from the Seller.
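The Async flavor can be sketched with a plain in-memory queue standing in for real middleware: the requestor returns immediately after enqueueing, and processing happens “at some point of time”:

```python
# Sketch of Async execution: the Requestor enqueues and continues; a
# worker processes messages later. A plain in-memory deque stands in
# for real middleware here.

from collections import deque

class AsyncChannel:
    def __init__(self):
        self.queue = deque()
        self.processed = []

    def push(self, message):
        """Requestor side: fire & forget, returns immediately."""
        self.queue.append(message)

    def drain(self, handler):
        """Responder side: process everything queued so far."""
        while self.queue:
            self.processed.append(handler(self.queue.popleft()))
```

The bank-transfer example above maps directly onto this: `push` is the payment order, `drain` is the overnight clearing run.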
PUSH vs. PULL
Requestor System is the one initiating the integration. It can either PUSH or PULL data.
PUSH means the Requestor is sending the data (or event notification) in the Request Message to the Responder, where the Responder may send a Response Message with e.g. a confirmation or an error code. Integration Execution can be either Sync or Async – it does not matter. For Sync, it is usually the POST, PATCH or DELETE method (although a complex MERGE is also possible). If the Requestor is invoking some action in the remote System – this is considered PUSH, and e.g. if it is Sync, it would normally be the POST method.
PULL means Requestor is requesting the data with Request Message from the Responder, where Responder will send Response Message with e.g. requested data or error code. Here as well, Integration Execution can be either Sync or Async. For Sync, it would normally be GET for read data operation.
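PUSH vs. PULL seen from the Requestor’s side can be sketched as follows, with `transport` standing in for the Middleware; the method names simply follow the HTTP conventions mentioned above:

```python
# PUSH vs. PULL from the Requestor's point of view. `transport` stands
# in for the middleware; it takes an HTTP-style method and a body.

def push_record(transport, record):
    """PUSH: the Requestor sends data; the Responder confirms (e.g. POST)."""
    return transport("POST", record)

def pull_record(transport, record_id):
    """PULL: the Requestor asks for data; the Responder returns it (e.g. GET)."""
    return transport("GET", {"id": record_id})
```

In both cases the Requestor initiates; only the direction in which the payload travels differs.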
On-change vs. Scheduled job
What is triggering the Integration flow?
It can be:
- some “change” – like a change pointer in the record, specific event etc.
- some “job” – scheduled or manually initiated;
On-change is used for both Sync and Async execution, either to PUSH or PULL data, usually Single record only.
E.g. a user creates a Sales Order in the remote Client System, which triggers a Sync PUSH message (POST method) toward the Master System. Or, a user opens the screen to see Sales Order details in the remote Client System, which triggers a Sync PULL message (GET method) to read data from the Master System. Or, whenever the Sales Order status is changed in the Master System, it triggers an Async PUSH toward the Client System(s).
A Scheduled job is primarily used with Async execution, either to PUSH or PULL data, usually for a Full dataset (e.g. full batch load) or Partial dataset (e.g. delta load based on change pointers). Sync execution would not make much sense when sending many records in batches.
E.g. a scheduled job in the Master System Async PUSHes the daily Price List to the Client System(s). Or, a user action in the Client System starts a manual job to Async PULL all Stock data (for the specific warehouse) from the Master System.
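A scheduled delta job based on change pointers can be sketched as follows; the record layout (a `changed_at` timestamp per record) is an illustrative assumption:

```python
# Sketch of a scheduled delta job: only records changed since the last
# run (tracked via a change pointer timestamp) are pushed. The record
# layout is an illustrative assumption.

def delta_load(records, last_run, push):
    """Async PUSH of the Partial dataset changed since `last_run`."""
    changed = [r for r in records if r["changed_at"] > last_run]
    for record in changed:
        push(record)
    return len(changed)   # number of records sent this run
```

A full load is the degenerate case of the same job with `last_run` set before all change pointers.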
And these are just a few examples…
As indicated earlier, all this is just a Solution Concept, just an idea “how to do”…
Although this article, together with the previous one, could serve as a good guideline, there is no strict rulebook on how to build a “sustainable” Integration Architecture – the Architect responsible for the Strategy in the respective domain will build the Integration Architecture in his/her Organization. And again, please note: “one size does not fit all” (Organizations)…
I am inviting you to keep following relevant blogs and community resources, post and answer questions, and read other posts on the integration topic.
And of course, share your thoughts and comments on my article, in the comments section.
*) Some of my views on Agile, EA and Integration Architecture – I have shared within SAP Community Groups: Enterprise Architecture in the era of Agile… and Agile EA – from SOA to Interoperability
*) Intro photo by Ivan Aleksic on Unsplash
More guidelines on Solution Diagrams & Icons can be found in this article by Bertram Ganz.
- SAP ISA-M: Integration Methodology | Services and Support
- LeanIX: Integration Architecture – The Definitive Guide
- Ardoq: Data Flow & Integration Architecture
- Microsoft Power Platform: Requests limits and allocations – Power Platform | Microsoft Learn
- Microsoft Power Platform: Service protection API limits – Finance & Operations | Dynamics 365 | Microsoft Learn
- Microsoft Azure: Request limits and throttling – Azure Resource Manager | Microsoft Learn
- Microsoft Azure: Azure Integration Services | Microsoft Azure
- SAP Integration Suite: Product Features | SAP Integration Suite
Hi Goran Stevanovic ,
Thanks for the excellent blog. Is it possible for you to share the Integration matrix template or other templates used to capture all interface inventory?
Sorry, cannot share files on SAP Blogs... Contact me on social profile (LN)...
Great blog post, Goran Stevanovic! Thank you for sharing.
Hi Goran. I enjoyed getting a tidbit more of a taste on ISA-M. I've been studying the template (V4), completed the OpenSAP course on ISA-M, and I'm now trying to find more information on how other organizations are customizing the configuration to meet their specific needs as well as try to find any information where/how an organization is using this in the real world context. I'm finding lots of theoretical aspirations for the value it is supposed to provide but I'm not finding any concrete examples where these have come to fruition. Another blog digging deeper into these two aspects would be awesome.
Sorry for responding with delay.
In particular (very operational), when working on SAP Integration Suite - I would use Request to find a recommended best-fit proposal - which Integration Service to use.
You will also be able to use e.g. Analyze tab for having a holistic overview of all Integration Areas, Integration Policies, Application Overview, Integration Technologies Overview – e.g. per Domain, Styles, Key Characteristics...
However, what I find most useful - is SAP API Business Hub, many/many (described) APIs & accelerators etc. Of course, one does not have to formally use ISA-M; "searching" through available APIs & accelerators (packages) could do the trick as well (instead of Request)…
In general, I find the approach of ISA-M extremely helpful to have a holistic view of all Integration Services and its dependencies - and I work in the environment which includes many (really many) both SAP and non-SAP systems. ISA-M is just one part of the puzzle. I observe ISA-M as an approach/methodology, not as a rule book.
Ok. Thank you. I'll keep all this in mind as I continue to explore. Appreciate the reply!