Graham Robinson

SAP’s big challenge – Integration

Earlier this month I attended SAPPHIRENOW and ASUG Annual Conference as part of SAP’s Blogger program. It was a good opportunity to touch base with SAP and their leadership teams after the recent changes in personnel, the Qualtrics acquisition and the associated shifts in focus and strategy.

There were lots of interesting ideas and topics discussed, formally and informally, throughout the week – but the one I kept returning to was “Integration”.

The day prior to the conference I took part in the SAP Global Partner Summit. During the Opening Keynote SAP Executive Board Member Adaire Fox-Martin was asked what she saw as the biggest challenge SAP faced. Her answer was concise – “Integration”. Click on the video below to see the full question and answer.

Adaire Fox-Martin answers the big question

Although he did not address it directly in his SAPPHIRENOW keynote, Bill McDermott also mentioned integration several times during the week.

About two years ago – maybe a bit more – SAP finally got the memo and accepted that SAP-to-SAP integration is SAP’s problem. In some respects this was an acknowledgement of a customer pain point and a willingness to address it – but it also makes strategic sense given SAP’s cloud ambitions.

Arguably SAP has a differentiating position in the enterprise software market because their software touches all parts of the enterprise. Most of SAP’s cloud competitors have established capability covering one or two lines of business and are aiming to extend out from there. SAP currently offers solutions that cover the complete enterprise – but a true “cloud” offering should be a single solution spanning all LOBs rather than multiple solutions cobbled together. At least it should appear that way to their customers. 😉

Last year SAP announced Master Data as a Service (MDaaS – my abbreviation, not SAP’s). The goal of MDaaS was to have master data managed in a cloud service that could then be consumed by any number of applications – both SAP and non-SAP – to ensure a “single version of the truth”.

At the time SAP said they were starting with Business Partners (think Organisations, People, Customers, Suppliers, Employees, etc.) and Product Masters (think Products, Materials, etc.). Their aim was to deliver those two services before the end of 2018, with others to follow in 2019.

This is a big job. Step one is to come up with a universal data model covering the master data objects that takes into account all the requirements of existing (and future) consuming applications.

For example, how many characters should be in a surname? What is the priority and relevance of a surname in different geographies, countries, cultures, etc.? Should Business Partner master data include support for recording history as an organisation or person changes their name? Should Business Partner master data include support for differing tax requirements in different jurisdictions?

As I said – a big job.

Despite the scale of the task SAP announced the GA release of SAP Cloud Platform Master Data for Business Partners 1.0 on 5th November 2018. A GA release is an important milestone because it doesn’t just mean the product is available; it also means it has successfully passed some well-defined benchmarks during a customer beta phase. Subsequently version 1.1 was released during Q1 of 2019.

*Note – there is no sign yet of SAP Cloud Platform Master Data for Product Masters.

This seemed like a good start so I was keen to find out more – however it seems that MDaaS is one of those things SAP has revisited recently so things are a little “up in the air”. ¯\_(ツ)_/¯

It appears to me that the MDaaS initiative has been restarted. It is hard to figure out why this has happened but the best explanation I heard was that the work was being done in several silos and it was felt it should best be placed under a single area. This makes some sense because organisational silos have been a challenge for SAP for as long as I can remember.

I learned that SAP are now focusing on the top eight (I think) business objects and are close to completing the data modelling work mentioned above for each of those eight scenarios.

My presumption is that the Business Partner and Product Master data modelling work done last year would have been valuable input into this process but my impression is that a completely fresh look is being taken at those two business objects as well.

Personally I am happy if SAP need to revisit earlier work to get this right – because I see MDaaS as the vital piece of architecture and technology that will allow SAP to deliver on their promise of owning the integration issue. And if they can do that they will truly be able to offer customers a single solution for the entire enterprise – customers simply choose the “modules” they want to subscribe to.

Why so? Well first I would ask you to try to discard any thoughts you have about replicating master data from an MDaaS into your applications. That would just be adding one more integration challenge to the pile. Rather, think about refactoring all your applications so they consume master data via standard APIs.

If I have an on-premise edition of S/4HANA I may choose to provision my Business Partner master data in my S/4HANA system – but I could equally choose to provision it on SAP’s MDaaS offering. This choice would then be reflected in system configuration tables – not code – because both options would use identical APIs. The only real difference would be the service endpoint.
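A minimal sketch of what that configuration-driven choice could look like in a consuming application. All endpoint URLs, entity names, and function names here are hypothetical illustrations, not SAP's actual API:

```python
# Hypothetical sketch: the Business Partner API is identical whether it is
# served by an on-premise S/4HANA system or by a cloud MDaaS endpoint --
# only the configured base URL differs, so switching providers is a
# configuration change, not a code change.

CONFIG = {
    "business_partner_endpoint": "https://mdaas.example.com/api/v1",
    # Swap in the on-premise system by changing configuration only:
    # "business_partner_endpoint": "https://s4hana.internal.example.com/api/v1",
}

def business_partner_url(partner_id: str) -> str:
    """Build the request URL from configuration; the path is the same for
    every provider because both implement the same API contract."""
    return f"{CONFIG['business_partner_endpoint']}/BusinessPartners('{partner_id}')"

print(business_partner_url("1000042"))
```

The point of the sketch is that the application code never encodes which system owns the master data; only the configured endpoint does.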

Source: SAP Cloud Platform Product Overview L2 deck

Last year I shared my killer integration demo idea with Bjoern Goerke – and I did so again this year with Jan Schaffner who heads up SAP’s Central Engineering team. It goes something like this….

  • In my S/4HANA system I create a purchase order for a raw material.
  • I then decide I want to extend my SAP subscription to include the professional procurement module called “AribaProcure”. Hey, if SAP can change product names willy-nilly, so can I. 😉
  • I can immediately see my suppliers, including history, spend analysis, etc. I can initiate a tender or auction process for the same raw material which saves me some costs.
  • Now I decide to extend my SAP subscription to include professional CRM – “C/4”. I can immediately see all my customers, their histories, spot some new trends I may have missed with S/4 Simple CRM, etc.

I think that would be a very powerful demo. Especially if it only took five minutes. Especially if the UX completely hid the backend “modules” being consumed from the user and simply surfaced new functionality as the subscription details changed.

I don’t expect it this year because – as I said earlier – it’s a lot of work.


      Phil Cooley

      Thanks Graham Robinson - definitely a key challenge for SAP but also for most organisations. I am really excited for the master data objects being integrated into SAP Cloud Platform to then be utilised by applications. Powerful stuff. Thanks for writing!

      Christian Drumm

      Hi Graham,

      nice blog. I especially like your killer integration scenario. If this scenario works out of the box someday, SAP will have really solved the integration problem.

      One thing I'm still unsure about is the Master Data as a Service. While I totally understand the reasoning behind the approach, I just don't see it working anymore in a business application scenario involving cloud applications. One basic idea in Domain Driven Design is the bounded context. Basically (as far as I understand it), the idea is to design a data model that fits exactly the current application. Instead of trying to design a one-size-fits-all data model (the Master Data as a Service approach), the DDD approach accepts that there are differences in the data models of different applications. IMHO that fits a lot better with the concept of cloud applications.

      Take the Qualtrics acquisition as an example. When SAP started to develop the current version of Master Data as a Service this acquisition was not on the horizon. Consequently, the requirements of integrating Qualtrics could not have been taken into account. So after the acquisition there is probably a need to extend the BP in Master Data as a Service to also accommodate some data required by Qualtrics. In contrast, if we accept that the views of BPs are different in S/4HANA and Qualtrics, creating a central model is not required. Instead "only" a mapping between the data models is required.

      This is still a complex task and maybe a lot of work. But for me it seems to be a more realistic approach in a world of cloud applications.



      Graham Robinson
      Blog Post Author

      Hi Christian,

      thanks for your engagement and your insightful comments.

      I struggled with how to respond to you without resorting to a blog-sized comment – so my apologies if this gets too long-winded. I don’t really know where I am going to end up. Please read this as a stream of thoughts and by no means a fully considered answer to the issues you raise.

      Let me start by saying that I don’t see the MDaaS data models having to solve all the requirements of every consuming application. However I do see these data models addressing all – or almost all – generic requirements. That is, data structures, associations, etc. you would reasonably expect a common data model to have, i.e. ones that more than one application is likely to need. A simple Business Partner example would be “FirstName”, “FamilyName”, etc.

      Where an application has requirements that fall outside what is delivered in the generic model I would consider it the responsibility of that application to enhance the generic model to meet these needs. This might be simply mapping existing properties to domain-specific properties in the application model (e.g. “FamilyName”->”Surname”), suppression of unwanted properties or support for additional domain-specific properties (e.g. “IsFavourite”, “MarketSegment”, etc.).

      This provides a bounded context for each application while still allowing for maximum leverage of a single master data model.
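A rough sketch of what that application-side adaptation could look like – renaming, suppressing, and extending properties of a generic record. All field names here are illustrative examples, not SAP's actual model:

```python
# Hypothetical sketch of adapting a generic Business Partner record from a
# shared master data model into one application's bounded-context view.

GENERIC_RECORD = {
    "FirstName": "Ada",
    "FamilyName": "Lovelace",
    "TaxNumber": "123-456",  # generic property this particular app does not need
}

# Per-application adaptation rules: rename, suppress, and extend.
FIELD_MAP = {"FirstName": "GivenName", "FamilyName": "Surname"}
SUPPRESS = {"TaxNumber"}
APP_DEFAULTS = {"IsFavourite": False, "MarketSegment": None}

def to_app_view(generic: dict) -> dict:
    """Project a generic master data record into the application's own model."""
    view = dict(APP_DEFAULTS)  # start with the domain-specific extensions
    for key, value in generic.items():
        if key in SUPPRESS:
            continue  # drop properties this application has no use for
        view[FIELD_MAP.get(key, key)] = value  # rename where a mapping exists
    return view

print(to_app_view(GENERIC_RECORD))
```

Each application keeps its own view of the data while the single generic model remains the shared source.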

      To me this is not so different to what SAP are advising us to do to extend S/4HANA using the SAP Cloud Platform – assuming in-app extensions (or whatever they are called now) won't do. Extend the S/4HANA data model inside SCP and build a new SCP OData service that adds the new properties to an existing S/4HANA service.

      There is also not a lot of difference to the concept of creating a CDS consumption view for each new UI5 application you build – even if they all project the same underlying database tables.

      I hope that makes some sense.

      And, finally, I don’t think there is an alternative for SAP if they want to crack the integration problem. Any other strategy can only add to the integration challenges whereas MDaaS limits it to those application-specific (domain-specific) edge cases.


      Graham Robbo

      Daniel Graversen

      I would assume that the tool will allow some extension of the master data object to also encompass data you have in third-party tools. So an addition like Qualtrics would just mean adding the few extra fields and a mapping.

      Nabheet Madan

      As always a thought-provoking blog, sir! My two cents: given SAP's current approach toward overall development – which is more event-driven and API-driven, with an option to enhance the existing stuff via factory/Kyma etc. – it will not be that complex. I think the framework being followed is more of an industry standard and quite flexible.

      Particularly in the case of MDaaS, for the model to succeed it depends more on the functional knowledge which SAP has built over the last 40 years or so. The technical framework is more or less laid out; it is the functional knowledge which might impact the design.

      Overall I feel many steps have already been taken by SAP to solve the integration problem; let's hope we succeed.

      Markus Ganser

      "Things are a little up in the air" – this statement holds true in the sense that we are talking about the Cloud. In fact, domain model alignment (DMA) is in full swing, which is the foundation for truly integrated business processes. It is extremely important that applications can rely on a consistent data model that is typically shared between multiple applications in business processes: applications must speak the same business language and the data model must be technically compatible. For this reason, we are continually expanding the scope of DMA along key business scenarios.
      In this context, we are also continuously enhancing the master data services on SAP Cloud Platform. For example, we continue with SAP Cloud Platform for business partners as the master data sharing and integration layer in the cloud (based on an aligned data model) and SAP Cloud Platform for products is planned to come soon.



      Dev Karan Ahuja

      This is Dev Karan, APO for the Product Service, and I would like to clarify the matter.

      SCP Master Data for products service delivery is planned for H2/2019. The CRUD-capable service endpoints will be aligned with the Product DMA model. The DMA modelling exercise will conclude in the next few months and subsequent releases of product service will incorporate the DMA model enhancements.

      The service will provide central access to product data using a Fiori UI and OData/SOAP APIs. For the initial delivery scope, the data model will cover the core entities along with the Sales aspect.

      I look forward to sharing more details about the service and roadmap closer to the GA date. The standard disclaimer applies: all forward-looking statements are subject to various risks and uncertainties.

      Vincent Zhang

      Hi Graham,

      Nice ideas on the MDaaS. However, I think giving up replication may not be achievable from an engineering point of view, if I understand correctly what you mean here:

      "Well first I would ask you to try to discard any thoughts you have about replicating master data from a MDaaS into your applications. That would just be adding one more integration challenge to the pile. Rather think about refactoring all your applications so they consume master data via standard APIs."

      Because an API can do nothing but object-level access. There are lots of cases where we consume data as data sets. For example, a join of sales order items with customer master data cannot be achieved by any current API technology, only with SQL – which requires that the data is already in the same instance.

      If you want to achieve UI-level data manipulation through OData, it amounts to rebuilding a SQL parser in JavaScript. And you will never achieve reasonable performance.

      Master data is and will be replicated. The only question is who will act as the "single source of truth". And it is hard to expect that one can see everything in a single fact sheet considering SoD, DPP, performance, and integration costs. You cannot expect changes to master data to happen only in the central master data system. It will always be a world with complex integration flows.



      Graham Robinson
      Blog Post Author

      Hi Vince,

      thanks for your comments on my blog post.

      When considering architecture issues I favour a technology-agnostic approach because technology can and does change regularly – good architecture is not encumbered by technology changes but should facilitate them.

      I don’t feel the SQL-centric points you have raised are of real relevance here – but if I was to respond to them I would offer some of these perspectives…

      • Who says that backend data needs to be stored in a SQL database? As an API consumer I don’t care where the data is persisted as long as the service does what it claims it will do.
      • OData V2 query example showing a join of sales order items with customer master data
      • Distributed Databases
      • OData (created by Microsoft) & GraphQL (created by Facebook)
      • Etcetera
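To illustrate the second bullet: an OData query can traverse an association with $expand, which covers many "join" cases in a single request without the consumer ever touching the provider's tables. A minimal sketch that builds such a query URL – the service host, entity set, and property names are all hypothetical:

```python
# Hypothetical sketch: expressing "sales order items joined with their
# customer" as a single OData V2 request rather than a SQL join.

BASE = "https://host.example.com/odata/v2"

def order_items_with_customer(top: int = 10) -> str:
    """Build an OData query URL that fetches order items together with
    the related customer entity in one round trip."""
    query = (
        "$expand=ToCustomer"                            # follow the association (the "join")
        "&$select=OrderID,Material,ToCustomer/Name"     # project only the columns needed
        f"&$top={top}"                                  # limit the result set
    )
    return f"{BASE}/SalesOrderItems?{query}"

print(order_items_with_customer())
```

The service decides how to satisfy the request internally – SQL, a distributed store, or anything else – which is exactly the technology-agnostic point above.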

      Putting aside technology the strategic decision for SAP (and others) is do they truly want to build a suite of Cloud applications – or do they want to move their on-premise applications to the cloud? Because these are not the same thing.

      Allow me to draw your attention to what is known as the “Jeff Bezos Mandate”. The well remembered and oft-quoted points he made to his engineering teams are:-

      • All teams will henceforth expose their data and functionality through service interfaces.
      • Teams must communicate with each other through these interfaces.
      • There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team’s data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
      • It doesn’t matter what technology they use. HTTP, Corba, Pubsub, custom protocols – doesn’t matter. Bezos doesn’t care.
      • All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.
      • Anyone who doesn’t do this will be fired.

      The less remembered part – paraphrased by me – “Amazon is not an online bookshop or an e-commerce site. We are a platform company!”

      It is important for context to point out that the Bezos Mandate happened in 2002 – 17 years ago! If you look at where Amazon is now you would have to be a pretty disingenuous person to say that Bezos was wrong.

      But here we are in 2019 and there are surprisingly few companies that have followed the Amazon lead at scale. To be fair, for a lot of them it is easier to leverage an existing platform than to build their own – another way that Bezos was right, IMO.

      You may disagree with me – many do – but if we can agree that SAP aspire to be a “Cloud” company we can probably also agree that they are not there yet.

      Sure they have some true cloud-only applications – mostly acquisitions – but these applications run on their own technology stack in their own infrastructure silos. You can’t honestly point to any of them and say they run on a “platform” if we acknowledge that a platform should serve the masses not just one or two important individuals.

      I see MDaaS as a first step towards a true platform for SAP, Partners and Customers to build cloud applications on. I certainly do not see it as the last step – but I fear that unless they can get this initial piece right then nothing beyond it can succeed.

      Markus Ganser points out above that

      “... domain model alignment (DMA) … is the foundation for truly integrated business processes. It is extremely important that applications can rely on a consistent data model that is typically shared between multiple applications in business processes: applications must speak the same business language and the data model must be technically compatible.”

      I couldn’t agree more. And I especially note that Markus has implicitly extended the reach of the domain model beyond master data – something I deliberately left out of my original post in the interest of clarity and space.

      Markus adds…

      “For this reason, we are continually expanding the scope of DMA along key business scenarios.”


      So where do SAP need to go?

      I will shamelessly steal from Jeff Bezos. IMO they need a single “platform” that supports all business scenarios – both SAP’s and others’. All scenarios will need to expose their functions via well-described “service interfaces”. These “service interfaces” need to be the only way one scenario can interact with another. The goal must be that there is no movement of data from one scenario to another.

      That is the only way to break the downward spiral of the “integration challenge”. Anything else just adds to it.

      Thanks again for your comments.
      Graham Robbo

      Vincent Zhang

      Hi Graham,

      Thanks for the long reply. It’s quite informative and also makes me think from other perspectives.

      There is still no correct answer for “Cloud” yet. AWS may have succeeded, but what they did is different from what we (enterprise LoB software developers) are doing. However, the target is always clear: how to lower the TCO for our customers to leverage modern IT.

      I always tell my friends: don’t neglect SQL, it is the most important invention in information technology. The “NoSQL” movement is over; even MongoDB is embracing SQL. What I mean by “SQL” goes beyond “SELECT … FROM”; it actually stands for the relational algebra defined by E. F. Codd. Even if we take a technology-agnostic approach (as you proposed), there are still physics and mathematics ahead. So far as I can see, the service API concept just cannot handle them at a reasonable cost.



      Graham Robinson
      Blog Post Author

      Hi Vince,

      You seem to really like SQL and to dislike service APIs because they can't do something that SQL can.

      But if a service API simply provides a new interface to a SQL query where do you stand? LOL


      Graham Robbo