hardyp180
Active Contributor
In the week commencing 23/09/2019 there was SAP TechEd Las Vegas. You have probably already read half a dozen blogs about this, but I thought I would add my ten cents worth. Well in fact this is going to be an enormously long blog, but I do not see the difference between posting one huge blog, and posting five blogs 20% of the length of the long one, fifteen seconds apart, which is another common way of doing things.


Random Photograph of TechEd Show Floor


Day One

SAP Executive Technology Briefing

There is usually some sort of event before the “real” start of TechEd Las Vegas on the Tuesday, as well as the ASUG pre-conference sessions (which cost extra money). Two years ago it was all to do with the Data Hub. This time I went to the “SAP Executive Technology Briefing” which as the title suggests was aimed at people a bit higher up the tree than me.

It would be unfair to call the whole thing an advert, but it was sponsored by Intel, AWS and Red Hat who all had sessions promoting their wares, interspersed with SAP sessions.

It started off with an opening address by SAP CTO Jürgen Mueller, saying a little about what he was going to be talking about in his keynote tomorrow, and how that fitted in with the subject of the day which was “The Intelligent Enterprise is Real”. I think that was actually a very sensible subject to address because a lot of people do not believe it is real, or cannot grasp what SAP means by “the intelligent enterprise” in the first place.

I could, if I were in a cynical mood, re-phrase the message to “Get on S/4 HANA in the Cloud. Now! Now! NOW! If you do not the penalty is DEATH!”

The format was half hour presentations in a sort of layer cake approach with SAP sessions surrounding the sponsor sessions.

I liked the “customer success stories” presentation. The presenter said it was important to hide the fact your presentation was boring by inserting a picture of your young child or an animal. He put in a picture of his puppy. He said it almost died by eating too many M&M’s when they were spilt on the floor, and he took it to the emergency vet which was, ironically, owned by Mars (the company, not the planet). So the irony was the Mars company got paid twice, once for the “poison” and once for curing the effects of the poison. This is the sort of innovative game changing business model that often gets talked about at these conferences.

Uber got a mention – of course – and the whole audience erupted in shouts of “YES MICHAEL!” as is the rule. This time the reason that Uber was disruptive and changed the market was that – as opposed to NYC yellow cabs – the air conditioning works and there is not spilt coffee everywhere. I will say I was in an Uber in Las Vegas yesterday and the lady driver told me that since Uber started there, six cab companies have gone out of business already and it’s not looking good for the ones that remain.

Then Qualtrics was mentioned – indirectly – there is going to be a lot of this throughout the conference I am sure. The statistics quoted were that 80% of CEOs think they are doing an absolutely stellar job of giving their customers a wonderful experience whenever they buy anything whereas just 8% of customers share this view. This is the “experience gap” that the combination of Qualtrics and SAP is aiming to close i.e. the first one identifies there is a problem, and the exact nature of the problem, and the second one is able to do something about addressing the problem. Put like that it all makes a certain kind of sense. I’ll give you an example. Today the moving company (Conroy’s of Australia) took the contents of my flat out of storage and put them back in my flat. Whilst they were at it they connected up the washing machine and dryer. They did not have to do that, they did not get paid extra, they did it because it was wonderful customer service.

Getting back to the conference, sadly some customer success stories involved being able to get rid of a large number of personnel and still get everything done that was done before. This is called “taking the work out of work” apparently. Also I learned that those people were not sacked or made redundant – they were “re-badged”, which means the exact same thing but just sounds better.

Moving on to the Intel talk, there has been a large debate on the SCN and elsewhere as to how much of the performance improvement when moving to a HANA database is down to the new hardware you have to have, and how much is down to the database itself. Many people claim it is almost totally down to the hardware and naturally Intel are of this view, even if they do not say so explicitly. One thing is clear – with each new generation of chips they are still getting the Moore’s Law type of performance improvements on a HANA database (or presumably anything else using the chips).

Next up was SAP again this time talking about “SAP Embrace” which is an initiative to get customers to move to S/4 HANA in the cloud, using one of the three “hyper-scalers”.

Then came AWS with the longest talk of the day (well two talks really, with two presenters, one about the theory, one doing a demo).

Say what you like about AWS they are clearly a runaway success story financially. The presenters had a slide which first showed how many companies moved their SAP onto the AWS Cloud in 2016. There were about five. Then in 2017 there were about fifteen. Then it tripled again in 2018, and thus far in 2019 the number has exploded out of all proportion. Some of these are very big companies like BP in the UK, and Qantas (the airline) in Australia.

The demo was all about the IoT: we saw a small conveyor belt hooked up to an SAP system get broken, and a service order was automatically generated. The point was that AWS has its own set of IoT services, and so does SAP (Leonardo), and the two can talk to each other, though I presume you would want to use one or the other, not both at once.

Day Two

On the next two days I did not go to any lectures. I spent most of the time in SAP Mentor meetings with assorted top SAP dogs who were asking for our feedback on assorted new things that are or are going to be coming out. I cannot talk about any of that as it is secret squirrel.

I did make sure I went to one “Hands On Workshop” every day as they are great – the best way to understand a new or improved product is to actually go through exercises using it.

Naturally visiting the show floor was vitally important as well – that is always really good, there is just so much going on, and this year they had puppies as well.

As a tip for young players, even though the official drink-up, oh I am sorry I mean “networking reception” does not start till 6:30, a lot of booths start dishing out alcohol well before that e.g. one company had Bourbon tasting at 4PM – but I could not make that – I was in a hands on workshop!

Hands On Workshop – Git Enabled CTS

You might not think a two hour workshop about moving transport requests resulting from your custom code from development to production would be that wildly exciting, especially if – like me – you have been doing just that, all day every day, for 20 years.

However the future of the ABAP Transport System is somewhat radically different than the situation today. I also had no idea SAP was this far advanced with the new concept, I thought it was still on the drawing board and then suddenly there I was using it in action (albeit in a demo).

The application we were transporting was the ever popular “Hello World” application. We used an ABAP in the Cloud system, and naturally Eclipse as the IDE. We had a unit test as well – which tested if the output was “Hello World”. The existence of such a test turns out to be vitally important to the future of the universe, or at least this exercise.
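For the curious, the kind of thing we built was roughly along these lines – the class and method names here are my own invention, not the exact ones from the exercise, so treat it as a sketch:

CLASS zcl_hello_world DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    METHODS get_greeting RETURNING VALUE(rv_text) TYPE string.
ENDCLASS.

CLASS zcl_hello_world IMPLEMENTATION.
  METHOD get_greeting.
    rv_text = 'Hello World'.  "change this to 'Hello World2' and the unit test below fails
  ENDMETHOD.
ENDCLASS.

CLASS ltc_hello_world DEFINITION FOR TESTING DURATION SHORT RISK LEVEL HARMLESS.
  PRIVATE SECTION.
    METHODS greeting_is_hello_world FOR TESTING.
ENDCLASS.

CLASS ltc_hello_world IMPLEMENTATION.
  METHOD greeting_is_hello_world.
    "this is the test the pipeline later relies upon to reject the broken transport
    cl_abap_unit_assert=>assert_equals(
      act = NEW zcl_hello_world( )->get_greeting( )
      exp = 'Hello World' ).
  ENDMETHOD.
ENDCLASS.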

abapGit was also heavily used, as might be expected.

So we created the application, transported it to production, made another change in development that broke it (it now said “Hello World2”) and attempted to transport the broken version, and then fixed it again.

Nothing unusual there but there are several vital differences:-

The CTS system looks the same to the developer – there are still transport requests created when you change something, and you still release the transport request.

However the two transport files we know and love are no longer there. Instead the details of the changed object are stored in a GIT repository.

When the transport moves to quality or production “Jenkins” is used (in the same way it is used in every other language) and if the “pipeline” is configured in a certain way then a failing unit test, or indeed a syntax error, will cause the transport to be automatically backed out of (say) quality and the last version that actually worked re-instated. That has been something ABAP people have been asking for since the year dot.

The other vital difference is that you can have “branches” like other languages have had for years.

That way you can keep your “master” branch pristine i.e. code you know for a fact works, and do your new potentially dodgy changes in a “feature” branch. This is where abapGit comes into play – it gives you the same ability that other languages have always had. Roll back to the “master” branch where everything works, make sure production is OK, and then recall your feature branch in development – you have not lost any of your work. Then only when that is in production and you are sure it works – unit tests plus integration tests plus “real end user using it in production” tests (the latter being the most common type of tests performed in my experience, sometimes the only tests done) – merge the feature branch into the main branch.

One hilarious thing was that at one point everything suddenly stopped working for everybody in the room. That puzzled everybody, not least the instructors, and it turned out what had happened was that the time in Germany (where the system we were using lived) had just gone past midnight, and so all the transports we had created were suddenly created yesterday and thus vanished from our screens showing today’s transports.

Day Three

Another day of Mentor meetings for me, with an added Hands On Workshop.

On the show floor, Intel had a really great magician. He asked me to think of a card in my mind and then pulled that card out of the pack. That bit was not that surprising, the really surprising thing is that when he pulled the card out of the pack he was wearing my watch.

The other really good trick was that he got two guys to stand about six feet away from each other, facing each other with their eyes closed. He then hit one of the guys on the nose with his own SAP TechEd lanyard and asked “what did I just do?” The guy who actually had been hit said nothing while the other guy said “You hit me on the nose!”

Hands On Workshop – Extending S/4HANA with ABAP in the Cloud

In a very good way this was carrying on from the HOW from the day before in that the ABAP in the Cloud was still being used, and the GIT enabled CTS and so forth, but this time we were going to do more than “Hello World” and influence/extend the behaviour of a standard S/4HANA application.

As I recall the extension was that we were adding a “Z” value help to a standard SAP SFLIGHT type application and the extra functionality was being coded in ABAP in the Cloud.

I always moan and groan and complain about everything, and this is going to be no exception, so I was puzzled why in the exercise we were extending an on-premise S/4HANA system as opposed to a cloud one. You can extend S/4HANA on premise systems directly with Z things, the same as you could in an ECC system. I imagine SAP would argue you are supposed to extend on premise and cloud systems the same way as that way you get to pay two separate licence fees, oh no that is wrong, I mean that way everything is consistent.

The exact process you use is a bit too complicated to go into here – I will probably do a separate blog on the subject – but it is not the end of the world, and if it can be explained in a two hour hands on workshop to someone who is totally unfamiliar with the process, then that bodes well for the future of ABAP programming.

Day Four

All the Mentor meetings were over, so I started going to see some lectures.

ABAP Roadmap

This was massively over-subscribed – 209 people signed up – so it had to be moved from a lecture room to a theatre on the show floor. The basic strategy for ABAP has not changed since last year – one update a year for on-premise (the latest being 1909, released last week) and one a quarter for the cloud version. SAP keep saying there is a single code base but that will not really be true until such time as the ABAP RESTful programming model is enabled in on-premise S/4HANA systems.

What is new is that in the “behaviour definition” you can say what fields are mandatory and what fields are read only. I also notice validations have now been enabled as well as actions and derivations so the RAP will now have parity with BOPF. The major difference between BOPF and the RAP is that BOPF was pretty much form driven e.g. you choose from a drop down list when the validation fired. In the RAP you say where the validation fires in the actual code e.g. “VALIDATION xyz ON SAVE”.
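To make that concrete, a behaviour definition with those new bits in it looks roughly like the below. The monster names (ZI_Monster, ZMONSTER, checkHats and friends) are my own invention and the syntax is as I remember it from the session, so take it as a sketch rather than gospel:

managed implementation in class zbp_i_monster unique;

define behavior for ZI_Monster alias Monster
persistent table zmonster
lock master
{
  create;
  update;
  delete;

  field ( readonly )  MonsterNumber;   // what used to be SCREEN-INPUT = 0
  field ( mandatory ) MonsterName;     // what used to be SCREEN-REQUIRED = 1

  validation checkHats on save { create; update; }   // the "VALIDATION xyz ON SAVE" bit
}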

No one is more pedantic than I. When I see an example error message come up as “20190827 is before 20190828 for booking 000000123” I want to weep. It is so easy to forget there are actual real humans looking at these error messages and humans don’t tend to think like computers i.e. backward dates, strings of leading zeroes and so on.

Hands On Workshop – RAP

I went into this about half an hour after the ABAP Roadmap and as might be imagined the exact same example that Karl Kessler had to gloss over during the last session in five minutes was now presented as a two hour exercise where you have to program it yourself (albeit with pre-defined example code you can cut and paste into your program).

This time last year the RAP only had the “unmanaged” scenario where you had to code the CRUD operations yourself. Now the “managed” scenario is ready where basic things like that are done for you, leaving you to focus on application specific logic like actions (user commands), derivations (e.g. working out monster hat size based on monster head size) and validations (e.g. you cannot save a monster master record if it has more hats than heads).

All these things – as I mentioned above – are now built right into the ABAP language.

If you do want to do some CRUD operations in your own Z code this is where the EML comes into the equation. EML stands for “Emergency Medical Hologram” and says to you “Please state the nature of the medical emergency”. No – hang on – that’s not right. Oh yes – EML actually stands for “Entity Manipulation Language” where you can code CRUD operations on an entity (like a sales order or monster) either from the buffer (the state of the entity in memory) or the database.

If I do want to actually update the database I first use EML to say what changes I want and then say COMMIT ENTITIES as opposed to COMMIT WORK.
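A minimal sketch of what that looks like (monster example again, all the entity and field names are mine rather than anything official):

MODIFY ENTITIES OF zi_monster
  ENTITY Monster
    UPDATE FIELDS ( HatSize )
    WITH VALUE #( ( MonsterNumber = '0000000042'
                    HatSize       = 'XL' ) )
  FAILED   DATA(failed)
  REPORTED DATA(reported).

COMMIT ENTITIES.   "instead of COMMIT WORK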

Next comes a term you will find confusing at first namely “feature control”.

However if I said to you the following you would most likely understand what I was on about:-

LOOP AT SCREEN.
  IF SCREEN-NAME = 'MONSTER' AND MONSTER_NUMBER IS NOT INITIAL.
    SCREEN-INPUT = 0.
    MODIFY SCREEN.
  ENDIF.
ENDLOOP.

You will be happy to know that is what “feature control” means. Naturally the RAP has no idea what sort of UI is going to display the entity, so unlike DYNPRO this sort of logic lives in the model as opposed to the view.

I have been doing this for some time now, even in the SAP GUI, using an MVC pattern where the view (DYNPRO) is in a function module and has no logic of its own at all, it just does what the model and maybe the controller tell it to do.

The example given was that if you have external numbering (the user can choose the primary key of a data record) once the record has been saved you can no longer change that field. That sounds obvious, but we have to code that sort of thing in a DYNPRO application, and the same applies here.

At time of writing the syntax to do this looks very complicated compared to the old-fashioned approach, but as ever this is all new and I am sure it will get easier.
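For what it is worth, my understanding of the shape of that code is as follows: in the behaviour definition the field is marked as having dynamic features, something like “field ( features : instance ) MonsterNumber;”, and then a method in the behaviour implementation class works out the value at runtime. The below is a sketch from memory using my invented monster names (and the %tky style syntax of the most recent releases), so do not quote me on the details:

METHOD get_instance_features.
  "read the current state of the instances being asked about
  READ ENTITIES OF zi_monster IN LOCAL MODE
    ENTITY Monster
      FIELDS ( MonsterNumber ) WITH CORRESPONDING #( keys )
    RESULT DATA(monsters).

  "once a monster has been saved with a number, that field becomes read only
  result = VALUE #( FOR monster IN monsters
                    ( %tky                 = monster-%tky
                      %field-MonsterNumber = COND #(
                        WHEN monster-MonsterNumber IS INITIAL
                        THEN if_abap_behv=>fc-f-unrestricted
                        ELSE if_abap_behv=>fc-f-read_only ) ) ).
ENDMETHOD.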

Here comes a non sequitur - If you Google “ABAP Gore” you will get a link to a SCN question and answer page where people contribute some truly horrendous examples of real ABAP code they have seen, usually from customers but occasionally from standard SAP.

One side discussion I started was the wide variety of “Boolean” domains people have created over the years due to the fact that ABAP is pretty much the only programming language with no native Boolean data type. To be fair when I started programming in BASIC the statement ( 2 = 2 ) + ( 1 = 3 ) resolved to “1” as TRUE was 1 and FALSE was zero.

Anyway there are dozens of Boolean types to choose from in standard ABAP many with three values to cater for the Schrodinger’s Cat situation which I am convinced does not occur that often in real business scenarios e.g. I do not usually hear people saying “that material may have been flagged for deletion, or it may not, we will only know when we call up MM03 and have a look, at that time the value in the database will magically change from “undefined” to TRUE or FALSE”. Possibly that is how quantum computers are going to work. I have not found – yet – in standard SAP a Boolean domain with four values or more. Naturally as the dozens of variations in standard SAP are not enough people make their own ones up as well for their own organisations.

Anyway I like “X” as true, but you also get “Y” and “J” (as in Ja) and all sorts of other values (including SPACE) meaning TRUE.
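For the record, the nearest thing modern ABAP has to an official Boolean is the ABAP_BOOL type with the ABAP_TRUE and ABAP_FALSE constants (“X” and space respectively), plus XSDBOOL to produce one from a logical expression. A tiny sketch, with the monster variables being my own invention:

DATA(number_of_hats)  = 3.
DATA(number_of_heads) = 1.

DATA(has_too_many_hats) = xsdbool( number_of_hats > number_of_heads ).  "type ABAP_BOOL, value 'X'

IF has_too_many_hats = abap_true.   "ABAP_TRUE is just 'X' under the hood
  "refuse to save the monster
ENDIF.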

In the EML of the RAP when you want to update an entity, just like a BAPI you have to say what fields you actually want to change (as your change structure might have 900 fields, and you want to change only two and so populate only those two fields in the structure and you do not want the other 898 getting blanked out). In structures like BAPE_VBAPX you set TRUE as X for the fields you want to modify. In the EML you also set the fields to TRUE using the constant IF_ABAP_BEHV=>MK-ON.
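In EML terms that looks roughly like this (monster names are mine again), which is the moral equivalent of filling in BAPE_VBAPX:

MODIFY ENTITIES OF zi_monster
  ENTITY Monster
    UPDATE FROM VALUE #(
      ( MonsterNumber    = '0000000042'
        HatSize          = 'XL'
        %control-HatSize = if_abap_behv=>mk-on ) ).  "only HatSize gets changed, nothing else is blanked out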

I drilled down on the data definition (remember this is done via F3 in Eclipse) expecting to find an X, but alas alack the values are 01 for TRUE and 00 for FALSE.

An example of abapGit Workflow

Due to the nature of the world you are always going to get presentations from vendor companies who want to sell you something. Naturally SAP itself falls into that category. The art is not to make it look like an advert: concentrate on technological aspects, tell the audience something they do not know that will help them in their day to day work and then casually throw in, as an aside, that you have (sell) a product which might help just in case you run into trouble doing it on your own.

This presentation (which was really good) was by software company BASIS technologies, and their product which they skirted around was a tool to automate the workflow of the CTS system. I have nothing against such tools, at my company we use one called RevTrac which is excellent, I have also seen some ropey ones, and the standard CHARM from SAP is nice and free but is pretty difficult to set up and not as configurable as one might like.

Anyway it had occurred to me that all these products might be in trouble in a world where one uses GIT as opposed to CTS files, and “Jenkins” seems to manage the “pipeline” that moves changes from place to place. From what I saw I think vendors of those products have nothing to worry about mainly due to the fact that you still have transport requests and you still release them, so that aspect has not changed and still needs to be managed.

In any event any presentation is a success if at one point a light bulb goes on in my head and I think “of course – that is the obvious solution to problem XYZ. Why did I not think of that? I must be an idiot”.

I had such a moment in this presentation. In this case, for the last year I have been wandering around various SAP Inside Track Events all over Europe giving speeches about the future of ABAP Programming. My slides always culminated with the following:-


Future of ABAP Programming Slide


On the left you see various ABAP programmers all developing on their local machines, possibly all working on the same object(s) just like programmers in other languages have always done, and then merging their changes into the central development system when they are happy bunnies. You see the pipeline there with the static and unit tests and any other type of test one wants to configure, and GIT doing the version management and handling “branching” and so on.

The conceptual problem I had was how are you going to get every developer their own full blown development system? The early versions of ABAP in the Cloud had a footprint the size of Mount Everest which is probably why only now is a trial version available.

The obvious – in retrospect – answer is to use “containers” a la “Docker” so you can create isolated copies of the full development system, one for each developer. I am sure it is not as simple as all that but I get the general idea.

You could also have development clients with as much configuration, master data and so on as you need for basic testing. Unit testing can work without this, but I am more comfortable doing some basic tests manually in the DEV system as well. Some developers currently do not do either, just think “that will probably work” and hurl it into QA, but let us not go there.

Just as an aside, all development is done in local packages in this scenario, the changes are pushed into GITHUB (or equivalent) via abapGit and the assignment to a real package (and hence transport request creation) is done at the time the changes are put into the central development system.

By this point you can see a recurring theme in a lot of the sessions and workshops I attended.

Transforming ABAP Code for S/4HANA

I gave a talk on this very subject at SAP TechEd Las Vegas 2017. I have also had to give presentations at work on this subject - naturally using the results of our own custom code S/4HANA readiness analysis. I am going to be talking about this again in South Africa in March 2020 (hopefully).

As such I always like to go to at least one such talk at every TechEd in case I am missing out on something obvious in my own talks on the subject.

In this case I came away relieved. This was a well formulated presentation which covered the subject well.

The main change I could see is that (as expected) with every release of S/4HANA and the associated ADT (ABAP in Eclipse) the number of automatic corrections which can be done in ADT to auto-correct certain “errors” (changes needed to get the code to compile or work properly) increases. One I had not seen before was automatically changing variables typed as VBTYP to data element VBTYPL (the field is one character in ECC, four characters in S/4HANA and that might cause some function module interfaces to get the heebie-jeebies).

Leaving this presentation aside and going back to my own experiences I have found the following with the custom code analysis that says what you need to change:-

  • The number of items that come up as needing to be looked at is so large it makes you want to jump out of the window.

  • It is generally the same type of errors again and again e.g. sort the errors by OSS note and count the notes and suddenly 50,000 problems go down to 72 types of problems.

  • A lot of those OSS Notes have PDF “cookbooks” attached, all in totally different formats, but containing a lot of useful information as to what you are supposed to do in situations where (say) you are reading directly from VBUK and VBUK is no longer a table but rather a view.

  • Some of those problems are red herrings e.g. it would be a problem only if you were not going to the latest release e.g. SAP did a “Hokey Kokey” with field VBFA-STUFE in that it was in, out, in, out, shake it all about, with various releases of S/4HANA. When the music stopped it was still in the table.

  • You will get a zillion entries related to the MATNR field increasing from 18 to 40 characters. Some other fields also increase e.g. all the currency fields are currently different lengths in ECC and they all get expanded to the highest length (23) in S/4HANA. So if you always type variables using data elements then you will be fine (see the sketch after this list). Naturally everyone does that. Or do they?

  • The warnings from the extended program check also come up and get counted into the total e.g. “value may not be unique” when you do not specify the full primary key in a SELECT. So if you are not doing it already building the extended program check / code inspector into your peer review process would be a really good idea.

  • 75% of custom code is never used, so you need a usage analysis from production (spanning an entire year) to see what you can just totally exclude from the adaption process.

  • If you see a DDIC element in the standard ECC system starting with a “/” e.g. “/IS-JMS/MATNR_RANGE_TABLE” and you think that is just what you want, do not use it. Create a Z version instead. A lot of the industry solutions (in this example “IS-JMS” stands for “Industry Solution – Japanese Mice Racing”) vanish into thin air in S/4HANA.
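To illustrate the MATNR point a few bullets up, this is the sort of thing the analysis is worrying about – a minimal sketch of my own:

* Typed against the data element: follows the DDIC, so it grows from 18 to 40
* characters automatically when you move to S/4HANA.
DATA lv_material_ok  TYPE matnr.

* Hard-coded length: stays at 18 characters and will truncate 40 character
* material numbers after the conversion.
DATA lv_material_bad TYPE c LENGTH 18.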


Presentation I will not mention

If you cannot say something good about someone do not say anything. So I will not say what presentation I am talking about. I am also trying to give constructive criticism i.e. do not do this, here is why it is bad, do this instead for a better result.

There was nothing wrong with the content, and to be fair, the original presenter had dropped out and someone had to take their place at the last minute, but I am going to have to put my Toastmasters hat on here.

In a presentation you want the audience to listen to what you are saying. That might sound obvious – after all, that is what you are standing up there for, is it not? All you need to do is say something and people will listen to it, will they not? After all, they turned up.

If you want to give a 45 minute presentation and have the audience ignore every single word you say then here is how to do it:-

  • Pack your slides with as many words as will fit on the screen, twenty bullet points per slide.

  • Read out the slides, word for word.


After the first 20-30 seconds the audience will realise what you are doing, and then read the slides themselves, totally ignoring you. In fact after that point – provided you keep speaking in a monotone – you could recite “three blind mice” repeatedly or indeed say rude things about the audience members one by one and no-one would ever notice. They would just presume you kept reading out the slides.

Just to make matters worse if someone in the audience reads the slides there is a 99% chance they will not remember any of what they read. Why would that be? Possibly because 75% - 90% of communication is non-verbal so if you are actually looking at the presenter as opposed to the slides you might pay more attention and remember more.

Toastmasters have been studying this sort of thing for decades – the exact advice varies but the idea is to have as few slides as possible, put as few words on each one as possible, and only try to make three points – all related – in the entire presentation. You might think that is impossible in a technical presentation.

If you look on the internet there are a million sets of – free – advice on how to do this (they all agree on the basics as outlined above).

I once saw in Australia the CIO of Fortescue Metals (incredibly successful mining company) give an excellent technical presentation – the first 5 – 10 minutes he let his slides run with music playing while he sat down and said nothing. Then there were no more slides, and he stood up and gave the rest of his talk for the last 35-40 minutes.

That is one way to do it. Here is how I do it. I am not saying I am right, or any way is right, this is just an example.

It has been said a good slide deck is useless without its presenter. If you just had a set of my slides without the notes you would have no idea what was going on.

This is because I just have a title and then three or four odd pictures which are reminders for me as opposed to the audience. After 0.5 seconds the audience have run out of things to read on the slide and then they wonder what the pictures are all about and how they relate to the slide title or indeed the subject matter of the presentation and they have to listen to me to find out.

There is always a connection but it is only obvious once I point it out. After a while they try and work it out themselves each time, and have to listen to me to see if they are right, making it into a sort of game.

As I said when it comes to Toastmasters I am still a rank amateur – but I would encourage anyone who wants to speak at TechEd – or any SAP event – or any event at all – or to your CEO in a lift trying to convince them of something in 30 seconds – to go to a Toastmasters meeting (they are everywhere in the world) and see what it is all about.

Day Five

On Friday in some senses the event is already over e.g. the show floor is closed, but the morning is still full of lectures and hands on workshops and so on. I always go, it seems strange to miss out after having made the effort to come all the way here in the first place.

I always feel sorry for the speakers who have to speak on the Friday, as half the delegates fly out on Friday morning, and another huge chunk are too hung over from the party on the night before to be able to get out of bed for the final morning.

Paving the way to Fiori UX in S/4HANA

I have said before that whilst SAP Marketing claim that S/4HANA is the most successful product in its history the adoption rate by customers still has not been as great as SAP would have liked, especially the cloud version. Likewise, in another body blow, when a customer does decide to make the jump they often stick with using the SAP GUI instead of using Fiori.

SAP’s official position is that if you cling onto the SAP GUI for dear life – which is more than possible with the on premise version – then you will not benefit from any future UX enhancements. That statement could be described as a bit “woolly” (my opinion) so this presentation tried to be a bit more precise.

The other thing SAP does not advise people to do is to look through the huge list of Fiori apps and pick some that look good to them and implement them only. That does not work very well as Fiori apps are “not designed to be used individually”. What does that mean? The apps have navigation to each other and if you implement the source but not the target and the user clicks on the navigation part everything falls apart.

Warnings over, it was time to clear up some myths.

First of all Fiori is not a technology of any sort – it is a “design system” what I would call a consistent look and feel – and is technology agnostic. As such it can be used with any sort of JavaScript variant not just the UI5 library, and indeed in the latest SAP GUI and Web Dynpro. I would imagine it would be quite difficult to implement with WRITE statements but you could have a go. After all, a lot of people are still using them.

The next myth is that SAP is trying to do a one to one replacement of SAP GUI transactions with Fiori Apps. The two have a very different purpose and are aimed at different audiences. Transactions like VA01 are aimed at a variety of roles and thus have ten billion different functions. A Fiori app is aimed at a single role and usually only has one to three functions.

Thus a Fiori app is a subset of the functionality of one or more SAP GUI transactions (they are decomposed or recomposed) and some transactions just cannot be converted and thus stay in the SAP GUI.

Thus, the Fiori Launchpad has both options – you get tiles pointing to both, and in the SAP GUI ones the UI theme has been unified such that it follows the Fiori pattern. Transaction VA02 was demonstrated as an example and it looked horrible to me, maybe possibly better (if you squint a bit) than the SAP GUI version that people have moaned about for the last 20 years, definitely better than Web Dynpro, however certainly not “delightful”, but then beauty is in the eye of the beholder.

If you have an on premise system you can use the SAP Business Client and then you get the exact same behaviour except you have SAP GUI for Windows as opposed to SAP GUI for HTML. That did look a bit better to me.

Next it was time to see what was new in the UI zoo. If SAP say do not pick and choose Fiori apps then what do they want you to do?

Here comes the concept of “Rapid Activation”. This seems to be a GUI transaction which looks somewhat like transaction SATC, or indeed the LSMW i.e. a big tree of options you work through.

As we know the state of technology in recent years has exploded exponentially – the race to planet Mars is on, we have quantum computers, artificial intelligence, devices that can read your thoughts, and the Large Hadron Collider will soon open the gateways to faster than light travel and alternate planes of reality.

Not to be outdone the software industry has made similar advances. The latest one is called “cut and paste”. So the first step in the rapid activation is to go onto a SAP website and find the technical names of the roles you are interested in, and download them to a spreadsheet. Then cut and paste them into a screen within the first step of the rapid activation transaction in the SAP GUI.

As you work through the steps in the Direct Implementation Rapid Transaction Yutility / Rapid Activation Transaction (or DIRTY RAT) you then copy those standard roles to Z ones, as sure as eggs are eggs you are going to change them e.g. remove transactions you do not want that role to have access to, or add in custom ones you have created yourself. In many companies the custom transactions in use vastly outnumber the standard ones.

At this point though you have not changed anything yet. You press a button and the relevant OData services are activated for the Fiori apps chosen, and because these are all standard SAP roles at the moment there are no inconsistencies – that is, no apps try to navigate to a non-active one. The process also creates a test user with the correct authorisations to try out the various applications that role can use.

Then you use that test user to have a play with the Fiori Launchpad generated for that role, and decide what you like and do not, and decide to remove and add things. There is an app for that, as the saying goes. This time it is the “Launchpad Content Manager” which helps you remove and add things to your new Z role(s). Presumably it is clever enough not to let you delete one of five apps which all depend upon each other.

From the demo it looks like the process is still very manual – there are a lot of steps involved even if you have a “wizard” guiding you, but no doubt like the extension factory this is a work in progress and will get more and more streamlined as time goes on. Remember this is TechEd – you are generally seeing version 0.1 of whatever is being demonstrated. Think about what your own applications look like in their initial form!

Nonetheless even in Beta form it is clear this approach makes a lot of sense, a lot better than casting around switching on Fiori apps at random and wondering why none of them work. I also like the admission that Fiori apps are going to be co-existing with SAP GUI transactions for a long time to come, at least in on premise world.

SAP Conversational UI

A statistic quoted was that in 2018 no less than 67% of people (who use computers) interacted with a chatbot of some sort. I am actually surprised it is not more. This morning for example I pressed the “chat” button on the website of my Australian phone provider, wanting to complain my new modem did not work, and got a chatbot as opposed to a human. It asked how it could help and I said “I want to talk to a human” and it replied “I can answer all sorts of questions” and I replied “Human. Now” and it was able to decode that and put me through to a real person in India. So it could be said that 67% of people deal with chatbots every day, but how many of those WANT to be dealing with a chatbot as opposed to a real person?

Nonetheless the potential is clearly there; if they can be made a lot better then that is a fantastic business opportunity which SAP clearly sees. As an example of an improvement opportunity I had already logged into my phone account on the internet and pressed the “chat” button from within that environment. It would be a radical improvement if the embedded chatbot application was somehow able to read the data from the account I am logged into and greet me by name. As it turns out when they turn me over to a human I have to manually tell them my name and account number – I do the latter by cutting and pasting my account number from the top of the screen into the chat box. If you are doing the same sort of thing within an SAP system hopefully that sort of integration problem goes away (though my phone provider company is a big SAP user).

SAP are going gangbusters with this – they aim to have conversational AI in all their Fiori applications in all their cloud products. Moreover in the demos the chatbot is making calls out to Fieldglass, Ariba, Concur, and SuccessFactors and so on, and naturally that is invisible to the human having the conversation.

This is TechEd and the speaker knew the audience well, technical people like to know how to build things, and change existing things. So the question becomes how do you create a new chatbot or radically change the one you get from SAP? Once again, there is an app for that.

SAP has a chatbot building application – and you can try it out for free. As usual there is a sort of wizard to guide you through the process.

You start with one or more “intents” which is a description of what the user is trying to achieve e.g. book a flight, buy a computer, make a new monster, whatever.

Then you move onto “expressions” where you add a list of sentences a user might say to express that intent e.g. “I want to build a green monster” or “A green monster to build want I” or “I want to build a green monster for less than $1000”. Humans can say the same thing in lots of different ways so you need to enter various variations on the theme to “train” the chatbot to recognise what the user is on about.

Here comes the clever part – for each sentence you enter to be stored the SAP application automatically breaks it down into elements a computer can better understand – verbs, nouns, organisational elements, money, date and time, and so on.

Next comes defining a “skill” which is what the chatbot does with the information the user is typing into the box. You need to define the pre-requisites e.g. what does the user need to say before the chatbot does something, and the mapping of the decomposed sentence elements to an API e.g. date goes to this field, customer goes to that field, money amount goes here and so on.

You can add some predefined SAP skills as well, such as the ability to make small talk or to tell jokes (honestly) or the ability to detect insults. The latter comes in incredibly handy I would imagine.

Then you define the action which is a call to a URL passing in all the API fields and getting a response back to tell the user if whatever the request was worked or not e.g. “new monster created”.

The next part of the wizard defines how the user connects with the chatbot in the first place – options include via Slack, via Messenger, half a hundred other common applications, and of course a standalone URL from within your own custom application which hopefully can pass in more details than my phone provider can manage.

Lastly once the “bot” is live, you need monitoring to work out just how frustrated the users are getting i.e. what percentage of sentences a user types in are getting met with a logical response of some sort. The idea is to look at the user requests that could not be understood and “train” the chatbot to know how to handle them in the future, so the bot gets better with time. It looks like this is totally manual at the moment, but presumably some sort of ML/AI is going to get applied here at some stage to try and automate the continuous improvement.

I had to laugh when I saw the term “Business Connector” on one of the slides. I cannot remember what that is in this context, but to me it will always be the precursor of the XI middleware.

As I mentioned earlier everything demonstrated at TechEd is cutting edge and the presenter did note that “SAP Integration is only just out” by which I presume they mean the part of the demo where S/4HANA talks to Concur and the other systems. However as Oscar Wilde would say “the only thing worse than SAP integration being just out, is SAP integration being not out”.

Lastly there were some fancy tools which I am not sure are actually there yet, but will be in the near future.

One is to generate a chatbot from the OData metadata of an “entity” e.g. if my entity is a monster then a chatbot could be generated where you could ask it to do CRUD operations on the monster, plus any custom “actions” (user commands) defined in the OData model. I would be interested to see that action in action.

Less dramatic, but very useful, is the “FAQ Generator” where you upload your list of frequently asked questions and their answers to the “bot” definition. Then you need to once again think of different ways a user could ask the question so several possible query sentences point to one FAQ answer.

Cloud Extension Factory

This was the very last talk I attended in the entire event, taking me up to 11:30 AM when the whole thing shut down … and it was also the best one!

You are no doubt aware of the seemingly conflicting messages coming from SAP – on the one hand do everything standard and do not have any Z type custom code, on the other hand there are a huge bunch of tools for creating Z extensions to S/4HANA in the cloud, and indeed all the other SAP products like Success Factors.

You square this circle by having a standard code base that cannot be modified, but you can talk to that standard code base with your own custom code via a well-defined API.

Imagine if you will, a 100% vanilla SAP ECC system (if there ever was such a thing). A partner application could create a sales order in such a system by using an RFC connection to send an IDOC which would then create the order when it was processed inside SAP. No change to the standard SAP system was required.

I have seen many presentations, both inside SAP and at public events on the Cloud Extension Factory, the Cloud Application Programming Model (CAP), the SAP SDK, and of course the RESTful ABAP Programming Model (RAP). All of those are intended to help you build a custom (henceforth “Z”) application which can extend the functionality of an unchangeable cloud SAP application like S/4HANA or Success Factors. Put another way changing what the code can do without changing the code i.e. the “Open/Closed” Principle.

The difficult thing was seeing how they all fit together. Actually “difficult” was far too mild a term. It was virtually impossible. In January 2019 after seeing presentations on all of them I got the impression they were all different ways of doing the exact same thing. This is not in fact the case, they all work together but it took me a LONG time, and help from various SAP experts before the light bulb finally came on. Even then I would struggle to explain the “big picture” to my fellow human beings.

There have been some excellent blogs trying to explain the subject but this presentation finally “knocked the ball out of the park” when it came to explaining how all the seemingly disparate things were all parts of the same whole.

First of all there is a picture with the cloud extension factory as a big box around all the other boxes such as CAP, RAP, SDK and so on. That is supposed to show they are all part of the same thing but in this case a picture is worth zero words, you need a text explanation, or better still a code example, which is what we got in this session.

It was mentioned you do not have to use the CAP or RAP as the programming model, you could use anything you wanted, like React or Angular or Springbox (which I thought was a rugby team) but just not to make matters even worse let us just stick to those two programming models.

The following is not a transcript of what was said but rather my notes which I used to build up a picture in my mind of how all this works. This picture could be totally wrong but it sounds right to me.

First of all we start with a SaaS system such as S/4HANA where you cannot change the code but want to add a Z extension. Let us say I want to build a Z sales order front end and then create the actual sales order in S/4HANA at the end of the process. To do this I need the equivalent of an RFC enabled BAPI, which is what I would call from an external system to ECC currently.

Nowadays the world has moved on and S/4HANA does not have BAPIs, it has APIs, and so do all the other systems like Ariba. So to get my extension to work the absolute first and most vital thing I need is to (a) know what that API is and (b) be able to call it from where my extension code is running.

This is the job of the Cloud Extension Factory – it provides standardised access to the APIs of all the various SAP products (well not all of them yet but eventually it will). Someone once said this is not an extension factory but rather a connection factory. As a starting point for example you say in the SAP Cloud Platform (where you are building the extension) I want to connect to a particular S/4HANA system and a 128 character long UID “token” is generated, which has to have a matching entry created in the target S/4HANA system. That is just the first step.

There used to be a further 64 configuration steps needed before the systems could connect, to use (expose) a particular API. The extension factory brings this down to 3 steps as far as I can see (the other 61 steps are done automatically in the background).

Now I am able to connect I could use a load of Java or JavaScript code to access that API from my extension. Or I could use the Cloud SDK to automatically generate Java or JavaScript classes out of the API massively reducing the code writing effort needed. I get the feeling that ABAP in the Cloud will eventually have such classes as well. Maybe it will use an ABAP Cloud SDK to generate them. That is pure speculation on my part, but it would be a consistent approach between languages.

Next I want to create some Z tables. If I use the RAP as my programming model, so my code is in ABAP in the Cloud, then fine, I can create the Z tables there and wrap them in CDS views. If my code is not in ABAP then I would use the CAP to define “entities” and generate the CDS views and the Z tables at the same time, and turn such entities into a service, which in turn generates an OData API which can be exposed to a Fiori app.

Then came the vital part of the presentation where you saw the example code from such a service.

That code was supposed to represent your “Z” extension and it kept jumping between the CAP and the SDK to connect to your Z entities on the one hand and SDK classes to connect to S/4HANA and Success Factors (in the example) on the other. Such access becomes boiler plate code you do not have to think about, a line or two of code in every case, leaving the vast bulk of the extension code to worry about the business logic. And it occurred to me that because all the accesses were done by classes they could all be mocked, and thus you can easily unit test the business logic.

So in my sales order example I could have some custom Z tables (entities) that get updated when a sales order is created, and when the user presses SAVE I do some Z validations and derivations, and if all is well update both the “Z” tables and create the real sales order in S/4HANA and maybe C/4HANA as well if I feel like it.

The caveat at the end of the presentation is that the Cloud Extension Factory (i.e. the initial connection tool) is still a work in progress – setting things up to react to an event coming from an SAP system still takes a lot of configuration setup for example – but things will get better over the coming years.

Just as some wild speculation on my part, if we ever get to the stage where we have an ABAP SDK to generate wrapper classes for Success Factors and Concur APIs and the like, you could have the crazy situation where you write an extension to Ariba using the RAP in ABAP in the Cloud. That would turn the whole “ABAP is Dead” thing on its head once and for all.

The End

When something is really good it is always quite depressing when it ends, especially as TechEd Las Vegas ends with “not a bang, but a whimper” when the few that remain for the Friday come out of the last session and see all the signs being taken down.

Next time I am going to send out a generic invite to anyone remaining to join me for a celebration lunch, after the final sessions are over, just so we can end the event on a high note. Even if one person turns up it would be good.

Just as a final aside, this year the pictures of people dotted around the event were of “SAP Champions”. These are people who promote SAP on the SCN or at SAP Inside Track events. I have been to loads of SIT events this year and so was lucky enough to have met a lot of the people pictured, and it was good to put a face to the name of various people whose blogs I had read but whom I had never met in person.


SAP Champions – I have met most of them!


SAP TechEd 2020

Next year the USA TechEd will not be in Las Vegas, but rather Austin, Texas and so the new rule would have to be when the speaker says the words “Paradigm Shift” the whole audience have to do the “Boot Scootin’ Boogie”.

Cheersy Cheers

Paul