
Category Archives: Uncategorized

When a business or an individual makes a conscious decision to pursue a mobile application strategy, a common assumption is that an app needs to exist on as many platforms as possible. In an ideal world, time, money, and human resources would be of no concern and it would be possible to create a native app that leverages the very best features and capabilities of each targeted platform.

In reality, we are all constrained by time, money, and human resources and we are forced to make objective decisions that will maximize our return on investment.

Web Apps

If true platform independence is a primary goal for your app, you may want to consider a mobile website (web app); however, platform independence is not without disadvantages. The HTML5 standard makes it possible to develop fantastic web apps that run on the majority of mobile devices. Unfortunately, web-based apps generally are not able to take full advantage of the features and capabilities of the mobile device itself, such as accelerometers, gyroscopes, GPS units, local data storage and integration, and even user interface interaction.
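To make that gap concrete, here is a minimal TypeScript sketch of what a web app can and cannot reach through standard browser APIs. It only illustrates the point above and assumes a browser environment (the Geolocation, DeviceMotion, and localStorage APIs); it is not tied to any particular framework.

```typescript
// Hypothetical sketch: browser feature detection in a web app (TypeScript).
// Web APIs such as Geolocation and DeviceMotion exist, but support and
// permission behavior vary by browser and device, which is the core
// limitation discussed above.
function describeDeviceCapabilities(): string[] {
  const notes: string[] = [];

  // GPS: available only if the browser exposes the Geolocation API
  // and the user grants permission at runtime.
  if ("geolocation" in navigator) {
    notes.push("Geolocation API available (subject to user permission).");
  } else {
    notes.push("No geolocation access from this browser.");
  }

  // Accelerometer/gyroscope: exposed only indirectly via DeviceMotion events.
  if ("DeviceMotionEvent" in window) {
    notes.push("DeviceMotion events available (raw sensor access is limited).");
  } else {
    notes.push("No motion sensor access from this browser.");
  }

  // Local data storage: limited key/value storage, not full device integration.
  if ("localStorage" in window) {
    notes.push("localStorage available for small amounts of app data.");
  }

  return notes;
}

describeDeviceCapabilities().forEach((note) => console.log(note));
```

Even where these APIs exist, access is mediated by browser permissions and sandboxing, which is part of why native apps retain an advantage for sensor-heavy use cases.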

Technological advances in mobile device CPUs, graphics processors (GPUs), memory, and other components have little to no impact on a mobile website or web app leaving most of the potential virtually untapped. Furthermore, mobile websites may be rendered differently depending on the mobile device or web browser making it difficult to fully control the user experience.

These factors lead to a “lowest common denominator” approach to web app development which compromises the user experience and results in an app with limited functionality generalized to run in as many browsers on as many devices as possible.

Native Apps

Conversely, native apps facilitate the best possible user experience by fully leveraging all of the device capabilities; however, it can be cost prohibitive to custom develop and maintain an app for every major mobile platform.

This forces many clients to make an objective decision about which platforms to target for their native apps. When selecting a platform to support it is important to maximize your reach while minimizing cost to achieve the greatest ROI.

Apple vs Android

There are some significant platform differences between iOS and Android which impact the true reach, the cost of development & maintenance, product quality, and user experience.

For example, Apple Inc.’s proprietary software development kit (SDK) ensures that, with minimal effort, an iOS app will run similarly on all iOS devices, including iPhones, iPads, and iPod Touches.

The open source nature of Android’s operating system allows each mobile device manufacturer to create multiple devices with varying technical specifications, and allows wireless carriers to customize the features of the operating system in order to achieve a competitive advantage. This creates a highly fragmented environment. Several different types of fragmentation should be considered.

Release Fragmentation

When Apple releases a new OS version all supported devices have immediate access to seamlessly update the device to the new version through iTunes. Recent benchmarks show that 50% of iOS devices run the current version of iOS 12 just weeks after the release.

 

With Android, updating the OS isn’t as seamless. The wireless carriers, not Google, have the primary responsibility for providing OS updates to devices. Since many carriers have customized and branded the standard Android release it may take a carrier months to update their custom branded version of Android, if they deem it worthy to do at all.

An unfortunate pattern has emerged in which carriers stop providing OS updates to users of low cost or older device models in an effort to migrate them to a newer device and/or sign a new contract. This renders many users “stuck” at a particular Android version until they purchase a newer device.

With nine major releases of Android this creates additional challenges. Less than 0.1% of Android devices are running the current version 9 (Pie). Less than 20% of the devices run the previous version Oreo (8.0 or 8.1) and roughly 30% of devices run Nougat (7.0 or 7.1). Doing some quick math we can see that you need to go back two years and two major releases to target the largest audience.

From a development perspective this forces us to choose which set of users to target: the majority or the current version. This may dramatically reduce your reach, increase your development and maintenance costs, and/or sacrifice user experience.

Hardware Fragmentation

Apple has designed the iOS software to operate specifically on the devices on which it runs. However, Android’s open source approach allows it to run on many different devices. Different devices mean different CPUs, memory, screen sizes and resolutions, form factors, etc.

These differences combined with the number of different device offerings make it extremely difficult and very costly to properly design, develop, and test apps on all supported devices thus sacrificing user experience and quality.

Distribution Channel Fragmentation

Apps designed to be consumed by the public face multiple distribution channels. Google Play is not the only Android “app store” in town. There are dozens of secondary Android marketplaces to contend with.

Having additional distribution channels further reduces reach and increases costs. All of these markets will have their own submission processes, DRM and licensing schemes, development agreements, support requirements, etc. Managing and submitting app updates to multiple stores will be no easy task for developers.

For users, shopping from multiple stores will require multiple accounts and payment methods. Users may be completely shut out of some stores while left favoring the perceived top two or three stores.

User Experience Fragmentation

With each new Android device, OS version, or marketplace the user experience is further diluted making it more and more difficult to reach as many people as possible and for as low cost as possible. The various versions of the operating system and plethora of hardware devices with differing technical specifications make it a daunting challenge for developers to produce and maintain a quality software product.

The ability of each device manufacturer and wireless carrier to alter the user experience through hardware or software differences results in an inconsistent user experience when using the same app across multiple devices.

Security

Android’s various app distribution channels can be seen as both a blessing and a curse. Dozens of marketplaces with varying degrees of submission security checks increase the risk of malware-embedded applications ending up on the device. That is not to say that Apple hasn’t had a few malicious apps appear, but the volume is dramatically lower on iOS. Furthermore, as Android devices typically run older versions of the operating system, it becomes much easier for hackers to exploit known security vulnerabilities across a larger audience.

We now live in a GDPR world where privacy breaches must be made public and security mishaps carry both direct and indirect costs. Deploying devices that in the near future will be stranded on an older OS version, with no security updates possible, presents a nightmare for IT and GDPR compliance.

Conclusion

Considering all things, including hardware, performance, reliability, security, and adoption rate, Apple iOS is clearly the best mobile device platform for use in the enterprise.

 

At a time when buildings and infrastructure projects are becoming ever larger and more complex, engineering and construction companies need digital technologies to stay competitive – the need for smart buildings and infrastructure is often the basis for great innovations that help the world run better.

So let’s dig up your industry gem and share the value of your innovation with us! SAP has grabbed a shovel and wants you to grab one too. Let’s dig up the hidden digital gems together by sharing your #SAPInnovation story by entering the 2019 SAP Innovation Awards.

The Innovation Award is an opportunity for #TheBestRun customers and partners to showcase how they are driving innovation, making their business more intelligent and utilizing technology to make a difference. This year, EC&O customers and partners using any SAP product or technology to drive innovation are eligible to submit. Winners will be showcased at SAPPHIRE NOW and will benefit from promotional opportunities with SAP. Finally, up to $25,000 in charity donations will be awarded—just for submitting! Take a look at this blog for more details on the program timeline and prizes.

There are chances to win across several categories, one of them being Industry Disruptor. The winner of the Industry Disruptor award will exemplify how the company has used innovation to disrupt the way the industry has traditionally conducted business – for example, establishing new business models, redefining a critical business process, or enabling strategies that break through existing boundaries.

What’s in it for you?

Should you be the winner of the Industry Disruptor award, you will receive:

  • A special recognition for your company and your team
  • Exposure in select external communications and thought leadership channels
  • Exclusive opportunities to showcase your success with key SAP executives at SAPPHIRENOW in Orlando and beyond

Here are examples of the past award winners that are driving business value from unique approaches in their industry:

Tools Needed to Get the Job Done

All submissions except for the Next-Gen Innovator Partner submissions will be eligible for the Industry Disruptor award. Partners that submit on behalf of a customer should ensure they have obtained the customer’s consent to participate. Each entry will be reviewed and scored by a panel of SAP industry judges. To ensure you stand the best chance of claiming the Industry Disruptor award, take note of the judging criteria and ensure your pitch deck highlights the following:

  • Use Case:  How is the entrant using SAP technology / products to add value?
  • Outcome: What business or social outcomes have resulted from the project?
  • Value: What business or social value has been realized?
  • Human Empowerment: How does the project impact the lives of individuals either inside or outside the organization?
  • Intelligent Enterprise: How is the company applying technology and innovative approaches to progress toward an Intelligent Enterprise?

Let’s Share!

We would like to hear your #SAPInnovation story! You can find more information on the Innovation Awards timeline and list of prizes here. For details on the judging criteria and category definitions, visit the Innovation Awards website.

The submission period is open now and all entries are due by February 8, 2019. Get started today and remember to join the #SAPInnovation conversation online.

For more industry-oriented Innovation stories, follow us on Twitter @SAPIndustries and join us on LinkedIn.

Introduction

In traditional BW, modeling of transitive attributes (attributes of attributes) has been quite time-consuming and has also led to duplication of data; see the explanation here.

Transitive attributes – as an out-of-the-box feature – were introduced in the BW Modeling Tools InfoObject editor with BW 7.5 SP4. Here you can simply enable the required transitive attribute; there is no need for specific modeling anymore.

Transitive attributes are supported only by the new object types, so they can be enabled for an Advanced DSO but not within, for example, classic DSOs or InfoCubes. If your CompositeProvider still contains those old object types and you try to activate transitive navigation attributes, activation will fail.

Possible workarounds:

  • Adjust your data model as in traditional BW
  • Migrate your old PartProviders to new object types
  • Manually join the required attributes (e.g. within a CompositeProvider)
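To illustrate what the “manual join” workaround amounts to conceptually, here is a small TypeScript sketch (not BW code; the data and field names are invented): resolving a transitive attribute such as 0MATERIAL__0APO_PROD__0APO_ATTPRI1 is a two-step lookup from the material to its product, and from the product to the product’s own attribute.

```typescript
// Toy illustration (not BW code): a transitive attribute is conceptually a
// two-step lookup -- material -> product, then product -> priority.
// All data below is invented for demonstration purposes.
interface MaterialAttributes { material: string; apoProd: string }
interface ProductAttributes { apoProd: string; attPri1: string }

const materialMaster: MaterialAttributes[] = [
  { material: "M100", apoProd: "P7" },
  { material: "M200", apoProd: "P9" },
];

const productMaster: ProductAttributes[] = [
  { apoProd: "P7", attPri1: "HIGH" },
  { apoProd: "P9", attPri1: "LOW" },
];

// The "manual join" of the required attribute, as the workaround would do it:
function resolveTransitivePriority(material: string): string | undefined {
  const mat = materialMaster.find((m) => m.material === material);
  if (!mat) return undefined;
  return productMaster.find((p) => p.apoProd === mat.apoProd)?.attPri1;
}

console.log(resolveTransitivePriority("M100")); // "HIGH"
```

The out-of-the-box feature described below performs this resolution for you, so no extra modeling or data duplication is needed.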

 

Steps to activate transitive attributes

  1. Open an InfoObject and go to the “Attributes” tab. Right-click anywhere within the attributes table. A context menu appears; click on “Maintain transitive attributes”. Please note: if your InfoObject is a reference to another InfoObject, the context menu will not appear. You will have to edit the main InfoObject instead.
  2. A list of all attributes of your attributes appears. Select the required InfoObject – in this example 0MATERIAL__0APO_PROD__0APO_ATTPRI1 – then click the “Same” button and confirm with “OK”.
  3. 0APO_ATTPRI1 now appears in the attribute table. The value “true” in the column “Transitive Attribute” indicates that this is not a regular attribute but a transitive attribute.
  4. Enable the navigation attribute checkbox, but keep in mind that the basis attribute – here “0APO_PROD” – has to be enabled as a navigation attribute as well. See the following message:
  5. If you want to add a transitive attribute which has the same name as an already existing attribute – e.g. 0MATERIAL__0MATL_GROUP vs. 0MATERIAL__0APO_PROD__0MATL_GROUP – you will get an error message. In this case you need to use the buttons “Other” (which maps the attribute to another InfoObject) or “New” (which creates a new InfoObject as a reference to the original InfoObject).

 

After some great sessions in Las Vegas, we in the Predictive Analytics team will bring you the best and most exciting content on Machine Learning and Predictive Analytics at SAP to Barcelona.

This year our content will be presented as a Learning Journey entitled AIN5 – Unleash your Data’s Potential with SAP Predictive Analytics, which will enable you to Explore, Discover, Learn, and Expand your knowledge of Predictive Analytics in an easy-to-follow road map.

 

 

One of the most exciting topics at this year’s TechEd will be the introduction of SAP Analytics Cloud – Smart Predict. Explore the world of Augmented Analytics with the famous Robert McGrath as he takes you through the new Predictive Analytics features available in SAP Analytics Cloud in a series of lectures, and see for yourself in the associated hands-on sessions.

His hands-on session was so popular in Las Vegas that it will be run twice in Barcelona!

 

Erik MARCADE and Richard Mooney will take you through the strategy for Advanced Analytics at SAP in lectures and roadmap sessions, and I will be taking you through the advancements we have made in the last year embedding predictive analytics into SAP applications such as S/4HANA and C/4HANA via the Predictive Analytics Integrator.

 

Finally, please feel free to drop by the booth to chat with our experts Herve KAUFFMANN and Veronique Vendetti about all things Predictive.

 

We look forward to seeing you there!

 


Source: SAP

Purpose of the askSAP call is shown above

 

Source: SAP

Legal disclaimer applies

Source: SAP

List of speakers is shown above

Source: SAP

Themes include Augmented Analytics, Intelligent Enterprise, Hybrid Environments, and Collaborative Enterprise Planning

 

Data is massively growing

Source: SAP

Not just about visualization

Analytics is about being smarter, using machine learning to guide us

Capture and connect plans through enterprise without being a financial planning expert

“The One Simple Cloud strategy is about moving beyond human bias, with automated & assisted insights, from a connected environment”

Source: SAP

Inclusive strategy

Source: SAP

SAP Analytics Cloud is embedded in SAP applications such as SuccessFactors and Ariba

Source: SAP

Smart assist is making BI accessible to everyone

Source: SAP

Data Warehouse as a Service

Q: Do you have to be an SAP shop to use it?

A: No, SAP Analytics Cloud works with other data sources and application types and can run in hybrid landscapes

Q: Can SAP Analytics Cloud be embedded for integration?

A: APIs are available

Smart Predict

Intelligent enterprise, automate repetitive tasks, making business analysts more productive

(blog from yesterday):

Source: SAP

Help Business Analyst understand the results, best way to use prediction

Source: SAP

Drive adoption; show how to use it to solve business problems

Drive into transaction systems – PAi (Predictive Analytics Integrator)

Q: Is this in addition to what is available on premise for predictive analytics?

A: Yes, in addition to the on-premise predictive analytics offering (a tool for analysts and data scientists)

Cloud – simpler experience for business analyst

Source: SAP

Moving to a hybrid environment is the next evolutionary step

“Strategy is hybrid, bringing additional capabilities to innovate with agility in the cloud”

Source: SAP

Forrester research

Source: SAP

Licensing

Source: SAP

What Analytics Hub offers today

Source: SAP

BI4.3 is planned next year with BOE maintenance until 2026

Source: SAP

Q: What is SAP BI 4.3? A: Hybrid for improved integration, enterprise readiness – simplifying deployment, user experience

Application Design

Part of SAP Analytics Cloud

Source: SAP

Professional design, guided apps, reuse elements from SAP Analytics Cloud

Source: SAP

Planned: planning functionality and integration with Smart capabilities

Use dashboard to influence the model and rerun it

Watch the webcast replay for the in-depth demo, and this blog for a recap.

Q: What is the GA on app design?

A: Towards end of the year

Q: Is the scripting in app design based on JavaScript?

A: Scripting is based on JavaScript/TypeScript

BW/4HANA and SAP Analytics Cloud

Source: SAP

Goal of BW/4HANA is to drive simplification

Deliver faster to the business when changing an existing business scenario or creating a new data model

The data model is simplified; in the past there were 10 modeling objects

Reduced to only 4 modeling objects – InfoObject, ADSO (which keeps line items), CompositeProvider for query access, and Open ODS view

In the middle of the above are simplified data flows

High-performance analytics in BW/4HANA

The decision on layers is no longer based on performance implications

Reduced the huge number of source connections (there were 11 different source system types) – now down to 4: HANA, ODP, Big Data (Data Hub, Hadoop), and File

Source: SAP

New user experience, eclipse based environment

BW/4HANA Cockpit, the central entry point for the administrator; web based UI5 technology so access to any device

Source: SAP

New process chain editor; integration between SAP Data Hub and BW/4HANA (adding structured/unstructured data); process chains can trigger SAP Data Hub

Source: SAP

New is data tiering optimization; data movement from hot to cold/warm – it is done in the background

Source: SAP

Flexible conversion options

Next step is BW/4HANA 2.0 – planned for February 2019 with major enhancements

Source: SAP

How to integrate the “best of both worlds”?

No additional modeling or replication; data stays in BW, and SAP Analytics Cloud can integrate with it

It is a unique integration

Source: SAP

What SAP Analytics Cloud can use

Can run queries in parallel

Source: SAP

Time dependent hierarchy support

Source: SAP

BW Structure support

Source: SAP

Variant support

Source: SAP

Rename dimension, create your own groups

Q: Will SAP Analytics Cloud perform authorization routines?

A: When connecting to live data, it automatically checks data-level security

Q: When should a customer go for BW/4HANA or SQL warehouse?

A: It is not a decision made at the feature/function level; customers who have no ABAP knowledge might consider the SQL approach, which can integrate with BW/4HANA

Q: Can Lumira Designer apps migrate to App Design?

A: So far, that is not planned as part of the immediate roadmap; not saying it will never happen as both products are not at same level of maturity

Q: What is the future of Lumira Designer?

A: Lumira Designer is the leading tool for professional dashboards; no plan to replace; SAP still committed to tool

Q: Is there a mass import option of assets for SAP Analytics Hub?

A: Today Analytics Hub has APIs, and those APIs would allow you to script operations

Going forward, looking to clean up and tighten integration with BOE and SAP Analytics Cloud

Q: Which BI4x is required for live connectivity?

A: BI4.2 SP5 or above

 

For more information join us for these upcoming webcasts:

November 8 Mixed Modeling with SAP BW/4HANA & Native HANA

November 13 Smart Predict

November 14 SAP Analytics Cloud Integration with SAP BW: Best Practices

November 20 BI: SAP Analytics Hub and SAP Hybrid Analytics Strategy

December 4 INFL: Mobile BI Influence Council Re-launch

Do you need to author high-quality master data effectively to achieve excellence in your business processes?

SAP Master Data Governance is the way to go for enterprise-wide master data management. Ever since the first release, data quality has been “built in”:

  • re-use of ERP business logic (nowadays S/4HANA business logic)
  • custom validations, with the choice to implement them in BRFplus or in ABAP code
  • data enrichment using SAP products, your own code, or third-party offerings, for example for address validation
  • a duplicate check to save you from the costs of unnecessary double maintenance and the inconsistencies it can cause

You can create your perfect data quality firewall with SAP Master Data Governance for data that enters your landscape via well-known channels, namely change request processing or master data consolidation, but comprehensive and effective data quality management is more than that.

Explicit and accessible definition of your data quality

Data quality is driven by the requirements of your business processes. Having an explicit definition of your data quality rules that is easily accessible not only to IT people but also to process owners and all other stakeholders of master data makes it possible to collaborate and to agree on your data quality standards. You need to be able to describe data quality rules in natural language and augment them with further information, for example the reasoning behind the rule or the impact if data does not comply with it. Accessibility does not only mean that the rules are well described and organized, but also that there is visibility down to the actual implementation and usage of each rule in your systems and processes.

For managing the quality of your data, you need to be able to measure it along your quality dimensions. You need to be able to define KPIs, their baselines, and their targets. Only then will you be able to see the current state of your data quality initiatives, their past achievements, and where they are heading.
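As a purely generic illustration of these ideas (this is not an SAP Master Data Governance API; the record structure and the sample rule are assumptions), a data quality rule plus a measurable KPI could look like the following TypeScript sketch:

```typescript
// Generic sketch (not an MDG API): a data quality rule with a measurable KPI.
// The record structure and the sample rule are assumptions for illustration.
interface BusinessPartner { id: string; country?: string; taxNumber?: string }

// Rule in "natural language": every business partner in DE must have a tax number.
const ruleDescription = "DE business partners require a tax number";

function isCompliant(bp: BusinessPartner): boolean {
  return bp.country !== "DE" || Boolean(bp.taxNumber && bp.taxNumber.trim());
}

// KPI: share of compliant records, comparable against a baseline and a target.
function complianceKpi(records: BusinessPartner[]): number {
  if (records.length === 0) return 1;
  return records.filter(isCompliant).length / records.length;
}

const sample: BusinessPartner[] = [
  { id: "1000", country: "DE", taxNumber: "DE123456789" },
  { id: "1001", country: "DE" }, // violates the rule
  { id: "1002", country: "FR" },
];

console.log(ruleDescription, "KPI:", complianceKpi(sample)); // ~0.67
```

The resulting KPI value can then be tracked over time and compared against the baseline and target you have defined for that rule.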

Advancement in your definition of data quality

Anybody who has ever dealt with setting data quality standards will certainly agree: once your quality standards are defined, it is already time to adapt them. Innovations in business processes mandate changes to your standards. Luckily, an explicit and accessible repository of your data quality rules is a solid base from which to further evolve and optimize your standards.

Measure and analyze your state of data quality

However, advancement in your definition of data quality also means that data of shining quality in the past might not be flawless anymore. Work needs to be invested to identify and improve existing data so that the actual state of the data can follow the changed standards.

While the advancements of the quality standards might be easily incorporated into the main authoring processes, reality proves that you can never consider all channels that bring data into your landscape. For some it might be just a matter of time and effort to cover them, but there are also situations in which there might be good reasons to accept incoming data that needs to be improved later. Again, you need to be able to identify this data and to efficiently remediate its quality.

Furthermore, you need to know the effectiveness and the progress of the measures you have taken for quality remediation. Do parts of your organization need further support to achieve the set targets? Knowing the current state of data quality is also the basis for setting new goals and defining new initiatives for further advancements.

Improve and correct your processes and your data

Analyzing data quality issues gives you the required insights for improvements and corrections. Only if you know who enters bad data, where, and how will you be able to find yet-uncovered data entry channels, fix issues in your data entry processes, or educate people on how to meet the expected quality standards.

At the very end, somebody needs to do the job and fix the data errors. As this is typically a time-consuming task, efficiency and distribution of the workload is key.

What if …

What if you could achieve all this, including central governance and master data consolidation with one single product that is integrated in your SAP S/4HANA system?

If you agree that this is a desirable target, you are invited to have a look at the new master data quality management capabilities of SAP Master Data Governance on SAP S/4HANA 1809 and see how this can help your organization to achieve this target.

“Maxis Adventure in SAP Cloud Platform” is now known as “Maxis adventure to SAP TechEd” 😉 I guess I promised too much when it came to documenting my journey in the SAP Cloud Platform in a series of blog posts. A lot has happened lately... unfortunately, the blogging suffered from that. But here is a small follow-up report from SAP TechEd Las Vegas. I will try to keep it short and not exaggerate too much – but maybe one or the other has not yet heard of SAP TechEd and will thus discover something new 🙂

So many first times!

My first time in the United States, my first time at SAP TechEd, my first time as a speaker, the first time meeting most of my teammates personally. So many first times that I was of course excited, curious or even nervous – hard to describe. But before I could start the actual trip to the US (including some other stops in Palo Alto, Yosemite, Death Valley, Grand Canyon), I first had to prepare a lot for SAP TechEd in Las Vegas.

SAP TechEd 2018 Las Vegas – Keynote

Admittedly, I had never been involved with SAP TechEd in general when I was a consultant. I knew roughly what it was all about. But to take the time to rummage through the documents in the aftermath or even to be on site was simply almost unimaginable.

And now I’m able to be onsite in Las Vegas and Barcelona, supporting in the developer garage on the show floor and as a speaker in one of the many learning journeys at TechEd. That meant not only the honor of being allowed to take part in all this but also a lot of preparation.

Developer Garage – What’s that?

First of all, for those of you who don’t know the rough “structure” of SAP TechEd – and that’s probably just a few, just like me until recently: TechEd is basically divided into several tracks. Lectures, hands-on sessions, etc., take place in classical lecture rooms – lecterns, large screens, and everything that belongs to a classical presentation. Product managers, developers, evangelists, and many more try to give you a helping hand in countless sessions. (Session Catalog Barcelona: Link)

And then there is the show floor. On the show floor, you can find lots of booths from SAP and its partners, which demonstrate countless showcases, possibilities for cooperation, and much more – so much that you could easily spend three days on the show floor without actually having seen everything.

And right in the middle of it: the Developer Garage. The “get-your-hands-dirty” garage. Lots of computers, lots of experts and evangelists, and all with one goal: to teach you new things or to improve existing skills. And how? With tutorials. About the SAP Cloud Platform in general, about SAP ABAP in the cloud, about S/4HANA, native Android and iOS apps with SAP’s dedicated SDKs … and the same about the possibilities that open up with our partners like Google, AWS, or Microsoft.

Developer Garage – AppSpace

My team colleagues and I have also created a series of tutorials (bundled in so-called missions) to introduce you to the technology. My colleague Marius Obert from Munich and I jointly built the mission How to integrate Microsoft Office 365 into SAPUI5. Integrate Outlook into an SAP UI5 application? Super easy, have a look! There’s a completely new Tutorial Navigator you should explore anyway.
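For a flavor of what such an integration involves, here is a hedged TypeScript sketch that reads Outlook calendar entries via the public Microsoft Graph API – roughly the kind of call a UI5 controller in that mission would make. Obtaining the OAuth access token (for example via the destination and authorization setup the mission walks through) is assumed and not shown here.

```typescript
// Hedged sketch: reading Outlook calendar events through the Microsoft Graph API.
// Token acquisition is out of scope; only the HTTP call itself is shown.
interface GraphEvent { subject: string; start: { dateTime: string } }

async function readUpcomingEvents(accessToken: string): Promise<GraphEvent[]> {
  const response = await fetch(
    "https://graph.microsoft.com/v1.0/me/events?$select=subject,start&$top=5",
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  if (!response.ok) {
    throw new Error(`Graph call failed: ${response.status}`);
  }
  const body = (await response.json()) as { value: GraphEvent[] };
  return body.value;
}

// Usage (token acquisition not shown):
// readUpcomingEvents(token).then((events) =>
//   events.forEach((e) => console.log(e.start.dateTime, e.subject))
// );
```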

Participants come by, ask questions, sit down at the computers, and get started. The experts on the respective topics are always nearby and available with advice and action. That generates a lot of interest – it’s not every day that you have an expert to yourself who can answer questions of any kind. And as a bonus on top, there are also prizes for completed tutorials – starting with socks, Bluetooth speakers, and power banks, or even drones for the very industrious.

Over 2000 tutorials have been completed at SAP TechEd Las Vegas 2018!

Speaker in a Learning Journey?

Speaker in one of the several learning journeys – to be exact Explore SAP:

“Being a developer in the SAP ecosphere has never been more exciting. Find out what opportunities there are, and make informed decisions on what you should master next.”

Too abstract? Let’s dig a little deeper. My exact session title is “How to become an extension and integration expert in a cloud first-approach”. The session title is generic, and the possibilities for filling the hour given to me are just as comprehensive. Where do I start, where do I stop? Where do I have knowledge, and where do I not? Where do I lose my listeners in the details, and where do I bore them with things they already know? Who is my audience anyway? All in all, a task that was not quite easy for me.

Your Speaker today – nervous Maxi

By all accounts, I am satisfied. That rarely happens. One of the central reasons why I probably didn’t disappoint the audience: honesty. (Thanks again to DJ Adams and his internal sessions for presenters.) The audience will notice anyway if you have no clue. I can’t and won’t know everything and don’t have to beat about the bush if I can’t answer a question. Of course, I try to figure out the answer later on or point to other sessions that can provide a more thorough answer. What probably helped me the most was telling the audience: “Yes, I am nervous”.
And let’s be honest, the old phrase “you can’t please everyone” unfortunately also applies to such events 🙂 Nevertheless, I have tried to please as many people as possible.

How to become an extension and integration expert in a cloud first approach?

Hell, what a title. The title, I think we can agree, promises quite a lot. (Which, by the way, doesn’t make it any easier for my first speaking session…) And I have to admit, hardly anyone will leave the room and turn into an expert after an hour of a Maxi lecture. IT, and SAP in particular, are too extensive for that. It is well known that all beginnings are difficult. But I can point out where you can start and how, and for all further steps, I am sure, everybody is good enough to make themselves an expert. In general, I would describe the path to becoming an expert – which presumably looks similar for many other topics – as follows:

  1. Have a reason to start

    Intelligent Enterprise. SAP Cloud Platform. Extensions and Integrations. Is there anything I have to add? (I promised I want to keep it as short as possible, so here we are)

  2. Know your playground

    Well, there are a lot of super complicated cloud/on-premise mixed scenarios out there. But to get started and write your first extensions you don’t need money or highly sophisticated systems. Keep it simple. A possible playground might be an SAP system (ES5, for free). Use the Cloud Connector to write an extension UI (cloud-first approach) in your own SAP Cloud Platform runtime environment (free trial account).

  3. Know some tools

    Use the SAP API Management service to control/manage/monitor an OData service from the ES5 system, or the API Business Hub to find the appropriate API for your “source system”. Build your first SAPUI5 prototype with SAP BUILD based on the OData service mentioned above. Export your SAP BUILD prototype, import it into the SAP Cloud Platform Web IDE Full-Stack to further customize the coding there. Then register your app in your Fiori launchpad using the SAP Cloud Platform Portal service. And all of this for free in your SAP Cloud Platform trial account. (A minimal sketch of reading such an OData service follows this list.)

  4. Know where to find more information

    Scenarios: Link
    Services: Link (especially Integrations / User Experience)
    Tutorials: developers.sap.com – Developer Center
    Documentation: Link
    Hands-On Experience: blogs.sap.com
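As a small companion to step 3 above, here is a rough TypeScript sketch that reads a few products from an OData service of the ES5 demo system with plain HTTP, before you wrap it in a SAPUI5 prototype. The host and service path below follow the public GWSAMPLE_BASIC demo service and may differ in your setup; valid ES5 credentials are assumed.

```typescript
// Rough sketch: reading products from the ES5 demo OData service (Node 18+).
// Host, service path, and entity set are assumptions based on the public
// GWSAMPLE_BASIC demo service; adjust them to your own system.
async function readProducts(user: string, password: string): Promise<void> {
  const serviceUrl =
    "https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC";
  const auth = Buffer.from(`${user}:${password}`).toString("base64");

  const response = await fetch(`${serviceUrl}/ProductSet?$top=3&$format=json`, {
    headers: { Authorization: `Basic ${auth}`, Accept: "application/json" },
  });
  if (!response.ok) {
    throw new Error(`OData request failed: ${response.status}`);
  }

  // OData V2 wraps results in a "d.results" envelope.
  const body = (await response.json()) as {
    d: { results: Array<{ ProductID: string; Name: string }> };
  };
  body.d.results.forEach((p) => console.log(p.ProductID, p.Name));
}

// readProducts("YOUR_ES5_USER", "YOUR_ES5_PASSWORD");
```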

Or just come to one of my sessions, whether at TechEd or at one of the other events. 😉 There you’ll find everything in more detail.

People are awesome!

Nothing to add. I didn’t expect that my first TechEd would be that inspiring! Whether in discussions during my lecture, anywhere else at SAP TechEd, in the Developer Garage, or at one of the various evening events – the participants are so inquisitive and at the same time grateful for any input. This is also a new experience when you have been a consultant for many years.

And on the other hand: The SAP Developer & Community Relations Team. People from all over the world. So many different cultures, so many different thoughts. It was a blast! See you in Barcelona! 🙂


SAP Developer & Community Relations Team

Stay updated on Twitter: @maxstreifeneder

 

 

 

As you have all heard by now, SAP S/4HANA 1809 has introduced the “Manage Global Accounting Hierarchies” application (Fiori app ID F2918), which has the key benefits below:

  • Unified way to maintain hierarchies for different master data objects – Bank Account, Cost Center, Profit Center, Functional Area, Financial Statement Version, Product, and Consolidation-related objects
  • Single dashboard to control hierarchies
  • Time-dependent hierarchies, so that you can create hierarchies for the future in advance
  • Status control, so that you activate a hierarchy only when it is complete; otherwise you keep it in draft mode
  • Support for custom extensibility – such as custom hierarchies, custom fields, and custom logic (refer to my blog “Report on the Fly Using Flexible Hierarchies in SAP S/4HANA 1709” on this at SAPinsider)

 

For example, see the Consolidation Financial Statement Items hierarchy below in this unified app:

 

So, what’s the motive behind this blog? It is to highlight one impact on the classical hierarchies existing in your system. For example, you might have created a lot of Financial Statement Versions using the classical method, e.g.:

 

However, when you want to use these in Fiori applications like “Display Financial Statement”, you will not be able to see these FSVs there. For example, I am checking FSV 1030, which exists in the GUI but is not available in the Fiori app:

 

To leverage existing FSVs, there are two options available:

 

(A) Keeping same FSV number as earlier:

Just go to the old method of maintaining the hierarchy in change mode and save it:

 

On the pop-up, you can click on activate to do instant activation:

 

By default, it will activate from system date as key date start:

 

And your hierarchy will be replicated and will appear in Fiori applications:

 

For example, the Fiori application “Display Financial Statement” now contains the FSV activated above:

 

However, the drawback of this route is that the replicated hierarchy will not be available in the “Manage Global Accounting Hierarchies” application, and thus you will not benefit from the new, easier way of managing hierarchies.

 

(B) Migrating old FSV to new FSV in Global Hierarchies:

You can import existing FSVs into the new architecture by clicking “Import Hierarchy” in the Manage Global Accounting Hierarchies app (Fiori app ID F2918):

 

And then select the hierarchy which you want to import:

 

And then specify the target hierarchy’s attributes:

 

And your copied hierarchy will start appearing in “Manage Global Accounting Hierarchies” application:

 

At this point it is still in draft status, so you need to check the definition. If everything is OK, click the Activate button:

 

And your hierarchy will get activated:

 

The hierarchy will also be available in the various Fiori apps where it is used for reporting and so on, e.g. in the “Display Financial Statements” app:

 

This new app for managing various accounting hierarchies in one place is a very interesting and harmonized way to manage and control them, particularly with the time-dependency feature solving this age-old problem.

Born in the internet revolution, we know exactly how to use tech to fuel our day-to-day operations. Still, most of us haven’t fully used technology to top our game. With competition becoming fiercer by the day, survival has become a huge concern, and software systems, being inherently vulnerable, turn out to be a major source of cyber threats in the modern era of digitization.

There are basically two types of businesses: those that have been victimized by a cyber-attack, and those that have yet to discover they have already been infected. Now, do you know what hackers love the most? Startups! Well, that doesn’t mean well-established websites are never hacked. Basically, there is no escape, as cyber criminals and hackers are always on the hunt – eager to steal your valuable data in no time.

With technology evolving at a frantic pace, the number of opportunities for cyber-criminals to exploit continues to grow. Today, cybersecurity has become incredibly important for protecting personal, business, and customer data from unwanted threats. So if you think you are safe in this world, think twice, because you are not! Within a few seconds, you can become a victim of cyber-crime.

Why do you need cybersecurity?

#1 Mobile, mobile, mobile

Smartphones have taken the world by storm and are likely here to stay for the long run. According to research, people spend over 5 hours each day looking at their phone screens. This means there is a bigger risk in the cybersecurity space. It may also interest you to know that a lost cell phone is actually easier to hack and already has all of your information stored on it. So make sure to protect your device with a password.

#2 Internet of Things (IoT)

I am pretty sure that you must have heard of the Internet of Things by now. Think of an iPad and iPhone connected together via Wi-Fi or Bluetooth. IoT basically comprises a growing number of gadgets and devices being synced to the internet. This definitely makes things convenient for the vast majority of people, including cyber criminals trying to access your information. Having a robust cybersecurity plan and a skilled team to implement it is very important for any business that wants to keep its data secure.

#3 Private Data no more private

I am sure you will find security breach news rise and fall through your news feed every other week. Considering how much more often cyber-attacks are occurring, this should be quite the alarming trend to you. Not everyone practices safe online care of their data, and this is the point where cyber criminals break in and start using your crucial information for their benefit.

#4 Cloud services

Due to its affordability and unmatched utility, small businesses and startups are jumping on to cloud services like never before. It is assumed that cloud threats would increase significantly, thereby increasing the risk for start-ups and small businesses.

#5 Money definitely matters

Businesses often handle or have access to larger amounts of money. Thus, it becomes very easy for attackers to transfer escrow funds. Only those businesses that identify potential threats at the right time and successfully take measures to fight them off will survive and experience growth.

Cybersecurity Tools & Services to Take Into Account

Cybersecurity is something that every business needs to take seriously, or else the number of hacking attacks will keep on increasing. However, there is no cookie-cutter approach, as you will come across a wide range of dangers that may need to be addressed differently. While the public typically hears about cyber-attacks on big names, small businesses, due to their lack of resources, often have the least-protected websites, accounts, and network systems, making cyber-attacks a relatively easy job.

Further below I would like to shed some light on a few of the amazing tools and services that every business needs to take into consideration.

Managed Detection Services

As the techniques and software used by cyber-criminals and hackers become more advanced, it has become necessary for businesses to invest in more powerful forms of defense. It is no longer enough simply to have defenses that react to threats – instead, they need to be proactive and identify attacks before they can cause problems. Accordingly, cybersecurity has seen a great shift from investing in technologies that attempt to prevent the possibility of an attack towards advanced services that detect potential security issues and respond to them as fast as possible. Identifying and eliminating an attack before it spreads is far less damaging than trying to handle an attack that already has a strong foothold on your IT network.

Staff training

Considering staff training as a tool can be a smart move. Having knowledgeable employees who understand their role in cybersecurity is one of the strongest forms of defense against attacks. Of course, you will come across several training tools that you can invest in to educate staff about best cybersecurity practices. Like I said before, cyber criminals continue to expand at an exponential rate, so businesses must invest in these tools and services, as failing to do so can leave your company an easy target for hackers. Many of us fear the expense that might put us off, but always remember that the initial outlay will reward your business with long-term security and protection.

Firewall

Whether you are a techie or a non-techie, I am sure you have heard about the firewall, a strong defense worth considering. Able to block unauthorized access to your system, the firewall is arguably one of the most trusted security tools across the globe.

A firewall monitors network traffic, including connection attempts, and decides whether or not these should be allowed to pass freely onto your network or computer. Being so useful doesn’t mean that firewalls don’t have limitations. Over the years, hackers have learned how to create data and programs that can trick firewalls into believing that they are trusted – this means that the program can pass through the firewall without any problems. Despite these limitations, firewalls are still very effective in detecting the large majority of less sophisticated malicious attacks on your business.
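As a toy illustration only (this is not how a real firewall is implemented), the core idea of matching a connection attempt against an ordered rule list and allowing or blocking it can be sketched in a few lines of TypeScript; all addresses and ports below are invented examples.

```typescript
// Toy illustration only -- not a real firewall: match a connection attempt
// against an ordered rule list and allow or block it.
interface ConnectionAttempt { sourceIp: string; port: number }
interface Rule { match: (c: ConnectionAttempt) => boolean; action: "allow" | "block" }

const rules: Rule[] = [
  // Allow web traffic from anywhere.
  { match: (c) => c.port === 443 || c.port === 80, action: "allow" },
  // Allow database access only from the internal network (example range).
  { match: (c) => c.port === 5432 && c.sourceIp.startsWith("10."), action: "allow" },
  // Default: block everything else.
  { match: () => true, action: "block" },
];

function evaluate(attempt: ConnectionAttempt): "allow" | "block" {
  // The default rule always matches, so find() never returns undefined.
  return rules.find((r) => r.match(attempt))!.action;
}

console.log(evaluate({ sourceIp: "203.0.113.7", port: 443 }));  // allow
console.log(evaluate({ sourceIp: "203.0.113.7", port: 5432 })); // block
```

The limitation described above is visible even in this toy version: anything that looks like allowed traffic passes, which is exactly what attackers try to imitate.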

Penetration Testing

Do you know one of the most important ways to test your business’s security systems? While performing a penetration test, professionals make use of the same techniques utilized by criminal hackers to check for potential vulnerabilities and areas of weakness. A pen test attempts to simulate the kind of attack a business might face from criminal hackers, including everything from password cracking and code injection to phishing. Once the test has taken place, the testers will present you with their findings and can even help by recommending potential changes to your system.

Antivirus Software

If you are already running a business and do not have a great deal of experience with cybersecurity, chances are that terms like ‘firewall’ and ‘antivirus’ sound synonymous, but they are not! Make sure you have a strong firewall and up-to-date antivirus software in place to keep your systems secure.

Antivirus software will alert you to virus and malware infections, and many packages will also provide additional services such as scanning emails to make sure they are free from malicious attachments or web links. With useful protective measures, such as quarantining potential threats and removing them, modern antivirus programs can perform really well. You will come across a huge range of antivirus software, which can be easily found; choose a package that is suited to the needs of your business.

PKI Services

PKI is commonly associated with SSL or TLS, which encrypts server communications and is responsible for the HTTPS and padlock you see in your browser’s address bar. In fact, it may interest you to know that PKI has the potential to solve a number of common cybersecurity pain points and deserves a place in every organization’s security suite. PKI can be utilized for different purposes, such as the following (a minimal signing sketch follows the list):

  • Enabling Multi-Factor Authentication and access control
  • Creating compliant, trusted digital signatures
  • Encrypting email communications and authenticating the sender’s identity
  • Digitally signing and protecting code
  • Building identity and trust into IoT ecosystems
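To make the “digital signatures” bullet concrete, here is a minimal TypeScript sketch using Node’s built-in crypto module. A production PKI setup (CA-issued certificates, revocation, timestamping) involves far more than this; the key pair and document content below are invented for illustration.

```typescript
// Minimal sketch of digital signing and verification with Node's crypto module.
// In a real PKI the key pair would come with a certificate issued by a trusted CA.
import { generateKeyPairSync, sign, verify } from "node:crypto";

const { privateKey, publicKey } = generateKeyPairSync("rsa", { modulusLength: 2048 });

const document = Buffer.from("Purchase order #4711, total 12,500 EUR");

// Sign with the private key ...
const signature = sign("sha256", document, privateKey);

// ... and let anyone holding the public key verify integrity and origin.
console.log("valid:", verify("sha256", document, publicKey, signature)); // true
console.log(
  "tampered:",
  verify("sha256", Buffer.from("Purchase order #4712"), publicKey, signature)
); // false
```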

HTTPS is everywhere

Hypertext Transfer Protocol Secure (HTTPS) has become the standard way for websites to protect data in transit, especially when users log into accounts, make online purchases, and complete other transactions. HTTPS-protected pages can be found everywhere, and the HTTPS Everywhere browser extension for Firefox, Chrome, Opera, and Android web browsers helps fill in the gaps: while you are surfing, it switches communication to encrypted connections wherever a site supports them. This helps ensure the security of your web browsing at all times.

Parting Thoughts

It is pretty easy to count the ways tech has gone wrong, especially in terms of security. But on the other hand, I am sure you will find a plethora of software and systems that work wonders and offer value for your money. So do remember the aforementioned risks and be wary of the cybersecurity threats that exist today and in the future.

Introduction

It feels like yesterday when globalization was the dominating topic in the media. The big fish kept eating the small fish. Then digitalization kicked in heavily and has been changing the world ever since. As a result today we rather see the fast fish eating the slow fish, independent of their size.

In this blog series we want to take a look at the broad topic of digital transformation in the context of business process management (BPM) and show you how to approach it in a meaningful way. You will learn how you can bring your business up to speed by utilizing BPM(N) and which details matter in order to swim faster than the others.

This first blog covers some of the basic issues companies are facing when trying to get started with digitalization. It also outlines an approach for overcoming these issues. The second blog will make use of the proposed approach in a concrete showcase example. The third and final blog offers a look behind the scenes of the showcase, focusing on the technical architecture utilizing SAP Cloud Platform services, especially the workflow service component.

So let’s dive right in!

 

Digitalization is changing the market

Digitalization has not only affected our private lives in recent years but is more and more becoming a critical factor for companies around the world. Start-ups are ready to disrupt established markets with fresh ideas and new processes. In order to cope with these changes, companies need to find a way to adapt and compete.

It goes without saying that one major challenge in the first place is to establish the right mindset to enable any form of digital transformation within a company. Change needs to be welcome and not feared. Information silos need to be removed and service orientation and curiosity should be core competencies of all employees.

Eying the processes, in our experience many companies invest too much time and energy into implementing their standard processes while neglecting their differentiating business processes. Standard processes can usually be covered by standard software pretty well. On the other hand, processes which make you unique and stand out from your competition cannot be part of a standard software by definition and therefore require special attention.

In reality, often there is no transparency about the existing processes at all. Many key processes have been coded years ago and are only partially known in the heads of a few. Even worse: sometimes there is a huge gap between what the stakeholders believe a process is doing and what it actually does.

As a result, over the years many of those processes have grown into inflexible black box IT implementations with high dependency on a few people who carry all the knowledge. This results in lengthy change procedures which prevent the affected companies from keeping pace with the markets. Also this can lead to dramatic situations when key staff is leaving.

 

Finding the right approach

As a first step in order to prevent this from happening, companies need to identify and understand their unique selling point (USP) processes. There needs to be transparency and a common understanding about those processes in order to be able to constantly challenge and improve them.

Business and IT need to interact collaboratively with each other in order to transfer this core knowledge from the heads of a few into common knowledge. We strongly recommend making use of a visual representation of your processes instead of writing lengthy blueprints. From our experience, BPMN 2.0 has proven to be the perfect match to act as a common language between all participants here. The basics are not only learned easily, but the standard also offers the perfect basis for automating your process models in so-called process engines later. Therefore, using BPMN 2.0 as a foundation to describe and document your processes transparently is the first step in building a flexible process-driven IT architecture.

Creating your process models in a sustainable and meaningful way can be tough though. Details are important. The key to success is to keep technical implementation details separated from the core business process. It is also helpful to make use of re-usable micro-processes which are exchangeable in the future.

Figure: Micro-segmentation of processes enables flexibility for the future

 

It is vital to choose the right methodology when designing and cutting your process models. We recommend following the process-driven approach as defined by Dr. Volker Stiehl. Following this approach leads to process models which remain flexible and maintainable at the same time. Explaining this approach in detail would go beyond the scope of this blog; therefore, we will leave you with two pointers:

Figure: “Process-Driven Applications with BPMN” by Dr. Volker Stiehl

 

By applying the mentioned methodology, you will gain transparency and clarity about your USP processes, which will be documented in BPMN 2.0 while following the PDA-approach. This is already a huge step forward, since it enables you to discuss and adapt your business processes constantly. From here you can now think about (further) digitalization of your USP processes and use different methods to analyze them.

For example if your USP processes are in the area of sales and distribution you might use methods like customer journey analysis where you map out all the customer touch points for a specific persona along your process chains. For each touch point you might now go into deeper analysis of the process behind it. Think about which pain points exist for the specific persona you are looking at, how to resolve them and how well the touch points integrate with each other.

Once you identify a process along the chain which can be improved, you are now in a position to do so. Imagine doing this exercise without transparency about your own processes… it would be impossible.

An alternative approach to tackle digitalization is to try and disrupt your own company by coming up with a completely new purpose. Try to think ahead five or ten years from now and imagine what your industry will look like. Which role do you want your company to play and what kind of processes and services do you need to create in order to get there? First clarify the “What” and “Why” and then think about the “How” – e.g. by using methods like design thinking. Utilize BPMN to sketch out different ideas and analyze them within your digital lab or with external incubators. Based on the created models you might also build lightweight prototypes which can be adjusted and tried out quickly thanks to the power of process-engines – which brings us to another major advantage of BPMN 2.0: automation.

 

The supreme discipline: Automation

So now you have a common understanding of your processes and are able to analyze and adjust them. Well done! You are already ahead of most others.

What’s next? Well, there still is a long way to go. How will you transfer your process model into execution? Is it all manual? Maybe, but most likely you will aim for automation of your processes… in the end this is a major part of the digital transformation, isn’t it?

But how to handle the actual implementation? There are several possibilities. Do you just hand your process models over to your IT-staff to code along? Will you use loosely coupled micro services using an event-driven architecture? Or just outsource the implementation completely and get back a black box implementation again?

Of course there are advantages and disadvantages for all mentioned options. But we need to keep in mind our main goal: Gaining speed by providing transparency and flexibility. This does not only concern our process documentation, but of course also our process implementation.

You might remember that process engines have been mentioned before. Basically, they allow you to transfer your process models into execution and therefore represent a model-based approach. The big advantage here is that business and IT don’t only talk the same language during process design, but also during process execution. This creates true transparency and acts as an enabler for deciding on and implementing process changes quickly.

Technically speaking, the procedure is to take your PDA-ready BPMN 2.0 models, import them into your process development environment, and enrich them with some configuration and technical artifacts like data types, mappings, interface calls, and so on. Afterwards you have a runnable process which can be deployed to a process engine. Once a process is running, you always have full transparency about where your process currently stands, which path it has taken, and what data it is using. You could even go a step further and run process mining tools, which for example allow you to create heat maps across lots of process instances in order to show you which process paths are frequently used and which ones might never be taken.

In the SAP world, SAP Cloud Platform (SCP) Workflow and Business Rules deliver the tools to achieve exactly that. Using SCP Workflow to model and deploy BPMN 2.0-based business processes lays the foundation for adjustable and quickly automatable business processes which are understood by business and IT. In combination with SCP Business Rules, customers can enhance the flexibility of their processes even further, since this service enables them to maintain their process rules directly within the specialized business departments. In order to create more complex integrations in cross-system processes, you might also make use of Cloud Platform Integration, which offers the full variety of integration options that you would expect from a modern integration platform.
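To give an idea of what “transferring a model into execution” looks like from the outside, here is a hedged TypeScript sketch that starts a deployed workflow definition through the SAP Cloud Platform Workflow service REST API. The /v1/workflow-instances path follows the service’s documented REST API, while the API host, token handling, and the definition ID used here are placeholders.

```typescript
// Hedged sketch: starting a deployed workflow definition via the SCP Workflow
// service REST API. Host, token, and definition ID are placeholders.
async function startWorkflowInstance(apiHost: string, token: string): Promise<void> {
  const response = await fetch(`${apiHost}/workflow-service/rest/v1/workflow-instances`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      definitionId: "onboarding",        // placeholder workflow definition
      context: { employee: "Jane Doe" }, // business data passed into the process
    }),
  });
  if (!response.ok) {
    throw new Error(`Workflow start failed: ${response.status}`);
  }
  const instance = await response.json();
  console.log("Started workflow instance:", instance.id, instance.status);
}
```

Because the running instance is just the deployed BPMN model, its current step, chosen path, and context data remain visible to both business and IT, which is the transparency argument made above.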

Having these services available in the SAP Cloud Platform enormously lowers the initial hurdles for customers to get started with creating process-driven IT architectures. In the on-premise world, customers were required to make a pretty heavy up-front investment and have the infrastructure ready even for simple proofs of concept. Individual processes can now be automated much quicker and with less initial effort due to the easy and flexible service activation in the cloud.

 

Be ahead of the market

We expect cloud adoption to rise significantly within the next few years, since the need for short innovation cycles and agile projects will rise. If you combine this with the fact that more and more companies will need to invest in topics around digital transformation and process automation, you can easily see that cloud services like CP Workflow, CP Business Rules, or CP Integration will play a key role on the IT roadmap of many companies.

In order to stay ahead of your competition, now is the right time to look into your processes and your technology. Is your company prepared to start the journey into digitalization?

 

Get in touch

itelligence already supports several customer projects which make use of the new cloud-based workflow service component and we have used the PDA methodology with great success in many BPMN implementation projects.

Interested in the capabilities of our methodology and the new SAP technologies? Are you looking for an experienced partner who can help you find a meaningful path through your digital transformation journey? Don’t hesitate to get in touch with us.

 

Blog Series

In the next blog of this series we will see how the approach described in this blog can be put to use, looking at a practical example.