
Why Project Management for User Assistance?


To quote the wiki definition of project management:

“Project management is the discipline of initiating, planning, executing, controlling, and closing the work of a team to achieve specific goals and meet specific success criteria.”

No great product can be created and launched without proper estimation and planning. The same is true of documentation, or user assistance, which goes a long way toward enhancing the user’s experience with the product. A great deal of planning is required to run, execute, and complete a product user assistance project.

The article focuses on the considerations to be kept in mind while creating user assistance or documentation for any product.

The Nitty-Gritty of User Assistance Projects

User assistance (UA) today is not confined to writing alone; it calls for solid project management skills to run as an independent and significant department within the product life cycle.

Here are some of the key questions that are worth considering in any documentation project:

  • Who will create what? (Authoring and Deliverables)
  • Through which channels will we publish? (Delivery Channels)
  • For whom will we write? (Audience)
  • How many of us will write? (Team and Task)
  • How much will we write? (Content Volume)
  • Into how many languages will we translate the content? (Localization)
  • What will the translation cost? (Cost)
  • How much time is required to develop and publish the content? (Timeline)
  • What tools are we going to use? (Tools)
  • To what extent can we use single sourcing and reuse, in addition to other content strategies such as profiling, versioning, and linking? (Content Strategies)
  • What is the scope for innovation in our content management? (Innovation)
  • How are we going to collect user feedback later? (User Feedback)

Imagine starting to write at once without considering any of these. Writing mechanically, without such planning, is unproductive.

Authors and Deliverables

Different companies have different ways of managing documentation projects. Some track these efficiently with an Excel sheet, while others have dedicated project management tools. It is also possible to carry out project management within DITA-based content management systems.

Every project – technical or non-technical – requires human resources, hired based on the skills the documentation project demands. A lot of time goes into finalizing such candidates to form a team, so the budget and time required to form a team must be planned. Sometimes a project requires a single knowledgeable writer to set up the documentation from the ground up, while other projects require more writers to support it.


Writing requirements also differ from project to project. A project supporting an on-premise version of the product might require a different set of deliverables and delivery cycles; an on-premise release might require writers to create and publish content quarterly or so. For a cloud version of the product, the deliverables can differ. For example, an installation guide may not be required for a cloud-based product that customers access remotely. But the challenge of keeping customers engaged through appealing documentation is always there, because customers can subscribe to a product for a short period and leave if the user experience is bad.

So, plan these aspects in advance prior to writing.

Delivery Channels

Where do we publish the content that is created?

  • Do we publish documents on the company help portal?
  • Do we provide help in the application?
  • Is there an appropriate tool that supports integrating such help content from the content management system into the software product? Consider the options and their feasibility.
  • Do we also use social media to talk about our product and its cool features?
  • Do we provide infographics?
  • Do we offer videos on features and tasks?
  • Do we have enough infrastructure and expertise to create such diverse deliverables in addition to documentation?

Address these concerns prior to setting up the docu project.


Audience

Who will be reading what we write – is it a mixed group of novices and experts, or only business users? Is there also a technical audience, and how do we balance content for these different sets of readers? What is their background – should the language be simple or moderate? Consider cultural aspects when using certain terms in the UI or in documentation: a single usage can have different connotations in different cultural contexts. Do readers need detailed information, or do they just need assistance with complex tasks?

Team and Task

It helps to think about how many writers will collaborate on the project and on which aspects of the product. Who has the skills to create which deliverables? If there are several writers on a project, some might have a better understanding of what to provide for a business user, while others might be better at providing accurate information for a technical user. Hence, give this some thought and plan to match skills to requirements in advance.

Content Volume, Localization, Cost, and Timeline

Assess the volume of content for each deliverable; this is especially helpful if the project is localized, since translation cost and time scale with volume. Care can then be taken to provide concise information in localized deliverables without compromising completeness and clarity.

How much time is required to create, review, and publish all deliverables? Could there be a mismatch in timeline estimates? Such an estimate helps you track the current situation and communicate with stakeholders credibly. It also ensures accountability at the individual level, because writers align on and provide estimates for themselves (based on product development timelines) rather than having estimates made for them. What issues are likely to come up after the content is produced? How much time should be kept as a buffer to address such unexpected issues during publication?

At the same time, the current estimate might help in assessing future efforts required for a similar project.

Content Management and Other Tools

What tools are we going to use for creating content and other media for the project?

Are we going to stick to age-old tools, or invest in advanced tools for a better user experience? Do we opt for DITA-based tools? Is it worth investing in an advanced writing tool?

What is the feedback about the tool in the technical writing circles?

What tools should we use for illustrations, infographics, and videos? Do we need a professional graphics specialist? Are there open-source tools we can use? Do we need to buy an advanced tool for creating graphics and illustrations?

Is there appropriate audio support for the created videos? Should we have a dedicated blog page for our product wherein writers can contribute?

All these considerations should be openly discussed prior to content development. This helps you plan and deliver the project better.

Content Strategies

How are we going to handle single sourcing and reuse in our documentation project? In other words, do we use the same source to create multiple outputs such as PDF and HTML? Do we reuse the same source in subsequent versions? If yes, how do we accommodate the new content alongside the old?

What are our strategies to clean up the content periodically based on the changes in style aspects?

Trends and Innovation

What are the current trends in user assistance? How many of them are suitable to try in our project? What are the trends in customer expectations in similar domains? What are the expectations in terms of user experience? For example, can we provide fuzzy search on our help page? What measures would help our help pages show up in search engines?

What are other writers at SAP doing that is innovative? How can we reuse and accelerate current developments to incorporate their innovations into our solution or offering? What common approaches can we reuse? How can we better integrate for a seamless customer experience?

User Feedback

Last but not least: how do we improve the product experience in the future? This is possible only when we are open to feedback from customers. How do we collect such feedback? Should users be allowed to share feedback on each page of our deliverables in the product documentation portal? How do we respond to them? Is there an option to subscribe to customer feedback from the portal? And among such comments, how do we separate product-related feedback from documentation-related feedback, and address each efficiently?


These are only a few generic aspects; there could be many more factors specific to your product and project. The generic considerations listed above help determine the success of any product through a well-planned user assistance project. Planning and managing these aspects professionally benefits the product, the people using it, and the people creating it. Uncertainties, frustrations, losses, and imperfections can be ruled out or minimized, while efficiency, accuracy, and usability are ensured.

This is a recap of a webcast given last week by Visual BI; I tried not to repeat content from it.


“Why move away from Xcelsius/Dashboards?” and understand SAP’s direction with SAP Lumira

Figure 1: Source: Visual BI

Figure 1 shows what is happening with SAP Lumira 2.0.

Figure 2: Source: Visual BI

Why is SAP Dashboards being phased out? Figure 2 covers it; it is mostly due to the Flash dependency.

Figure 3: Source: Visual BI

Figure 3 shows the history of Xcelsius, beginning with Infommersion, through the decline of Flash and the Design Studio release in 2012, to SAP Lumira 2.0.

In 2012 SAP updated Xcelsius with limited HTML5 components.

Figure 4: Source: Visual BI

SAP Lumira 2.x is a “tool for IT & business”

It is possible for business users to use templates for generic visualizations; a controlled environment

Lumira Designer doesn’t have Flash dependency

It has “Out of box for SAP data sources”

Figure 5: Source: Visual BI


GA is expected August 21 (always subject to change)

Figure 6: Source: Visual BI/SAP


Some key features for Lumira Designer include interoperability and componentization

Figure 7: Source: Visual BI/SAP

Figure 7 shows SAP Lumira Designer supports direct HANA connectivity, background processing, high volumes, scripting

Figure 8: Source: Visual BI/SAP


Pain point for Dashboards was mobile compatibility

Figure 9: Source: Visual BI/SAP


Can be consumed on Mobile BI app or on browser

Figure 10: Source: Visual BI/SAP


Bypass row limits using Lumira Discovery

Figure 11: Source: Visual BI/SAP

Lumira Designer is built for self service for users to compose their own dashboards based on visualizations they created

Figure 12: Source: Visual BI/SAP


Figure 13: Source: Visual BI/SAP


Range slider is missing from Designer; available in Xcelsius

Figure 14: Source: Visual BI/SAP

The VBX extensions include the following and what is shown in Figure 14.

  • Range sliders
  • Stacked area charts
  • Regional heat maps
  • Can use Excel as a datasource


VBX extensions reduce the need for scripting and CSS customization, and improve performance

Figure 15: Source: Visual BI/SAP


Most popular with customers are conditional formatting and maps

Figure 16: Source: Visual BI/SAP

Custom case study in Figure 16 included using range sliders and progress bars

Figure 17: Source: Visual BI/SAP

This customer used the extensions for gauges

Figure 18: Source: Visual BI/SAP


Explorer is used for controlled data discovery, based on corporate data

Information spaces set up by IT

Business users can use these spaces to explore

Figure 19: Source: Visual BI/SAP

This is the first time I’ve seen something like this compared; it is interesting.

Figure 20: Source: Visual BI/SAP


Visual BI has an extension with explorer functionality including prebuilt applications, ability to switch data sources and online composition


Question and Answer

Q: migration, complete lift and shift or redesign?

A: at the moment no lift and shift, it is manual

Q: when imported from discovery to Designer – could not refresh data

A: on roadmap from SAP; looking at ways to refresh

Best use case is Excel data source

Q: Best way to approach existing set of dashboards – start from scratch?

A: Start from scratch in Designer/Design Studio

Q: What is row limit for HANA data source?

A: Safety belt; no limitation; remember this is a dashboarding tool so summarized data should be used

Q: When date available

A: Aug 21

Q: Web services based on Web Intelligence

A: Today Lumira doesn’t have connectivity to web services; VBX will offer web services as a data source

Q: Data limit for acquisition – 20K row limit?  Any best practice for mobile use

A: If have license for Lumira Discovery you can circumvent data limitation

Best practices – number of different practices – how design dashboard or data source

Q: How connect to 2 BW instances?

A: Yes, you will be able to connect to two systems from within the same dashboard in Lumira Designer; when creating on the BI platform, the admin sets up the connections to the systems



Upcoming related webcasts are listed below:

7/11/2017 Agile Self-Service Reporting with Tetra Pak
7/25/2017 Mobile BI Strategy and Options for SAP Customers

How to connect Lumira 1.X and 2.0 to Vora Kerberized on Active Directory

This step-by-step guide has been derived from:


It assumes that:

  • The Vora client principal and keytab have been generated
  • The Vora Thriftserver is running in KERBEROS mode
  • The KDC server host is reachable from the Lumira workstation


  1. Copy the Vora client principal keytab to the Lumira workstation.


  2. On the workstation where Lumira is running, create C:\Windows\LumiraJAAS.conf with the following content:


Client {
  com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="file:<Path/to/vora.keytab>"
    principal="<voraPrincipal>@<REALM>"
    doNotPrompt=true;
};

Note: the path must start with file: and use forward slashes. For example: file:c:/windows/keytabs/vora.keytab



  3. Create the file C:\Windows\krb5.ini with the following content:

[libdefaults]
default_realm = <REALM>
dns_lookup_kdc = true
dns_lookup_realm = true
default_tkt_enctypes = rc4-hmac
default_tgs_enctypes = rc4-hmac
default_domain = <REALM>


  4. Add the following lines to Lumira 1.X <Lumira_root_dir>/SAPLumira.ini or, in the case of Lumira 2.0, <Lumira_root_dir>/SBOPLumiraDiscovery.ini:




  5. Select File -> Preference -> SQL Driver -> select “Generic JDBC driver” -> Install.
  6. Select all files in the <Lumira_root_directory>/utilities/SparkJDBC directory.
  7. Create a new connection to Vora using “Query with SQL”.
  8. Choose the generic driver that was just installed:
     a. Enter anything for the username and password (they won’t be used).
     b. Set the JDBC class to simba.spark.jdbc4.Driver.
     c. Define the JDBC connection string as follows:


jdbc:spark://<Thriftserver_FQDN_Hostname>:<ThirftserverPort>/default;CatalogSchemaSwitch=0;UseNativeQuery=1;AuthMech=1;KrbAuthType=1;KrbHostFQDN=<Thriftserver_FQDN_Hostname>;KrbRealm=<Kerberos Realm>;KrbServiceName=<Vora service principal used by the thriftserver>;
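For illustration only, the connection string above can also be assembled programmatically. This is a hypothetical JavaScript sketch — the function and parameter names are made up, and only the option keys come from the string shown above:

```javascript
// Sketch: assemble the Vora/Spark JDBC URL from the placeholder values above.
// The names (buildVoraJdbcUrl, host, port, realm, serviceName) are illustrative,
// not part of any official API; the option keys mirror the connection string.
function buildVoraJdbcUrl({ host, port, realm, serviceName }) {
  const options = {
    CatalogSchemaSwitch: 0,
    UseNativeQuery: 1,
    AuthMech: 1,       // Kerberos authentication
    KrbAuthType: 1,
    KrbHostFQDN: host, // Thriftserver FQDN
    KrbRealm: realm,
    KrbServiceName: serviceName,
  };
  const optionString = Object.entries(options)
    .map(([key, value]) => `${key}=${value}`)
    .join(";");
  return `jdbc:spark://${host}:${port}/default;${optionString};`;
}
```

Keeping the options in one object makes it easy to see which parts of the string are fixed and which are environment-specific.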


  9. In the Advanced settings, set the JDBC driver property:


  10. Connect with these settings and navigate the hierarchy CATALOG_VIEW -> Spark -> Default.

Note that only in-memory tables registered using the syntax REGISTER TABLE <tableName> USING com.sap.spark.vora via Vora Tools or beeline will be visible here.


  11. When selecting a table, the pre-generated SQL string will not be Vora syntax compatible and must be manually rewritten by removing the “Spark” prefix and all wrapper double quotes.

For example:
SELECT * FROM default.v2Table
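The rewrite in the last step is mechanical, so it can be sketched as a small helper. This is an illustrative snippet, not part of Lumira, and it assumes the generated SQL uses double quotes only as identifier wrappers:

```javascript
// Sketch: normalize pre-generated Spark SQL into Vora-compatible syntax by
// dropping the "Spark" catalog prefix and all wrapper double quotes, as
// described in step 11. Illustrative only — not a Lumira feature.
function toVoraSql(generatedSql) {
  return generatedSql
    .replace(/"Spark"\./g, "") // drop the "Spark" catalog prefix
    .replace(/"/g, "");        // strip the remaining wrapper double quotes
}

// toVoraSql('SELECT * FROM "Spark"."default"."v2Table"')
//   → 'SELECT * FROM default.v2Table'
```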



My name is Manuel Blechschmidt, and I will be the speaker at the SAP TechEd session “Optimizing an SAP Fiori Application Based on a Real World Example”.

Who are you?

I am currently working for FarmFacts GmbH. I studied IT systems engineering at the Hasso-Plattner-Institut in Potsdam and have since worked in the eCommerce and agriculture sectors.

What do you do there?

I am the team lead for a cloud software product that has been in development since 2015. It is a global digital farming platform.

Which real world application are you describing?

The software we are working on is https://www.nextfarming.de/. It uses a Java EE application server and an OpenUI5 front end. For geospatial data, we integrated OpenUI5 with OpenLayers.

You can find some more information about the server here: http://www.adam-bien.com/roller/abien/entry/satellites_iot_machine_tracks_or

Here is a screenshot of the front end:

What is different between your application and other SAP Fiori applications?

We are running our OpenUI5 app on Java EE 7, so we can use all the Java and JavaScript build tools. Furthermore, geospatial data is a big topic for us, and it comes with its own challenges.

What is most important for performance optimization?

The most important thing is measuring the performance before optimizing anything.

We found multiple unexpected things, like slow regular expression parsing in Internet Explorer or unfiltered binding loading.

If you don’t have a profiler, you can use the built-in measuring tool of OpenUI5.

Here you can see a screenshot of how it is possible to measure with Chrome:

Source: Prepared by Manuel Blechschmidt based on blueprint and Google Chrome Developers

Where are currently the performance boosters for a certain app?

In our case the biggest booster was reducing HTTP requests, e.g. by bundling all files into a Component-preload.js file or using ODataModel batch requests. Next, all requests should be asynchronous, and finally we discovered some possible enhancements in the rendering behavior of OpenUI5. Our custom controls are built for showing many thousands of data objects from the server.

Here you can see a sketch of how the rendering of OpenUI5 works. To get high performance, you have to apply partial rendering techniques like virtual lists. whzz from the OpenUI5 Slack channel gave us that tip.

Source: Prepared by Manuel Blechschmidt with Signavio FMC Stencils based on OpenUI5 source
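The virtual-list idea boils down to one window calculation: given the scroll position, only the rows inside (and just around) the viewport are rendered. The following is an illustrative sketch of that calculation, not OpenUI5 API:

```javascript
// Sketch of the core of a virtual list: compute which row indices need to be
// in the DOM for the current scroll position. All names and the overscan
// default are illustrative.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 2) {
  // First row that should exist, padded by a small overscan above the viewport.
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  // How many rows fit in the viewport, padded above and below.
  const visibleCount = Math.ceil(viewportHeight / rowHeight) + 2 * overscan;
  const last = Math.min(totalRows - 1, first + visibleCount - 1);
  return { first, last }; // render only rows [first..last]
}

// With 100,000 rows of height 20px in a 400px viewport, only ~24 rows are
// rendered at a time instead of all 100,000.
```

Everything outside `[first..last]` is represented only by spacer height, which is what keeps rendering cost independent of the total row count.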


In the session we will define a workflow for performance optimization, explore the different techniques for measuring and show some concrete examples for solving common problems.

Join me at this SAP Community session at Las Vegas.

Release 2.0 is a major release of the SAP Cloud Platform Predictive service because, in addition to the Business services, it includes the Predictive Analytics Integrator (PAI) as a new REST API.

I invite you to visit the What’s New blog on the Starter Kit page of the Predictive service to learn more about the new features.

If this is the first time you are deploying the Predictive service, or if you already use a release 1.x, here are some differences (marked in blue) from the previous releases.

In the menu “Services”, click on the tile of the Predictive services and enable them if this is not already done from a previous installation.

Click on the “Go to service” link. A new page appears and a pop-up alerts you that a new release is available. Click the “OK” button and then the “New Version Available” tile. Then enter your credentials (user/password), click the “Deploy” button, and click “OK” in the confirmation dialog.

Release 2.0 of the SAP Cloud Platform Predictive service is then deployed on your SAP Cloud Platform account. At the end, a URL is displayed. This is the URL of the Java application of the Predictive service. Click this URL and you will see that the services have restarted.

Go to the main menu, click on “Database Systems”, and then on your database. Then click the “Check for updates” button. SAP Cloud Platform searches for updates that can be applied to the database. I suggest you select the most recent SAP HANA version. For example, on my account, it is target version

New in release 2.0 – At the bottom of the list, there is a new row named “SAP Predictive Analytics APL”, whose target version value is “SKIP”. If you want to use the Predictive service and the Predictive Analytics Integrator (PAI), choose a version instead; this will install APL and PAI on your SAP HANA database.

Click the “Continue” button to proceed. After a while, these components are installed and the database is restarted.

Follow instructions to assign roles to users and to create a database user.

New in release 2.0 – Once your database user is created, to use the PAI REST API, in addition to the role “ap.pa.apl.base.roles::APL_EXECUTE”, you must also assign the new role “sap.hana.pai::ExecutePAI”. The easiest way is to launch the “SAP HANA Web-based Development Workbench”, click on the “Security” tile, and add this role to your technical user.

The last step is to create the data binding between the Predictive service and SAP HANA. To check that the Predictive service is ready to be used, click on the link of the Java application of the Predictive service, then click the “Administration” tile. The “Status” tile should contain a green check mark, and under the “About” tile you should read “Implementation -Version 2.0.0”.

This means that you can now include Business services and/or Predictive Analytics Integrator within your SAP Cloud Platform applications.



Strengthen the adoption of your enterprise resource planning (ERP) application in the cloud. Drive your digital transformation with change management and training delivered through SAP Live Class, provided by the SAP Education organization.

Since 2017, customers can complete their SAP learning units through SAP Live Class to get ready for the cloud migration. The trainer and all participants can see and talk to each other via video tiles. This combines the flexibility of virtual learning with the interactive nature of classroom training.

 Learning on the same level

SAP Live Class combines the best of both worlds. The solution was developed by SAP and the education company alfatraining. For the first time, employees can learn on their own computers and feel as if they were in a classroom, because, unlike in ordinary online learning, they are linked together by a lip-synchronous webcam livestream.

Group discussions in real time enable a completely new training experience. Learners can contact their trainer at any time to ask questions or ask for help. The trainer can give immediate feedback in any situation – “live”, just as in classroom training.

If necessary, separate rooms can be opened for individual calls. In addition, the trainer can connect to a student’s computer and support them directly, provided the participant gives express permission. Content can also be accessed and repeated at any time via the training platform.

Our innovative delivery method, SAP Live Class, optimally supports customers’ migration to the cloud. Customers do not have to travel for the necessary training sessions and can benefit from the training from anywhere.

With SAP Live Class, the existing learning offering is not only faster and easier to deliver; it also allows new modules to be added. For the first time, it is also worthwhile to offer smaller learning units, for example on the innovations in a cloud update.

Your advantages

  • Saving resources: No travel expenses or absences required.
  • Learning effectiveness: Fast and sustainable training success.
  • Live and lip-synchronous: Participants see and talk to each other via camera and microphone.
  • Group training: The team is trained together; as required, separate rooms can be opened for individual discussions.
  • Closeness: In the digital classroom, knowledge becomes an experience.

Contact your SAP Education office for more details about our innovative delivery method SAP Live Class.


(e.g. education.germany@sap.com, education.americas@sap.com etc.)

Don’t miss the upcoming live knowledge transfer webinar about SAP Single Sign-On! Our expert Christian Cohrs, Area Product Owner SAP Single Sign-On, SAP SE, will present the latest developments and what’s ahead.

SAP Single Sign-On is the de facto standard for secure authentication in on-premise SAP landscapes, where it delivers both increased security and higher productivity. In the webinar, we will present and demonstrate both the typical single sign-on scenarios including multi-factor authentication and the latest enhancements of the current product version 3.0 Support Package 2. These enhancements include automated management of server-side certificates, digital signatures for browser applications, and new capabilities on macOS and iOS.

Key takeaways:

  • How to decide on the right technology for single sign-on in your system landscape
  • How to enable strong authentication and find the right balance with fast access
  • How to manage server-side security capabilities efficiently

The webinar will take place on July 4, 2017, 10-11 am CET.

For dial-in information, please visit the International Focus Group for SAP Security, Data Protection & Privacy site. Click on “SAP Single Sign-On” and then on “Meeting Request” to find the dial-in details. Mark your calendars now!

In case you cannot attend, the webinar will be recorded and made available at the site mentioned above.

For more information about SAP Single Sign-On, please also visit our community page here: https://www.sap.com/community/topic/sso.html.


This course is designed to provide an in-depth understanding of the implementation of the different modules of the SAP Hybris Marketing Cloud solution. The course aims not only to showcase the underlying technical concepts but also to show how to enhance the standard functionalities.

Multiple hands-on exercises are used to explore:

  • Configuration and enhancement options in the standard solution
  • Integration of SAP Hybris Marketing Cloud with SAP Hybris Commerce and SAP Hybris Cloud for Customer

Date & Time:

Day 1

12 July 2017: 09:30AM CET – 04:30PM CET

  • Overview – Introduction to SAP Hybris Marketing Cloud
  • Unit 1
    • Consumer & Customer Profiling
    • Segmentation
    • Predictive Models and Scores

Day 2

13 July 2017: 08:30AM CET – 04:30PM CET

  • Unit 2
    • Business Administration Tasks and Data Upload
    • Data Upload
  • Unit 3 – Marketing Campaign Scenarios
  • Unit 4 – Marketing Resource Management
  • Unit 5 – Commerce Marketing – Recommendations

Day 3

14 July 2017: 08:30AM CET – 04:30PM CET

  • Unit 6 – Marketing Analytics
  • Unit 7 – Lead Management and Integration of SAP Hybris Cloud for Customer
  • Unit 8 – Other Integration Scenarios
    • Loyalty, Profile & Customer Journey Manager
  • Unit 9 – Best Practices for SAP Hybris Marketing Cloud
    • Activate Framework & Best Practices

Presenter: Tim Nusch



Dietmar-Hopp-Allee 20
69190 Walldorf


Room CE.02

See map: https://training.sap.com/shop/training-locations/DE/walldorf

Language: German


Best regards,

Your SAP Hybris Partner Team

SAP Activate is the leading methodology for project implementations for partners, system integrators, and customers. It is highly acclaimed for its simplified processes, easy-to-follow templates, Agile methodologies, time- and cost-saving accelerators, and guided step-by-step approaches. All of these elements combined help customers and project teams achieve a successful go-live and a faster time to value.

System integrators have already named SAP Activate as their chosen methodology for project implementations, and customers agree! Check out the feedback from our top partners below.


SAP Activate has a dedicated Jam group, SAP Methodologies, which consists of approximately 18,900 members. The community allows users to collaborate and engage with one another, get expert feedback, and understand the latest updates to the methodology. Our new structure allows our members to understand SAP Activate, with its one methodology and six phases, in a storyboard format. In addition to the SAP Methodologies parent group there are two sub-groups, SAP Activate for Cloud and SAP Activate for On-Premise, which offer tailored roadmap and methodology content.

Top reasons why you should join our community:

  • Learn from our experts on various SAP Activate topics, gain key insights, and stay up-to-date on the latest releases
  • Webinars and events outlining the key topics which help organizations during project phases
  • Interactive collaboration with users regarding content, questions, and updates
  • Free access to the latest content to help organizations with their project implementations
  • Mobile friendly, on the go, 24/7 access to view SAP Activate content
  • Structured toolkit focused on SAP Activate courses and a certification to help you understand the methodology

If you are already a member of our SAP Activate Jam groups, click here. To become a member of our SAP Jam groups or to invite your colleagues, visit this link: http://bit.ly/SAPActivate

Check out our previous blog for key highlights from our SAP Activate Jam group refresh!

In my previous blog, I showed how to gain some quick wins from the SAP Readiness Check for SAP S/4HANA, given that you had already run the Custom Code Analyzer and the Suite on HANA memory sizing report.

In this blog, I would like to go into more detail and cover potential stumbling blocks of the SAP Readiness Check for SAP S/4HANA. To start with, it is important to understand how the communication works between your Solution Manager and the system you would like to analyse, the so-called managed system. In Solution Manager, you get this information from the Technical System that represents the system you would like to analyse:

In my case, there are two RFC destinations from my Solution Manager to my managed system:

  • SM_S02CLNT001_READ with user SM_S01
  • SM_S02CLNT001_TMW with user SMTMS01

As it turns out, the SAP Readiness Check for SAP S/4HANA uses the latter, so I must ensure that this user has the correct authorizations; otherwise, you will receive an error message like the following when trying to run it:

There is also one RFC destination back from my managed system to my Solution Manager:

  • SM_S01CLNT001_BACK with user SMB_02

I must maintain this destination in maintenance view BCOS_CUST in transaction SM30:

With that I am prepared and able to run the different reports in my managed system:

Custom Code Analyzer

Implementing and running the Custom Code Analyzer is straightforward, but program SAPRSEUB, which updates the where-used index for customer and SAP objects, might run for quite some time.

Suite on HANA memory Sizing report

Here it is important not to start program /SDF/HDB_SIZING from transaction SE38 but to schedule it as a background job via function module /SDF/TRIGGER_HDB_SIZING in transaction SE37:

Thus, you can monitor the progress of job /SDF/HDB_SIZING_SM in transaction SM37. As it turns out, the SAP Readiness Check for SAP S/4HANA checks for the successful execution of this job:

Simplification Item Check

Creating the Simplification Item List is straightforward and quite quick:

Business Process Improvement Content

For this, ST-A/PI release 01S or higher is required. If you ignore this precondition, the current version of the SAP Readiness Check for SAP S/4HANA does not even give you an error message but a core dump, because there is a bug in the call of the error message.

SAP Readiness Check for SAP S/4HANA

If you have not implemented Usage and Procedure Logging (UPL) you will get a warning that is safe to ignore:

Finally, the SAP Readiness Check for SAP S/4HANA is on its way and you can monitor its progress in transaction SM37 on the managed system:

Afterwards, you will get an e-mail with a link to your results in the SAP ONE Support Launchpad:

Also, a corresponding transaction will have been created in the Maintenance Planner, in case you want to proceed with the conversion.