
Migrating employee photos from SAP HR to SuccessFactors can be done in multiple ways.

  1. Bulk download photos from SAP and upload to SuccessFactors
  2. Use the OData API
  3. Link to photos hosted on a different web server

Each method has its advantages and disadvantages, and the best approach depends on the customer's requirements.

In this article, we will look at approach 1, which downloads all the photos available in SAP and uploads them to SuccessFactors. The details and prerequisites are explained well in SAP Note 2094242.


Bulk photo upload through SuccessFactors SFTP is available only to Enterprise Subscription clients, not for the Professional Edition, at the time of this writing. Please read SAP Note 2094242 for updated information.


  • Copy and create program ZMIGRATE_EMP_PHOTOS_SFSF in your production system
  • Execute it for the desired selection of employees. The program supports both online and batch modes
  • The program output contains the following
    • Downloaded employee photos
    • The CSV metadata file necessary for the batch upload via SFTP
    • Failures, if any, in an ALV list
  • Once the photos and the metadata CSV file are downloaded, upload them to your SuccessFactors SFTP folder under /photos. Reach out to Cloud Product Support in case of any issues.
  • Once the upload is processed, the SuccessFactors system will send an email with the job results.
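For orientation, the metadata file is a simple two-column CSV pairing each SuccessFactors username with the uploaded photo file name; the header line 'Username,Filename' is what the program writes. The rows below are made-up examples; verify the exact layout required by your instance against SAP Note 2094242.

```text
Username,Filename
jdoe,00001234.jpg
msmith,00005678.jpg
```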

It is also possible to further modify this program to support option 3 using FM SCMS_AO_URL_READ. This FM returns the URLs of the photos; this approach also needs further configuration with respect to cross-domain access. Please look at CL_HIER_VIS_PHOTO_URL->GET_PHOTO_URLS if you would like to explore this.

Comments and suggestions are welcome. I hope this adds to your SFSF migration arsenal.

*& Program to Migrate Employee Photos from SAP to SuccessFactors
*& Photo download based on FM PAD_PHOTO_UPDATE_GET_DETAIL
*& Metadata file and Photo signatures based on SAP note 2094242

REPORT zhtr_emp_photo_dwnld.
* Assumption: the report uses logical database PNPCE (assigned in the report
* attributes), which supplies the PERAS node and the period pn-begda/pn-endda.
DATA: lt_connections TYPE TABLE OF toav0.
DATA: ls_connections TYPE toav0.
DATA: lv_doc_type TYPE toadd-doc_type.
DATA: ls_return TYPE bapiret2.
DATA: ls_employee_number TYPE toav0-object_id.

DATA: lc_update_start_date TYPE sy-datum VALUE '18000101'.
DATA: lc_update_end_date TYPE sy-datum VALUE '99991231'.

DATA: lt_photo_archive TYPE TABLE OF  tbl1024.
DATA: ls_photo_archive TYPE tbl1024.
DATA: lt_messages TYPE bapiret2_tab.
DATA: lv_filename TYPE string.
DATA: lt_out_file TYPE TABLE OF string.
DATA: lv_out_file TYPE string.

NODES: peras.
INFOTYPES: 0105.

DATA: BEGIN OF gt_pernr OCCURS 0,
        pernr LIKE peras-pernr,
        uname LIKE p0105-usrid,
      END OF gt_pernr.

PARAMETERS: p_file TYPE rlgrap-filename.

GET peras.
* Collect the Employees based on selection
  rp_provide_from_last p0105 '0001' pn-begda pn-endda.
  gt_pernr-uname = p0105-usrid.
  gt_pernr-pernr = peras-pernr.
  APPEND gt_pernr.

END-OF-SELECTION.
  lv_out_file = 'Username,Filename'.
  APPEND lv_out_file TO lt_out_file.
* Prepare and download files
  PERFORM get_and_store.
* Display errors
  PERFORM display_alv.

*&      Form  get_and_store
*       Retrieve Photos and download files
FORM get_and_store.
  LOOP AT gt_pernr.
    CLEAR: ls_employee_number, ls_return, ls_connections.
    REFRESH: lt_connections, lt_photo_archive.
    CONCATENATE gt_pernr-pernr '%' INTO ls_employee_number.

* Function module to read the ArchiveLink connections (content repository
* information and archive document ID) for an employee's photos.
* FM name inferred from the parameter list and the TOAV0 connection table.
    CALL FUNCTION 'ARCHIV_GET_CONNECTIONS'
      EXPORTING
        object_id     = ls_employee_number
        documenttype  = 'HRICOLFOTO'
        from_ar_date  = lc_update_start_date
        until_ar_date = lc_update_end_date
      TABLES
        connections   = lt_connections
      EXCEPTIONS
        nothing_found = 1
        OTHERS        = 2.

* If no photo exists, generate a message and skip this employee
    IF sy-subrc <> 0.
      CALL FUNCTION 'BALW_BAPIRETURN_GET2'  " FM name inferred; fills a BAPIRET2
        EXPORTING
          type   = 'I'
          cl     = 'HR_SE_PA'
          number = '001'
        IMPORTING
          return = ls_return.
      ls_return-message_v1 = gt_pernr-pernr.
      APPEND ls_return TO lt_messages.
      CONTINUE.
    ENDIF.

* Sort the table to get the most recent Entry
    SORT lt_connections BY ar_date DESCENDING.
    READ TABLE lt_connections INTO ls_connections INDEX 1.

* Function module to get the binary MIME object of the photo from the
* content repository. FM name inferred from the parameter list.
    lv_doc_type = ls_connections-reserve. " document class, e.g. JPG (assumption)
    CALL FUNCTION 'ARCHIVOBJECT_GET_TABLE'
      EXPORTING
        archiv_id                = ls_connections-archiv_id
        document_type            = lv_doc_type
        archiv_doc_id            = ls_connections-arc_doc_id
      TABLES
        binarchivobject          = lt_photo_archive
      EXCEPTIONS
        error_archiv             = 1
        error_communicationtable = 2
        error_kernel             = 3
        OTHERS                   = 4.

* If no MIME object exists, generate a message and skip this employee
    IF sy-subrc <> 0.
      CALL FUNCTION 'BALW_BAPIRETURN_GET2'
        EXPORTING
          type   = 'E'
          cl     = 'HR_SE_PA'
          number = '000'
        IMPORTING
          return = ls_return.
      ls_return-message_v1 = gt_pernr-pernr.
      APPEND ls_return TO lt_messages.
      CONTINUE.
    ENDIF.

    CLEAR lv_filename.
    lv_filename = |{ p_file }{ gt_pernr-pernr }.jpg|.
* One CSV line per employee: SuccessFactors username, photo file name
    lv_out_file = |{ gt_pernr-uname },{ gt_pernr-pernr }.jpg|.
    APPEND lv_out_file TO lt_out_file.

    IF sy-batch IS INITIAL.
* Online mode: download the photo to the front end
      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename = lv_filename
          filetype = 'BIN'
        TABLES
          data_tab = lt_photo_archive
        EXCEPTIONS
          OTHERS   = 1.
    ELSE.
* Batch mode: write the photo to the application server
      TRY.
          OPEN DATASET lv_filename FOR OUTPUT IN BINARY MODE.
          IF sy-subrc = 0.
            LOOP AT lt_photo_archive INTO ls_photo_archive.
              TRANSFER ls_photo_archive TO lv_filename.
            ENDLOOP.
          ELSE.
            ls_return-message    = 'Failure-Unable to open the path'.
            ls_return-message_v1 = gt_pernr-pernr.
            ls_return-message_v2 = lv_filename.
            APPEND ls_return TO lt_messages.
          ENDIF.
          CLOSE DATASET lv_filename.
        CATCH cx_sy_file_authority.
          ls_return-message    = 'Failure-Authorisation to file location'.
          ls_return-message_v1 = gt_pernr-pernr.
          ls_return-message_v2 = lv_filename.
          APPEND ls_return TO lt_messages.
      ENDTRY.
    ENDIF.
    CLEAR lv_filename.
  ENDLOOP.

  PERFORM meta_file_create.
ENDFORM.                    "get_and_store

*&      Form  display_alv
*       Display Errors in List
FORM display_alv .
  DATA functions TYPE REF TO cl_salv_functions_list.
  DATA: gr_table TYPE REF TO cl_salv_table.
*... Create Instance
  TRY.
      cl_salv_table=>factory(
        IMPORTING
          r_salv_table = gr_table
        CHANGING
          t_table      = lt_messages ).
    CATCH cx_salv_msg.
      RETURN.
  ENDTRY.

  functions = gr_table->get_functions( ).
  functions->set_all( ).
*... Display Table
  gr_table->display( ).

ENDFORM.                    "display_alv
*&      Form  meta_file_create
*       SFSF needs Metadata file linking user names to photos
*       This Metadata file is stored here
FORM meta_file_create.
  lv_filename = |{ p_file }employee_photos_{ sy-datum }.csv|.
  IF sy-batch IS INITIAL.
* Online mode: download the CSV to the front end
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = lv_filename
      TABLES
        data_tab = lt_out_file
      EXCEPTIONS
        OTHERS   = 1.
  ELSE.
* Batch mode: write the CSV to the application server
    TRY.
        OPEN DATASET lv_filename FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
        IF sy-subrc = 0.
          LOOP AT lt_out_file INTO lv_out_file.
            TRANSFER lv_out_file TO lv_filename.
          ENDLOOP.
        ELSE.
          ls_return-message    = 'Failure-Unable to open the path'.
          ls_return-message_v1 = lv_filename.
          APPEND ls_return TO lt_messages.
        ENDIF.
        CLOSE DATASET lv_filename.
      CATCH cx_sy_file_authority.
        ls_return-message    = 'Failure-Authorisation to file location'.
        ls_return-message_v1 = lv_filename.
        APPEND ls_return TO lt_messages.
    ENDTRY.
  ENDIF.
ENDFORM.                    "meta_file_create

Accelerate your business to comply with IFRS 15 regulations using the SAP Revenue Accounting and Reporting (RAR) Model Company solution. The two leading accounting standard bodies, the IASB (International Accounting Standards Board) and the FASB (Financial Accounting Standards Board), jointly released the new IFRS regulation (IFRS 15) on 28 May 2014, effective from 1 January 2018. This new revenue recognition regulation affects many customers globally.
Understand the IFRS 15 accounting concept through the 5-step model, based on a business use case. Afterwards, watch an end-to-end demo scenario built on the RAR Model Company setup to understand the Revenue Accounting and Reporting (RAR) solution in SAP ERP Central Component (SAP ECC) 6.0, which will help your business comply with the new IFRS 15 accounting standard.

This webinar will demo an end-to-end scenario: starting with a sales order in SAP SD (and sales orders created through Excel upload), processing the order items into RAR for revenue recognition and the Right of Return functionality, and finally posting in the SAP RAR solution.

Click here to register for the Expert Session @ SAP Enterprise Support Academy (valid SAP ID required).

This expert session is provided to you by SAP Global CoE as part of SAP MaxAttention Expert Series, in collaboration with SAP Enterprise Support Academy.

What is MaxAttention Expert Series?

MaxAttention Expert Series is a platform for SAP MaxAttention customers to exchange success stories, share innovative ideas, and learn from MaxAttention peers and experts in SAP’s Global Center of Expertise (CoE).

The CoE, under the umbrella of MaxAttention Next Generation, drives innovation for SAP customers by implementing new business models and overcoming operational IT barriers.

Stay tuned for the upcoming live expert sessions! These will be posted on SCN and tagged with “MaxAttention Expert Series”. The webinars will be delivered through SAP Enterprise Support Academy.

In order to register for the upcoming Webinars, you need to follow the registration link at the end of the Event description. Upon registration, you will receive web conference data via email from SAP Enterprise Support Academy.

All Webinars will be recorded and made available for offline replay. The replay link for the session will be posted on this page once it becomes available.

If you are interested in learning more about a certain innovation, or even in applying it in your company, please contact SAP Innovation Zone COENA.
We look forward to serving you!

SAP Global CoE

I have been using the sap.ui.core.message.MessageManager for a while now. What I really like is the ease with which it handles messages raised on the client side and the server side in almost the same way. One of the features of the Message Manager I especially like is the ability to link a message to a target field and draw the user's attention to it.

In this blog I will demonstrate how to trigger and show back-end BAPI Warning, Error, Information and Success messages without throwing exceptions.


The GIF below shows an example of server-side messaging, initially showing messages included in the HTTP header, and after that showing messages in the response body as part of a business exception.

Try the app for yourself




There are numerous use cases, the one that is front and centre in my mind is a BAPI simulate scenario.

Your application allows the user to change multiple lines and post the changes in a batch request. The changes are processed through a BAPI, and you wish to inform the user of warnings and information relating to the data changes before committing.



When processing BAPI messages in our Gateway service, we can choose to add the messages to the header property “sap-message”, which sends an HTTP 204 response telling the app the call was successful; or we can throw an exception, which sends an HTTP 400 response with the messages in the response body, and the call will of course result in a failure.

LOOP AT lt_return ASSIGNING <fs_return>.
  " Add message from the BAPI return structure
  mo_context->get_message_container( )->add_message_from_bapi(
    EXPORTING
      is_bapi_message           = <fs_return>
      iv_entity_type            = iv_entity_set_name
      it_key_tab                = VALUE /iwbep/t_mgw_name_value_pair(
                                    ( name = 'KEY1' value = er_entity-key1 ) )
      iv_add_to_response_header = boolc( er_entity-throw_exception = abap_false )
      iv_message_target         = CONV string( <fs_return>-field ) ).
ENDLOOP.

IF er_entity-throw_exception = abap_true.
  RAISE EXCEPTION TYPE /iwbep/cx_mgw_busi_exception
    EXPORTING
      message_container = mo_context->get_message_container( ).
ENDIF.

In the code above, if we don’t need to throw an exception, we set IV_ADD_TO_RESPONSE_HEADER to true.

Example messages in the header response

Example messages in the body response

To get the messages to show inline in our input fields, we can use IV_MESSAGE_TARGET; in this field we put the binding context path for the field.

We can get the path value easily using the UI5 Chrome extension: “contextPath” + “propertyPath”.


In our Gateway service, we can derive the context path from the request URI, as shown in the picture below.

Note: we can only map IV_MESSAGE_TARGET for errors and warnings; an assertion in SAPUI5 will fail if we try to map it for information or success messages (which kind of makes sense).

UI5 Code

In your UI5 app, all you have to do is ensure the following is in your manifest.json file

"handleValidation": true,

I wrote some code for the creation of header and body messages for mockserver requests, handy if you want to test server-side field validation.
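As a sketch of the idea (the helper name and message shape here are my own, not a standard mockserver API): the “sap-message” header carries one outer message, with any additional messages nested in its `details` array, which is the format the Message Manager parses.

```javascript
// Sketch: build the value of the "sap-message" response header for a
// mockserver response. The first message becomes the outer object; the
// remaining messages go into its "details" array.
function buildSapMessageHeader(messages) {
  var outer = Object.assign({}, messages[0]);
  outer.details = messages.slice(1);
  return JSON.stringify(outer);
}

// Hypothetical usage inside a mockserver request handler:
var header = buildSapMessageHeader([
  { code: "HR_SE_PA/001", message: "Check the start date",
    severity: "warning", target: "/EmployeeSet('1')/StartDate" },
  { code: "HR_SE_PA/002", message: "Record saved",
    severity: "info", target: "" }
]);
// The header would then be attached to the simulated response, e.g.
// oXhr.respond(200, { "sap-message": header }, sResponseBody);
```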




Hi all,

A few months ago I posted a blog about a SAP Web IDE plugin that I created to generate custom UI5 controls. In that blog I also showed how to create a custom control using this plugin. You can find the blog here: https://blogs.sap.com/2016/11/27/custom-ui5-control-generator-sap-web-ide-plugin/

The plugin is also available in the SAP App Center and can be used free of charge! https://www.sapappcenter.com/p/20553/custom-ui5-control-generator–sap-web-ide-plugin–flexso-nv

At the moment the plugin is very basic and can only be used for creating basic controls; building advanced UI5 controls is not yet supported. Still, it can speed up your development.

In this blog I’ll show you how to create UI5 controls that can be used together and how my plugin can help you with it.

As an example I started from an HTML snippet that I’ve found on the web:


More HTML snippets on https://designshack.net/articles/css/5-simple-and-practical-css-list-styles-you-can-copy-and-paste/

I’m going to convert this HTML snippet into a UI5 control. Therefore I have to split the HTML in two parts:

  • One control for the list item: an item can be used any number of times in a list. This should be generic, so it requires a separate UI5 control
  • One control for the list: this is where the items will be added. It is a separate control with a specific CSS class for the design

First I’ll start with creating the controls, then I’ll connect the two controls and last but not least I’ll use them in my view with bindings.

UI5 List Item Control

Okay, let’s start with the item and create a file “CustomListItem.js” in the folder “control” (you have to create this folder).


I copy the HTML for one item of the list out of the snippet, which looks like this:

<li><a href="#"><img src="http://placehold.it/150x150" /></a></li>

Put it in the Generator plugin:

Hit the convert button, use the Web IDE beautify function, and change the namespace according to your project namespace and file name.

UI5 List Control

Now create a JavaScript file for the list in the control folder.

Now I take the outer tags of the list from the snippet:

<div class="customlist">

Again put it in the generator:

Let the converter do its job, then run the beautifier in the Web IDE and change the namespace and control name.

Connect the two controls

I want to use the List Item control in my List control. I’ll need to add some code manually. This part is not (yet) foreseen in the control generator. In the metadata of the list control I’m going to add an aggregation “items”. The definition of the aggregation contains following attributes:

  • Type: This is the type of the controls that you can use in the aggregation. We want to use our CustomListItem control in this CustomList. We use the namespace + the name of the custom list item to define the type of the aggregation.
  • Multiple: Defines how many times the child control can be used in the CustomList
  • singularName: a reference to one item of the aggregation

Next step is to integrate the aggregation “items” in the renderer function of the list control. Therefore I’ve added a loop ($.each) to render all items in the aggregations.
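The render loop can be sketched in plain JavaScript as follows. Here `rm` stands in for the UI5 RenderManager and `control` for the CustomList instance; the sap.ui.define boilerplate is omitted, and the real generated control uses $.each rather than forEach.

```javascript
// Sketch of the CustomList renderer: open the list markup, render each
// control in the "items" aggregation, then close the markup.
function renderCustomList(rm, control) {
  rm.write('<div class="customlist">');
  rm.write('<ul>');
  control.getItems().forEach(function (item) {
    rm.renderControl(item); // each child is a CustomListItem
  });
  rm.write('</ul>');
  rm.write('</div>');
}
```

In the real control this body lives in the renderer's `render` function, and `rm.renderControl` delegates to the CustomListItem's own renderer.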

I also include the css in my project. Just by pasting it in the “style.css”.

Use the control

First step to use the control in the view is by adding a reference to the  folder where the control is located and add a namespace to it (cust):

With the namespace (cust) I can now use my two controls together. First I open the tag for the “CustomList”. Between the “CustomList” tags I open the aggregation tags “items”. There I can then add the “CustomListItem” control. In the “CustomListItem” tags I add the attributes for the hyperlink (href1) and the image (src1).
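A sketch of what that view XML can look like (the cust namespace and exact control names depend on your project; href1/src1 are the attributes generated earlier):

```xml
<!-- xmlns:cust points at the project's "control" folder -->
<cust:CustomList>
    <cust:items>
        <cust:CustomListItem href1="#" src1="http://placehold.it/150x150"/>
        <cust:CustomListItem href1="#" src1="http://placehold.it/150x150"/>
    </cust:items>
</cust:CustomList>
```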

This is the result:

Use the control with bindings

This step is similar to the previous one but now I’m going to use bindings to show images and hyperlinks.

In the Controller I’ve created a JSONModel with multiple images and hyperlinks which I bind to the view.

In the view I add the binding for the items in the “CustomList”. “grid” is a reference to the array in the JSONModel.

“hyperlink” and “image” are references to the properties in the objects of the array “grid”.
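In sketch form, the bound version replaces the hard-coded items with an items binding and property bindings (again, namespace and names are project-specific):

```xml
<cust:CustomList items="{/grid}">
    <cust:items>
        <cust:CustomListItem href1="{hyperlink}" src1="{image}"/>
    </cust:items>
</cust:CustomList>
```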

This will look like this.

You can find all the code on github:


Preview of the result:



You can make some really cool stuff with custom controls!


Kind regards,


Learn how to set up the SLT User Based Archiving solution to allow archiving on a source system without impacting the data on HANA.

Click here to register for the Expert Session @ SAP Enterprise Support Academy (valid SAP ID required).

This expert session is provided to you by SAP Global CoE as part of SAP MaxAttention Expert Series, in collaboration with SAP Enterprise Support Academy.


This blog is focused on New Asset Accounting with the ledger approach in a multiple-currency environment. New Asset Accounting is the only Asset Accounting solution available in S/4 HANA; classic Asset Accounting is no longer available.

In view of the many questions coming in from customers and partners on this key innovation within Finance, part of the S/4 HANA simplification, I have covered the following key topics of S/4 HANA New Asset Accounting. We need to be very clear on these new requirements before starting a new or conversion S/4 HANA project.

1. Pre-requisite Business Functions

2. Data Structure Changes in Asset Accounting

3. New FI-AA-Integration with the Universal Journal Entry

4. Asset Accounting Parallel Valuation

5. Key Configuration Consideration in Ledger Approach

6. Why we use a technical clearing GL account

7. New Asset Accounting Posting Logic

8. FI-AA Legacy Data Transfer

9. Adjusting chart of Depreciation Prior to Conversion

10. Installing SFIN in Conversion/Migration Scenario


1. Pre-requisite Business Functions

Activate the following Business Functions


2. Data Structure Changes in Asset Accounting

  • Actual data previously stored in ANEP, ANEA, ANLP, and ANLC is now stored in table ACDOCA; ANEK data is stored in BKPF.
  • Compatibility views FAAV_<TABLENAME> (for example, FAAV_ANEK) are provided in order to reproduce the old structures.
  • Statistical data (for example, for tax purposes) previously stored in ANEP, ANEA, ANLP, ANLC is now stored in table FAAT_DOC_IT
  • Plan data previously stored in ANLP and ANLC is now stored in FAAT_PLAN_VALUES
  • Classic Asset Accounting is mostly transformed automatically into the New Asset Accounting by executing mandatory migration steps related to Asset Accounting.
  • Posting to different periods is possible (restriction: the beginning/end of the fiscal year needs to be equal); refer to OSS Notes 1951069 and 2220152.
  • The following table shows some obsolete and new Asset Accounting programs.

3. New FI-AA-Integration with the Universal Journal Entry

Asset Accounting is based on the universal journal entry. This means there is no longer any redundant data store; General Ledger Accounting and Asset Accounting are reconciled by design. Key changes are listed below:

  • There is no separate balance carryforward needed in Asset Accounting; the general balance carryforward transaction of FI (FAGLGVTR) transfers Asset Accounting balances by default.
  • The program Fixed Assets Fiscal Year Change (RAJAWE00, transaction AJRW) no longer has to be performed at fiscal year change.
  • Planned values are available in real time. Changes to master data and transaction data are constantly included.
  • The most current planned depreciation values are calculated automatically for the new year after performing the balance carryforward. The depreciation run posts the pre-calculated planned values.
  • The selection screen is simplified, as the “reasons for posting run” (planned depreciation run, repeat, restart, unplanned posting run) are no longer relevant.
  • Errors with individual assets do not necessarily need to be corrected before period-end closing; period-end closing can still be performed. You only have to make sure that all assets are corrected by the end of the year, so that depreciation can be posted completely.
  • All APC changes in Asset Accounting are posted to the general ledger in real time. Periodic APC postings are therefore no longer supported.
  • Transaction types restricted to depreciation areas are removed in new Asset Accounting; you can set the obsolete indicator in the definition of transaction types that were restricted to depreciation areas in classic Asset Accounting.

4. Asset Accounting Parallel Valuation

  • A very important part of new Asset Accounting is parallel valuation in a multi-currency environment.
  • The leading valuation can be recorded in any depreciation area; it is no longer necessary to use depreciation area 01 for this. The system now posts both the actual values of the leading valuation and the values of parallel valuations in real time. This means the posting of delta values has been replaced; as a result, delta depreciation areas are no longer required.
  • New Asset Accounting makes it possible to post in real time in all valuations (that is, for all accounting principles). You can track the postings of all valuations without having to take into account the postings of the leading valuation, as was partly the case in classic Asset Accounting.

5. Key Configuration Consideration in Ledger Approach

We need to answer some basic questions before configuring New Asset Accounting in the S/4 HANA environment, as they determine the minimum depreciation areas required to align FI with Asset Accounting, i.e.:

  • The required valuation approach

  • How many ledgers (leading + non-leading) exist or are to be configured

  • Which currencies are used in each of the ledgers

For example:

In this example we have one company code with two ledgers, 0L and N1, and these two ledgers use three currencies, i.e. 10, 30, and 40, as shown below.

The above mapping establishes the link between depreciation area, accounting principle, and currency.

Explaining with a ledger approach example: from release 1503, i.e. the initial version of the SAP Finance add-on in S/4 HANA, the new table ACDOCA stores asset values per ledger and per currency in real time, so there is no need for any reconciliation between Finance and Asset Accounting. To achieve this, it is essential to follow the guidelines when setting up depreciation areas and their respective currencies, which I have tried to explain with the example below:

Ledger and currency settings are made in the new General Ledger under the following SPRO node:

Financial Accounting (New)–> Financial Accounting Global Settings (New)–> Ledgers–> Ledger –> Define Settings for Ledgers and Currency Types

Define Depreciation Areas

Depreciation areas are defined as per the new FI-GL and FI-AA requirements. Here at least six depreciation areas are needed, so that each currency of each ledger can be represented in a separate depreciation area; each depreciation area is assigned to an accounting principle.

Specify Depreciation Area Type

Specify Transfer of APC Values

In this activity, you define transfer rules for the posting values of depreciation areas. These transfer rules let you ensure that certain depreciation areas have identical asset values.

Specify Transfer of Depreciation Terms

In this activity, you specify how the depreciation terms for a depreciation area are adopted from another depreciation area. You can specify if the adoption of values is optional or mandatory. If you specify an optional transfer, then you can change the proposed depreciation terms in the dependent areas in the asset master record. In the case of a mandatory transfer, you cannot maintain any depreciation terms in the asset master record. In this way, you can ensure that depreciation is uniform in certain depreciation areas.

Define Depreciation Areas for Foreign Currencies

For every additional currency type defined on the company code a corresponding depreciation area needs to be set up.

As explained in the previous step, we need to define the currency for each depreciation area. For example, if a company code has two ledgers, 0L and N1, with three currencies, then at least six depreciation areas should be set up and currencies assigned to each of them (the leading valuation depreciation area derives its currency from the company code currency).
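As an illustrative mapping for this example (the depreciation area numbers are examples, not fixed requirements):

```text
Ledger   Currency type          Depreciation area (example)
0L       10 (company code)      01 - leading valuation
0L       30                     32
0L       40                     33
N1       10                     60
N1       30                     62
N1       40                     63
```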

Specify the Use of Parallel Currencies

Here we need to specify the currency type for each of the depreciation areas, which aligns the FI currency types with the asset depreciation areas; ACDOCA is updated accordingly.

This setting ensures that all currency types are aligned with their respective depreciation areas and that asset values are updated in parallel with Financial Accounting for each currency.

6. Why we use a technical clearing GL account

The architecture has changed in that we now post a separate document in Asset Accounting for each valuation. On the asset side we perform accounting-principle-specific postings; technically, these are ledger-group-specific postings.

On the operational side (accounts receivable, accounts payable) the value is always the same for each accounting principle, so we have to perform postings that are valid for all accounting principles. Technically, we post without specifying a ledger group.

To split the business process into an operational and a valuating document, the “technical clearing account” for integrated asset acquisition was established.

  • For the operational part (vendor invoice/GRIR), the system posts a document valid for all accounting principles against the technical clearing account for integrated asset acquisitions. From a technical perspective, the system generates a ledger-group-independent document.


  • For each valuating part (asset posting with capitalization of the asset), the system generates a separate document that is valid only for the given accounting principle. This document is also posted against the technical clearing account for integrated asset acquisitions. From a technical perspective, the system generates ledger-group-specific documents.

Define account “Technical clearing account” for integrated asset acquisition.

Specify Alternative Document Type for Accounting Principle-Specific Documents

Here the operational document keeps the original document type used during entry, while the accounting-principle-specific documents are generated with document type AA.

7. New Asset Accounting Posting Logic

The operational entry document posts to the technical clearing account. It does not update the asset values; the asset data is only used to perform checks.

The accounting-principle-specific documents (1 to n) post to the technical clearing account in each view (balancing it to zero) and to the asset reconciliation account, updating the asset line items.
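As a simplified illustration of this posting logic for an integrated asset acquisition of 1,000 (account names are generic):

```text
Operational entry document (ledger-group independent):
  Dr  Technical clearing account - integrated asset acquisition   1,000
  Cr  Vendor (accounts payable)                                   1,000

Accounting-principle-specific document, one per valuation (e.g. IFRS, local GAAP):
  Dr  Asset reconciliation account (APC)                          1,000
  Cr  Technical clearing account - integrated asset acquisition   1,000
```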

Asset Acquisitions Operational Document

Asset Acquisitions Accounting Principle (IFRS)-specific Document

Asset Acquisitions Accounting Principle (LOCA)-specific Document

Universal Table updated with respective ledger (0L & N1) and currencies.

Correction of the asset acquisition value in a specific GAAP

Use Transaction code AB01L


8. FI-AA Legacy Data Transfer

  • You create asset master records for the legacy data transfer using transaction AS91.
  • You post the transfer values using transaction ABLDT; in doing so, a universal journal entry is posted for the fixed asset.
  • If wrong transfer values were posted, you must reverse the journal entry and then recreate it.
  • You can use transaction AS92 to change master data; transaction AS93 to display master data; and transaction AS94 to create sub numbers for the Asset master record.

Time of Legacy Asset Transfer

The transfer date is the cut-off date for the transfer of legacy data. The transfer will only include data up to this point in time. There are two possible scenarios.

  • The transfer date can be the end of the last closed fiscal year.

  • The transfer date can be in the fiscal year. This is called “transfer during the fiscal year”.

Scenario 1: Transfer Date is the End of the Last Closed Fiscal Year:

In this case, you do not need to include any posted depreciation or transactions in the transfer of legacy data. You only need to transfer master data and the cumulative values as of the end of the last closed fiscal year.

Scenario 2: Transfer During the Fiscal Year

Along with the general master data and the cumulative values from the start of the fiscal year (time period A), you must also transfer the following values:

  • Depreciation during the transfer year and Transactions during the transfer year
  • Include the depreciation posted in the legacy system since the end of the last closed fiscal year up to the date of transfer (time period B).
  • Any asset transactions in your legacy system that have a value date after the transfer date, but before the date of the physical transfer of data (time period C), need to be posted separately in the Asset Accounting component in any case.
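Schematically, the three time periods for a transfer during the fiscal year are:

```text
period A: cumulative values up to the end of the last closed fiscal year
period B: depreciation and transactions from the start of the transfer year
          up to the transfer date (supplied with the legacy transfer)
period C: transactions with a value date after the transfer date, posted
          directly in Asset Accounting
```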

Example of scenario 2 Legacy Data Transfer During the Fiscal year:-

Case: a legacy asset acquired on 01.01.2015 (a previous year) is taken over into the Simple Finance system mid-year in the current year (30.04.2017).

Specify Transfer Date/Last Closed Fiscal Year (V_T093C_08)

Specify Last Period Posted in Prv. System (Transf. During FY) (OAYC)

Step 1:- AS91 to create Legacy asset master data

Step 2:- ABLDT to update Legacy Original Acquisition Value/ Accumulated Depreciation and current year Depreciation Posted.

Step 3: Verify Legacy Asset Planned Value

Step 4: Verify Legacy posted Value

9. Adjusting Chart of Depreciation prior to Conversion

  • For the leading valuation of the ledger approach and the accounts approach, and for parallel valuations of the ledger approach, the parallel currencies in the leading ledger in General Ledger Accounting and in the depreciation areas in Asset Accounting must be the same, as explained in the ledger approach example above.
  • Using the migration program available under Migration Tools, you can automatically adjust the parameters in your charts of depreciation. If error messages appear stating that automatic adjustment is not possible, you have to adjust the charts of depreciation manually.
  • If until now you have been using parallel currencies in General Ledger Accounting, but you have not implemented the corresponding parallel currency areas in Asset Accounting for all depreciation areas, you must implement these areas in a separate project before you install SAP Simple Finance. In such a project, you must first perform the preparatory steps for creating depreciation areas in Customizing; you must then determine the new values for each fixed asset for a newly created depreciation area.
  • Company codes that are assigned to the same chart of depreciation are not allowed to differ in the number and type of the parallel currencies used in General Ledger Accounting.
  • Even if you migrate to SAP Accounting powered by SAP HANA from a system (e.g. EHP7) having FI-AA (new) already active, you still must migrate every active chart of depreciation.

10. Installing SFIN in Conversion/Migration Scenario

From the viewpoint of Asset Accounting, it is not necessary to install SAP Simple Finance at the end of the year or period. However, you must perform a complete period-end closing directly before you install SAP Simple Finance. Some of the important points you must consider with regard to new Asset Accounting are listed below (for details, you may refer to the conversion guide).

  • To check whether the outlined prerequisites are met, run the preliminary check program RASFIN_MIGR_PRECHECK. Import the current version of this program using SAP Note 1939592 before you install SAP Simple Finance in your system. Perform this check in all of your systems: in the Customizing system as well as in the downstream systems (test system and production system).
  • If until now you have updated transactions in parallel valuations with different fiscal year variants and want to continue using this update, you must implement a new representative ledger using the SAP General Ledger Migration Service before you install SAP Simple Finance. For more information about alternative fiscal year variants with parallel valuation, see SAP Note 2220152.
  • You must have performed periodic APC posting (RAPERB2000) completely; the timestamp must be current.
  • Execute the periodic depreciation posting run (RAPOST2000).
  • Run the program for recalculating depreciation (transaction AFAR).
  • Reconcile your general ledger with the Asset Accounting subsidiary ledger, both for your leading valuation and for parallel valuations.
  • The migration must take place at a time when only one fiscal year is open in Asset Accounting.
  • You can check which fiscal year is closed in your company code in Customizing for Asset Accounting (New) under Preparations for Going Live > Tools > Reset Year-End Closing.
  • Ensure that no further postings are made in your system after running the period-end transactions and before installing SAP S/4HANA Simple Finance; lock the users accordingly.
  • Perform a backup before installing SFIN
  • As soon as you have installed SAP Simple Finance, you can no longer post in Asset Accounting. To ensure that migration is successful, it is essential that you make sure that the prerequisites are met and a complete period-end closing was performed before you install SAP Simple Finance. Posting for new Asset Accounting is only possible again after you have completed the migration fully and successfully.
  • After completing the migration, make sure that no fiscal year that is before the migration is reopened in Asset Accounting.


Thanks a lot

Ajeet Agarwal



I will share here a different approach from the very old program that downloads SAP BW hierarchies to flat files.

That is usually done to keep maintenance of the SAP BW hierarchies in the development box and export/import them across the SAP BW systems.

In order to upload the file back to SAP BW we just need the Data Source based on Hierarchies. Nothing new here.

However, I came across an issue that happens very often with the sorting (or order) of the nodes. This is because segment 3 of the datasource (Hierarchy Structure) has the fields Child ID and Next ID in its metadata. Those fields give the hierarchy the right positioning of the nodes:

As we can see here, the right order for the hierarchy is nodes 115, 88, and 89.


This is what we can see in the hierarchy-based datasource for segment 3 (Structure):

To make things a bit more difficult, the export file produced by the custom ABAP program (the one out there!) is sorted by Node ID:


So, when we import the file (in this sample), the results will be the following:


The hierarchy is sorted by Node ID from the underlying node (no matter what), and it loses its original presentation. (fig. 1)
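The sorting problem can be reproduced with a small sketch. Assumed structure: each node carries a first-child pointer (Child ID) and a sibling pointer (Next ID), as in segment 3. The node IDs 115, 88, and 89 mirror the example above, while the root node and field names are invented for illustration.

```python
# Minimal sketch: each node has childid (first child) and nextid (next sibling),
# as in the BW hierarchy structure segment. IDs 115/88/89 mirror the example.
nodes = {
    1:   {"name": "ROOT", "childid": 115, "nextid": 0},
    115: {"name": "115",  "childid": 0,   "nextid": 88},
    88:  {"name": "88",   "childid": 0,   "nextid": 89},
    89:  {"name": "89",   "childid": 0,   "nextid": 0},
}

def walk(node_id, out):
    """Depth-first traversal following the childid/nextid pointers."""
    while node_id:
        out.append(nodes[node_id]["name"])
        walk(nodes[node_id]["childid"], out)   # descend to first child
        node_id = nodes[node_id]["nextid"]     # then move to next sibling

correct = []
walk(1, correct)
print(correct)        # ['ROOT', '115', '88', '89']  -- intended pointer order

# Sorting by Node ID instead loses the intended presentation:
by_nodeid = [nodes[k]["name"] for k in sorted(nodes)]
print(by_nodeid)      # ['ROOT', '88', '89', '115']
```

The old export program effectively emits the second ordering, which is why the imported hierarchy no longer matches the original presentation.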



We could just ignore the old solution and try something else. To bypass this issue I carried out the following steps:

  1. New Open Hub Destination based on SAP BW Hierarchies to a Flat File
  2. New “Flat File” datasource using the Open Hub Destination previously created
  3. Transformation from the datasource to the infoobject (Hierarchy)


Open Hub Destination based on Infoobject Hierarchies: I will not go through the creation of the Open Hub here. Just follow this post for the complete details.

New flat file Datasource using the Open Hub as a template: keeping it simple, just select the one created in the previous step.

This is the new structure now showing all the internal fields:


Transformation based on the new Datasource: In this case, I coded an ABAP expert routine. This is because the regular transformation based on hierarchies has 5 segments (Header, Hierarchy Description, Structure (nodes), Texts for the Nodes, and Texts for Hierarchy Levels (not needed here)), whereas the Datasource based on the OHD file is not segmented; only one structure is provided.

Inside the Expert Routine, the transformation based on SAP BW Hierarchies in this case uses the following 4 ABAP internal tables:

  • RESULT_PACKAGE_1 : Header
  • RESULT_PACKAGE_2 : Hierarchy Description
  • RESULT_PACKAGE_3 : Hierarchy Structure
  • RESULT_PACKAGE_4 : Hierarchy Nodes Text

The ABAP code is very simple. I only had to populate the internal tables accordingly using MOVE-CORRESPONDING based on the source_package:


DATA: ls_source_fields TYPE _ty_s_sc_1.

* Sketch of the expert routine: distribute the flat OHD source records
* into the four hierarchy segments. The RESULT_FIELDS_n work areas are
* assumed to be declared with the generated target segment types.
LOOP AT source_package INTO ls_source_fields.

* Hierarchy Header : RESULT_PACKAGE_1
  MOVE-CORRESPONDING ls_source_fields TO result_fields_1.
  APPEND result_fields_1 TO result_package_1.

* Hierarchy Description : RESULT_PACKAGE_2
  MOVE-CORRESPONDING ls_source_fields TO result_fields_2.
  result_fields_2-txtsh = ls_source_fields-h_hienm.
  result_fields_2-txtmd = ls_source_fields-h_hienm.
  result_fields_2-txtlg = ls_source_fields-h_hienm.
  APPEND result_fields_2 TO result_package_2.

* Hierarchy Structure : RESULT_PACKAGE_3
  MOVE-CORRESPONDING ls_source_fields TO result_fields_3.
  APPEND result_fields_3 TO result_package_3.

* Text for Hierarchy Nodes : RESULT_PACKAGE_4
  IF ls_source_fields-langu IS NOT INITIAL.
    MOVE-CORRESPONDING ls_source_fields TO result_fields_4.
    APPEND result_fields_4 TO result_package_4.
  ENDIF.

ENDLOOP.

That’s all!

If you follow those steps, the hierarchy will be imported the same as the original one. No need to use that old program again.






I am genuinely jubilant to present my first paper on SCN. As this is my first paper, I humbly request that you correct me if you find any mistakes.
With the release of OP edition 1610, SAP has optimized the functionalities in an efficient manner.
Hereby I would like to furnish a glimpse of the innovations in ML functionality.

Material Ledger Activation

Material Ledger activation is mandatory for material valuation in all S/4HANA on-premise releases and is now part of the standard in on-premise release 1610. However, activation of the Actual Costing functionality is still optional.

Innovations in Material Ledger functionality with the release of 1610

a) Significant increase in Transaction data throughput for high volume transactions such as goods movements

We know that for materials with price control Standard Price, a statistical moving average price gets calculated, and this requires exclusive locking, which eventually limits the data throughput in the SAP system. To get rid of this, deactivation of the statistical moving average price is imperative.

It is important to note that the deactivation of the statistical moving average price is not reversible.

You might then ask: what about materials with price control V?
The answer is that transaction data throughput is still limited by exclusive locking.

The following database fields are affected by the deactivation of the statistical MAP:

Note: If Actual Costing is active, the aforesaid fields will still be updated with the periodic unit price during the Actual Costing Closing Steps.
Finally, the report SAPRCKM_NO_EXCLUSIVELY_LOCKING can be used to deactivate the statistical moving average price. This can be done for converted systems running the S/4HANA on-premise edition.
In the S/4HANA cloud edition or new installs of the SAP S/4HANA on-premise edition, the statistical moving average price is disabled by default.

For more info, refer to OSS Note 2267835.
b) ML Actual costing Data Conversion in S4HANA 1610 Release

After activation of Actual Costing, data conversion to the S/4HANA data structures can be done via program FCML4H_STARTUP or via transaction FCML4H_STARTUP.

For more info, refer to OSS Note 2403356.
c) Currencies and ML type
It is not allowed to use an ML type that references the currency settings defined in FI or CO (flags Currency Types from FI, Currency Types from CO). There is no default Material Ledger type 0000 anymore. Instead, you have to explicitly define the currency and valuation types that are relevant for the Material Ledger.

Recommended order of customizing steps for assigning a ML type to valuation area

  1. Define Material Ledger type in transaction OMX2.
  2. Assign corresponding Material Ledger type to valuation area in OMX3.
  3. Activate Material Ledger for corresponding valuation area in OMX1.

For more info, refer to OSS Notes 2427356 and 2426246.
d) Simplified Data model

The following simplified data model has been introduced

For more info, refer to OSS Note 2352383.

Some of the former periodic tables are still required for the following purposes

  • The Material Ledger Closing Document is stored in the former ML document tables (ML*).
  • Standard price, periodic unit price, and price control are still managed in table CKLMLCR.
  • The cost component split for prices is still managed in tables CKMLPRKEKO and CKMLPRKEPH.

Significant design changes to be noted

  •  Single- and multilevel differences are no longer distinguished. In table MLDOC all price/exchange rate differences are stored in fields PRD/KDM. In table MLDOCCCS the CCS for price/exchange rate differences are stored under the CCS types E/F (field MLCCT).
  • The CCS is stored in a table format using the cost component (field ELEMENT) as additional key. This allows increasing the number of cost components without modification.
  • The CCS for preliminary valuation is always stored.
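The design change for the CCS storage can be illustrated with a conceptual sketch. This is only an illustration in Python; the field names mimic MLDOCCCS (MLCCT, ELEMENT), but the record layout and figures are invented and are not the actual SAP table definition.

```python
# Old-style fixed columns: adding a cost component means changing the structure.
ccs_fixed = {"material": "M1", "comp_001": 80.0, "comp_002": 20.0}

# Row-per-component with the component in the key (as in MLDOCCCS):
# adding a cost component just means inserting more rows, no DDL change.
ccs_rows = [
    {"material": "M1", "mlcct": "E", "element": "001", "amount": 80.0},
    {"material": "M1", "mlcct": "E", "element": "002", "amount": 20.0},
    {"material": "M1", "mlcct": "E", "element": "003", "amount": 5.0},  # new component
]

total = sum(r["amount"] for r in ccs_rows)
print(total)  # 105.0
```

Storing the component as part of the key (field ELEMENT) is what allows the number of cost components to grow without a modification, as noted above.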

e) Innovations in CKMLCP screen

Earlier, the CKMLCP screen was built around the individual processing steps. Now it has been optimized into four steps:
1. Selection
2. Preparation
3. Settlement
4. Post Closing
Note: The individual processing steps such as single-level price determination, multilevel price determination, revaluation of consumption, and work-in-process revaluation have been consolidated into a single step, Settlement.
The section “Costing Results” has been removed. It is replaced by the report FCML4H_RUN_DISPLAY_MATERIALS, which displays a material list with status and is called via the “Results” button in the “Processing” section.
In addition to the status of the materials, the status of the activity types is displayed in the Processing section. The “Activity Types” button for displaying the activity type value flow has been removed.


f) Innovations in CKM3
The following buttons are added in CKM3:
1. Source Document: navigates to the document that triggered the creation of the Material Ledger document. Typical source documents are an MM document, a price change document, or an incoming invoice.
2. Reference Document: navigates to the original business object, such as a production order, process order, product cost collector, CO production order, or purchase order.
3. Closing History: shows the closing history if the closing has been cancelled at least once.
Apart from the above, the following substantial changes have been made to the CKM3 screen.

  •  There is no separate view for cost components; they are integrated into the main screen.
  •  A flag allows selecting cost components that are not relevant for material valuation, or only cost components relevant for material valuation; by default, the inventory-relevant cost component split is selected.
  • Display of WIP Reduction for material; by default WIP Reduction is hidden; If WIP reduction is displayed, both WIP reduction and consumption of WIP for order are shown in opposite sign in different folder.
  • Plan/Actual Comparison is removed in new CKM3

The CKMLQS (multilevel quantity structure) screen is no longer available as of release 1610.

For more info, refer to OSS Note 2386597.
g) Other innovations

  •  A new 2-dimensional distribution logic avoids rounding errors. The new CKM3 will match vertically and horizontally even on level of Cost Components.
  • Price Limiter Logic is accurate on level of Cost Component Split.
  • Change of standard price for materials and activities within the period is supported.
  • Materials and activity types are widely treated as equivalent objects. The cost allocation logic is essentially the same. The same reports (e.g. CKM3, CKMVFM) can be used for both of them (yet to be fully enabled!).

For more info, refer to OSS Note 2354768.

Best Regards,


Good day:

One of the realities you face in Solution Manager 7.2 is that SAP introduced so many good new features in the SM* transaction types that you want to incorporate them all into your ZM* transactions.  The problem is that incorporating those changes via update can only happen in your development system, because that is, in principle, the system where the ZM* transactions were created.  If a transaction type was not created in the system where you first test the upgrade, e.g. a sandbox, you are left with only the option of working with the SM* transactions.

As upgrade times and deliverables are important, the more sophisticated or deeply rooted your Solution Manager system is in your organization, the longer your upgrade project may take.  In that scenario, your organization may end up upgrading one or two sandboxes before you reach your development system.  If that is the case, the present blog gives you a way to incorporate all the new SM* features into your ZM* transactions in non-development systems, although you can still use the same approach explained below in the development system.

Objective: To show you how to use transaction AI_CRM_CPY_PROCTYPE to recreate your ZM* transactions.

Summary of how we will achieve the objective: We use the copy option found in transaction AI_CRM_CPY_PROCTYPE.  That copy option does not work on the first try, so we show you the steps to make it work, taking advantage of the very useful information the transaction itself provides.  You will be able to identify the tables that need to be adjusted, and we will show you the methods to adjust them.

Some of the table entries are deleted via SPRO, others via SM30/31 or even SE16.

Undoing configuration seems quite logical and obvious, but there are some tricks we explain here to achieve the whole objective, particularly at the end: in the case of ZMCR, you are left with one single table for which you need to create a maintenance view in order to delete the last remaining entry that blocks the copy process.

Assumptions:  We assume you are familiar with configuring ChaRM via SPRO, as well as with using SM30/31.  We go through many steps quickly to keep this document lighter.  In that sense, when some tables are mentioned and we tell you to remove the records for a specific transaction type, e.g. via SM30, you should know what we are talking about.  Not a big deal if you take special care, is it?

Time for a cookie.   Please take the first one from the jar.

The table that stores the transaction type information for the transaction types created in your SAP system is AIC_PTYPE_TABLOG.  If a transaction type you are looking for has no entries in that table, it means the transaction type was created in another system of your landscape and came into the system you are checking via transport.  This may hint at why the update program does not work for you and why no entries are retrieved when you press F4 on the selection screen below:

Process:  We will take ZMCR for the exercise as that is the most challenging of all.  For the rest of the transaction types, the process is very similar.

Step 1:  Use the transaction AI_CRM_CPY_PROCTYPE to back up your transaction types, e.g. ZMCR > YMCR, ZMAD > YMAD, ZMHF > YMHF, ZMMJ > YMMJ.  This will help if in the future you need a reference of how things were before you started the re-creation of the transaction types.

Step 2:  Delete your transaction type ZMCR, via SPRO > IMG > SAP SolMan > Capabilities > ChaRM > Define Transaction Type.

Step 3. Use the copy program to see what else needs to be cleared.  Unfortunately, the step above still leaves lots of entries in some tables that need to be cleared manually.

Run the transaction AI_CRM_CPY_PROCTYPE to try, for the first time, to copy SMCR > ZMCR.  You will get a first glance at the work to be done.

So, what do we have above?  Two types of pending issues.  The yellow warnings (in red rectangles) refer to objects that are still found in the system and will not be overwritten by the copy.  You can decide whether or not to incorporate the SM* changes into those tables, e.g. the partner determination procedure.  In our first approach, we decided to overwrite everything, so we went into each of the configuration areas of the red-squared items and manually removed the configuration for ZMCR via SPRO.  We hope this explanation is enough for you to know what to delete; you should know where in SPRO to delete the action profile, text determination procedure, etc.  If you are not familiar with these, my friend, either you are learning or you are in the wrong blog.

Clearing the 13 tables in the blue squares is most of the work.  Your mission is to clear all the entries referring to ZMCR in those tables; otherwise, the copy will not be possible.  As a final note, you may get a list of more or fewer tables with error status.  Still, the logic presented here can be applied to your own situation.


Before we show the analysis per table: one nice feature of the copy program appears when you double-click an entry in error status.  On the right-hand side of the screen, the entries the copy program plans to insert are displayed.  You can use that to focus on the entries you need to remove from that particular table. Example:

For the table above, CRMV_TRJN_TRCA, use SE16 to access it.  Below is the process:

After you do that, the table entries are displayed and after you switch to change mode, you will be able to clear the entries.

Step 4. Re-point current usage of ZMCR.  Via SM30, update table DNOC_USERCFG by repointing the entries for ZMCR and ZSOLMANPRO to the standard.



Note: After the copy, you will need to point back to ZMCR, including the entry newly introduced in 7.2, CHARM_ADD.

Step 5. Begin clearing up all the tables. The table below shows the tables from the copy report above and the method used to clean up each one: some via SE16, some via SM30/31, some via SPRO.  Some of the SM30/31 tables could have been accessed via SPRO as well, but after a couple of searches with no results, we decided to take the quick path.

Note:  There are some additional tables, shown in italics, that appeared, e.g., when we decided to delete the partner determination procedure for ZMCR.

The last table, the view in red font, SOCMV_PROC_TYPE2, was the real challenge. That view is formed by various tables, one of which is TSOCM_PROC_TYPE.  Unfortunately, there is no maintenance view built for that table.  Without that maintenance view, you cannot delete the only entry left to get the green light to successfully copy SMCR > ZMCR.  How did we achieve that?

Create a function group via SE37.

Provide information similar to the one shown and save.

In transaction SE11, take a look at view SOCMV_PROC_TYPE2 and see the tables it is composed of:

In the screen below, you are supposed to select the fields for the maintenance view.  In our case, we leave them all there.  You may want to activate now, and you may want to use a package related to the upgrade.

Assign the appropriate details, save and activate.

With that in place, you should be able to use SM31 to access view ZTSOCM_PROC_TYPE to edit table TSOCM_PROC_TYPE and delete the only line left before you can actually copy SMCR onto ZMCR.


Step 6.  Copy process.  Launch AI_CRM_CPY_PROCTYPE to copy from SMCR to ZMCR.  The green light to copy should be visible.  Below is a screenshot from ZMMJ, another transaction type we were able to recreate.

Step 7. Post-copy actions.  Up to here, you have a vanilla transaction type.  The pending actions are divided into two:

  1. The SPRO-based customization you had in ZMCR prior to the copy, which you will need to add back via SPRO or the new flavor of SOLMAN_SETUP.
  2. The Web UI, that also needs to go back to standard.

With item #1, there are some important things to consider:

  • Via SM30, update table DNOC_USERCFG by repointing back the entries to ZMCR and ZSOLMANPRO.  Check out the newly introduced parameter, as well.

  • Restore back number range, approval procedure, and early numbering assignment, if applicable.

  • In the partner determination procedure, you may need to adjust the main partners of your ZMCR.  You can compare the screen below with the one you backed up when you created YMCR to bring back the old customization.

Note:  Keep in mind, as of 7.2 Support Team has been replaced by Development Team.

  • Also in Change Request Management Framework, review and compare between YMCR and ZMCR, Define status change depending on approval result.
  • Compare YMCR and ZMCR  via SE16 for table CRMV_TRINB_TRCA > SOLMAN-TRANSACTIONS.
  • Check whether there are new texts in the Text Determination Procedure that need to be incorporated.

  • In multilevel categorization, are there business proposals to be adjusted?

  • Review the copy control as well, because the vanilla version may have additional entries that are not required.

With regard to item #2, the Web UI: just as was done with the transaction types, you may want to back up the old configurations, delete them, and recreate them.  Example from ZMMJ:

With this in place, ZMCR is fully functional.

Step 8. Place back any additional SPRO customization performed in the past.

Step 9.  Incorporate any pre-existing Web UI customization performed in the past.


Conclusion: We know it seems like a long walk, but the satisfaction in our team was that we were able to fully access ZMCR in the sandbox to quantify the gaps we had, test security, etc.  We were ahead of schedule in identifying all the steps necessary after the upgrade, before we reached the development system of the production landscape.  That is not possible with the status quo right after the upgrade.


Hope you enjoy and thanks for reading.

Juan-Carlos Garcia-Garavito

This blog is co-authored by Gobinder Sandhu  (https://people.sap.com/gobi.sandhu) and Stéphanie Bourgault-Mongeau (https://people.sap.com/stephanie.bourgault)


Employees under collective bargaining agreements, or hourly employees, can be compensated based on a pre-defined pay structure following specific rates of remuneration. In SuccessFactors, it is possible to implement the pre-defined remuneration structure using the Pay Scale Structure objects and auto-generate the employee Pay Component Information accordingly.

The Pay Scale Structure concept is well known in the SAP HCM world, but we found few references to it for SuccessFactors. Hence, in this blog entry, we will define the pay structure objects, explain how to configure the auto-assignment of employees’ compensation based on the pay scale structure using business rules, and highlight how general increases can be managed using the Adjust Employees’ Compensation to Tariff Changes transaction in SuccessFactors.

This blog entry can be useful for existing and future SuccessFactors implementations, providing an understanding of handling Pay Scale Structures in SuccessFactors (and of how they can ease data entry and improve data accuracy in the system). Moreover, this blog will mostly be useful for consultants, explaining in detail how to achieve Pay Scale Structures configuration in SuccessFactors.

Pay Scales versus Pay Grade and Pay Range

Before we dive into Pay Scale Structures, we want to state that pay scales are not to be mistaken for pay grades or pay ranges.

Pay Grade Structures are defined together with pay or salary ranges and a frequency, which determines the pay of a salaried employee placed in a particular job or function. The pay range has a minimum, a maximum, and a mid or reference value, assigned along with the pay frequency.

A Pay Grade Structure thus has a pay grade, a pay range, and a frequency.
A Pay Scale Structure is defined together with a pay scale type, area, group, and level. The pay scale defines the pay rate based on skill, area of work, and time spent on a particular job, generally in a unionised environment.
While pay grades/ranges are determined by market pay studies, the pay scale structure is determined by agreement during collective bargaining with the unions. Both types of structure are used to define the base pay of an employee.

Process for implementing Pay Scale Structure in SuccessFactors Employee Central

The steps to configure and use Pay Scale Structures are:

  1. Configure and Maintain Pay Scale Objects
  2. Configure and Maintain Pay Scale Structure fields on position and employee
  3. Configure the Business Rules to assign the Pay Component Information on employees based on their Pay Scale Level
  4. Manage General Increases

STEP 1: Pay Scale Objects

Pay Scale related objects are Generic Objects in SuccessFactors using the MDF framework. Like all MDF objects, they are configurable via Configure Object Definition. In this blog, we will use screenshots of the standard configuration of these objects, but keep in mind that it is possible to add custom fields according to organizational needs.

Pay Scale Structure Relationships

Pay Scale Type:

Pay Scale Type is usually used to define the type of collective agreement. The field can be used to clearly differentiate between different collective agreements within an organisation.
Example of Pay Scale Type:

For integrating with SAP Payroll: Keep in mind that for the integrations with the SAP payroll, the code for pay scale type should be restricted to 2 alphanumeric characters.

Pay Scale Area:

Pay Scale Area is usually used to define the applicability of the pay scale type by geographical area or by any specified criteria.
Example of Pay Scale Area:

For integrating with SAP Payroll: Keep in mind that for the integrations with the SAP payroll, the code for pay scale area should be restricted to 2 alphanumeric characters.

Pay Scale Group:

Pay Scale Group is linked to a pay scale type and pay scale area and is used to group the pay scale levels of each specific job that follows a pay scale structure in the organization.
Example of Pay Scale Group:

For integrating with SAP Payroll: Keep in mind that for the integrations with the SAP payroll, the code for pay scale group should be restricted to 8 alphanumeric characters.

Pay Scale Level:

Pay Scale Levels are the multiple “pay steps” attached to pay scale groups; each step defines the pay component(s) and their values for employees at that level.
Example of Pay Scale Level:

For integrating with SAP: Keep in mind that for the integrations with the SAP payroll, the code for pay scale level should be restricted to 2 alphanumeric characters, preferably numeric, so that the step increase process as well as pay scale reclassification are well supported.

STEP 2: Assigning the Pay Scale Objects to Positions and Employees

Once the Pay Scale Structure is implemented, it is possible to configure the Pay Scale Structure fields on the Position and the employee’s Job Information. Please note that, in the position object, these fields are not standard fields; therefore, custom fields referring to the pay scale generic objects are needed. Here is what it looks like once configured:

Pay Scale Structure on the Position Object

Pay Scale Structure on the employee’s Job Information

STEP 3: Automatic Assignment of Employee’s Pay Component based on Pay Scale

One of the biggest values of maintaining the company Pay Scale Structure in SuccessFactors is that the user doing the data entry no longer has to enter the Pay Component information in the system. Using Business Rules, it is possible to create employees’ Pay Components automatically when hiring or when updating an employee’s Pay Scale Level. Here is how it works for the end user:

1. Create Pay Scale Assignment on Employee Hire Rule

As in this example, during the new hire process, once you get to the Compensation Information screen, the Pay Component gets pre-filled from the Pay Scale Structure based on the employee’s Job Information Pay Scale Level; no data entry is needed:

Check out a business rule example to set this up in the system at the end of the article. This rule should be set on-Init of the Recurring Pay Component section in the Succession Data Model. Note that the rule will vary based on the customer’s requirements.
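Conceptually, such a rule performs a lookup from the employee’s pay scale key to the pay components to create. The sketch below is only an illustration in Python, not the business rule syntax; all codes and amounts are invented.

```python
# Illustrative lookup table: (type, area, group, level) -> pay components.
pay_scale_levels = {
    ("T1", "A1", "G2", "01"): [("HOURLY", 20.0, "EUR")],
    ("T1", "A1", "G2", "02"): [("HOURLY", 22.5, "EUR")],
}

def assign_pay_components(job_info):
    """Return the pay component entries for the employee's pay scale level."""
    key = (job_info["pay_scale_type"], job_info["pay_scale_area"],
           job_info["pay_scale_group"], job_info["pay_scale_level"])
    return pay_scale_levels.get(key, [])

new_hire = {"pay_scale_type": "T1", "pay_scale_area": "A1",
            "pay_scale_group": "G2", "pay_scale_level": "01"}
print(assign_pay_components(new_hire))  # [('HOURLY', 20.0, 'EUR')]
```

In SuccessFactors itself, this mapping lives in the Pay Scale Level object and the lookup is expressed in the rules engine rather than in code.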

2. Create Pay Scale Assignment on Employee Update Rule

The same thing happens when an employee’s Pay Scale Level gets changed. There is no need to select Compensation Information during the change; a business rule will adjust the Pay Component information on-Save. Check out a business rule example to set this up in the system at the end of the article.

Result on Pay Component Information:

STEP 4: Manage General Increases

Managing General Pay Increase using feature – Adjust Employees’ Compensation to Tariff Changes
One great Pay Scale Structure feature in SuccessFactors is the “Adjust Employees’ Compensation to Tariff Changes” transaction, also termed Manage Pay Increases. With this transaction, it is possible to mass-change employees’ compensation after a general pay increase.
To see how this works, let’s say there is a pay change in the new agreement, effective as of March 20th, 2017, where employees on a specific pay scale structure now make 25 euros hourly instead of 20 euros. We would insert a record specifying the new amount in the existing Pay Scale Level object, effective March 20th, 2017:

Then, to update all the employees belonging to this Pay Scale Structure to the new Pay Component Amount, we would run Adjust Employees’ Compensation to Tariff Changes transaction.

This transaction allows you to fetch all the employees whose pay component amounts should be increased as of a specific date. In our example, we would run the Tariff Changes program as of March 15th, 2017. It is possible to build an Employee Group as a filter to limit which employees are picked up by the Tariff Changes program, and/or to limit the changes to employees belonging to a particular Pay Scale Type/Area/Group. In our example, we select Pay Scale Group G2, as the Pay Scale Level we modified belongs to the G2 Pay Scale Group.

Once our Tariff Changes run has been set up, we can run the program in simulation mode or perform the changes in update mode. The system will ask the user to select the event reason to be used and whether a log file should be generated (log file generation is recommended when running this transaction in order to view all employee changes).
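Conceptually, the Tariff Changes run can be sketched as follows: for each employee in the selected pay scale group, it derives the level’s rate valid on the effective date and appends a new effective-dated compensation record if the amount changed. The field names, dates, and data below are invented for illustration; only the G2 group and the 20-to-25 euro change mirror the example above.

```python
from datetime import date

# Rate history per (group, level): list of (effective date, hourly rate).
pay_scale_level_rate = {("G2", "01"): [(date(2015, 1, 1), 20.0),
                                       (date(2017, 3, 20), 25.0)]}

employees = [
    {"id": "E1", "group": "G2", "level": "01",
     "comp_history": [(date(2016, 5, 1), 20.0)]},
    {"id": "E2", "group": "G3", "level": "01",
     "comp_history": [(date(2016, 5, 1), 18.0)]},  # other group: untouched
]

def adjust_to_tariff(employees, group, effective):
    for emp in employees:
        if emp["group"] != group:
            continue  # employee group / pay scale group filter
        rates = pay_scale_level_rate[(emp["group"], emp["level"])]
        # latest rate that is valid on the effective date
        rate = max((r for r in rates if r[0] <= effective), key=lambda r: r[0])[1]
        if emp["comp_history"][-1][1] != rate:
            emp["comp_history"].append((effective, rate))  # new effective-dated record

adjust_to_tariff(employees, "G2", date(2017, 3, 20))
print(employees[0]["comp_history"][-1])   # (datetime.date(2017, 3, 20), 25.0)
print(len(employees[1]["comp_history"]))  # 1
```

The real transaction additionally handles event reasons, simulation versus update mode, and log file generation, as described above.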

Here is how the employee compensation history looks after running the Tariff Changes program:

Here is an example of the log file:

In order for the Tariff Changes program to work, the following IF statement needs to be added to the Create Pay Scale Assignment on Employee Update rule (see the rule example at the end of the article).
That is it for Pay Scale Structures in SuccessFactors. We hope that our blog entry was helpful to the SuccessFactors community in understanding what Pay Scale Structures are and how to configure them in SuccessFactors.

Stay tuned for our future blog entry on how to upgrade Pay Scale Levels automatically using an Off Cycle Event Batch (also termed OEB) rule in SuccessFactors!


On Hire Rule (On Init of Recurring Pay Component):

On Update of Job Info Rule (on Save of Job Info):

Adapting the on Update of Job Info Rule to support general Increase: