
1. Functional Correctness

Prerequisites

Before reading this blog, it is worth reading the introductory post of the series:


  1. Introduction – Unleash the power of SAP HANA from your ABAP Custom Code – http://scn.sap.com/community/abap/hana/blog/2014/06/20/abap-custom-code-management–leverage-the-power-of-sap-hana-in-abap


What does Functional Correctness mean?

Before going for optimization, the existing custom code should produce the same results after the SAP HANA migration as it did before. In general, custom code continues to work as expected after migration, unless:

  • The custom code contains DB-specific code or queries
    • Each DB has specific features and unique technical behavior.
    • DB-specific code may rely on these features of the database used.
  • The custom code uses DB indexes defined on the previous database
    • SAP HANA's column-based architecture makes secondary DB indexes far less important.
    • DB-specific code may rely on the existence or usage of certain DB indexes.
  • The code reads cluster/pool tables and relies on their implicit sorting (e.g. for BINARY SEARCH)
    • During the migration to SAP HANA, most pool and cluster tables are transformed into transparent tables (de-pooling / de-clustering) so that they can be used in analytic scenarios.
    • DB-specific code may rely on the technical specifics of pool and cluster tables.
  • SAP Note 1785057 describes the above checks in detail.


These issues can be identified with functional checks. SAP provides tools to detect them, and the tools also offer suggestions for correcting the identified issues so the code works as expected.


Example of DB Hint:

(Image: ABAP SELECT statement containing an Oracle-specific DB hint)

The source code above shows a DB hint in an SQL query. The Oracle-specific hint forces the query to use an index defined at the DB level. After the SAP HANA migration this index no longer exists, so the hint becomes invalid and can lead to functional issues.
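For illustration, a DB hint in Open SQL is passed with the %_HINTS addition. The table and index names below are hypothetical, a minimal sketch of the pattern the check looks for:

```abap
" Hypothetical example: forces Oracle to use a specific secondary index.
" After migration to SAP HANA this index no longer exists, so the hint
" is invalid and the statement may behave differently.
SELECT ebeln bukrs
  FROM ekko
  INTO TABLE lt_ekko
  WHERE lifnr = lv_lifnr
  %_HINTS ORACLE 'INDEX("EKKO" "EKKO~Z01")'.
```

The hint is DB-specific by design: the string after the database name is passed through to the underlying database and is never validated by the ABAP layer.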


Example for Cluster/Pool table read:

Have a look at the code below, which reads data from the table "BKPF", a cluster/pool table before migration. After migration to SAP HANA this table becomes a transparent table. The SELECT fills the internal table "IT_BKPF", which is then read using BINARY SEARCH. Binary search expects the internal table to be sorted by the key attributes; if it is not, the search can fail. Before the migration the code works fine, because "IT_BKPF" is sorted by default, the implicit behavior of a cluster/pool table read. After the migration to SAP HANA this is no longer guaranteed, and the READ statement can fail. Hence, before using binary search on the internal table, an explicit SORT (at least by the primary key) is needed to make sure the migrated report produces the same output.



SELECT
    awkey
    gjahr
    belnr
    xblnr
    bldat
  FROM bkpf " BYPASSING BUFFER
  INTO CORRESPONDING FIELDS OF TABLE it_bkpf
  FOR ALL ENTRIES IN lt_invoice_details
  WHERE
    gjahr = lt_invoice_details-gjahr AND
    xblnr = lt_invoice_details-xblnr AND
    awkey = lt_invoice_details-awkey.

" ... some calculations

READ TABLE it_bkpf WITH KEY gjahr = invoice_details-gjahr
                            xblnr = invoice_details-xblnr
                            awkey = invoice_details-awkey
           BINARY SEARCH
           TRANSPORTING belnr bldat.

IF sy-subrc = 0.
  " Further calculations
ENDIF.



The recommendation / solution:

(Image: corrected code with an explicit SORT before the BINARY SEARCH)

  • See SAP Note 1622681 for the DBSL hints supported on SAP HANA.
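Applied to the SELECT shown earlier, a minimal sketch of the correction is an explicit SORT by the fields used in the READ key before the binary search:

```abap
" Explicit sort restores the ordering that the cluster/pool table
" delivered implicitly before the migration to SAP HANA.
SORT it_bkpf BY gjahr xblnr awkey.

READ TABLE it_bkpf WITH KEY gjahr = invoice_details-gjahr
                            xblnr = invoice_details-xblnr
                            awkey = invoice_details-awkey
           BINARY SEARCH
           TRANSPORTING belnr bldat.
```

Alternatively, declaring IT_BKPF as a SORTED table with a non-unique key on these fields makes the READ implicitly binary without a separate SORT statement.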

How to find the functional issues:

SAP provides tools to identify such issues during the migration to SAP HANA. The ABAP Test Cockpit (ATC) helps to identify the functional issues.

  1. Start the transaction SATC
  2. Add your objects (into the object list)
  3. Use the variant "FUNCTIONAL_DB", which is preconfigured with the necessary checks for identifying the functional issues.

The code inspector tool (ATC/SCI):

The Code Inspector helps to identify functional issues and potential performance issues. These are static checks on the custom code. SAP has extended the Code Inspector with additional checks that identify functional issues which can occur after migrating to SAP HANA. For SAP HANA, ATC is the tool of choice both for preparing the custom code for functional correctness and for detecting candidates for optimization.


  • ATC is available as of NW 7.02 SP12 / NW 7.31 SP5. On older releases the Code Inspector (SCI) can be used.


The Code Inspector makes it easy to find these functional issues. The image below shows the new checks added to the tool and their purpose.

(Image: Code Inspector check variant with the new SAP HANA checks)

New checks were added under the following categories:

  • Security Checks: analyze native SQL and Open SQL and find functional issues.
    • Critical Statements: checks for native SQL and DB hints on SQL statements.
    • Use of ADBC interface: checks for SQL statements executed through the ADBC classes.
  • Robust Programming: analyzes problematic statements that can lead to wrong results for cluster/pool tables (transparent tables after migration).
    • Search problematic statements without ORDER BY clause:
      • Finds custom code that relies on implicit sorting.
      • Searches for BINARY SEARCH and DELETE ADJACENT DUPLICATES on data read from cluster/pool tables.
      • Many of these findings are false positives.
    • Depooling/Declustering – Search without ORDER BY clause:
      • Searches for statements on cluster/pool tables without an ORDER BY clause.
      • Works only for cluster/pool table reads.
      • Many of these findings are false positives.
  • Search Functions: analyzes function module calls related to index usage.
  • With ATC, SAP delivers a standard check variant for functional correctness named "FUNCTIONAL_DB", which is configured to identify all the functional issues discussed earlier.
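As a sketch of what the depooling/declustering check flags, consider the two reads below (table and variable names are illustrative): the first relies on implicit sorting, the second requests the former ordering explicitly.

```abap
" Flagged: after de-clustering, the result order is undefined.
SELECT * FROM bkpf INTO TABLE it_bkpf
  WHERE bukrs = lv_bukrs.

" Corrected: adding ORDER BY PRIMARY KEY restores the implicit
" sort order the cluster/pool table used to deliver.
SELECT * FROM bkpf INTO TABLE it_bkpf
  WHERE bukrs = lv_bukrs
  ORDER BY PRIMARY KEY.
```

Note that ORDER BY PRIMARY KEY requires the full key in the selection, which SELECT * satisfies.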



Follow up blogs

The blog series discusses the different phases of custom code management as follows.

  1. Detect and Prioritize your Custom Code – Unleash the power of SAP HANA from your ABAP Custom Code- Accelerate your custom reports like never before – Detect and Prioritize your Custom Code
  2. Optimize your Custom Code – http://scn.sap.com/community/abap/hana/blog/2014/06/27/unleash-the-power-of-sap-hana-from-your-abap-custom-code-accelerate-your-custom-reports-like-never-before–optimize-your-custom-code


  • Please note that the approach shared here is an iterative and flexible model; it is suggestive in nature rather than a rigid process.

2 Comments


  1. Matthew Billingham

    One of the few reasons to use FOR ALL ENTRIES is if one of the tables was a cluster table. With HANA that reason has gone. So the read from BKPF should be now an INNER JOIN.

    Also… your solution involves a SORT and BINARY SEARCH. Why not define T_BKPF as a HASHED table (or SORTED if you need a non-unique key)?

    These structures have only been around for 15 or more years, so why this obsession with BINARY SEARCH, I really don’t know – except perhaps choosing the correct table type involves a bit of analytic thought… 😏

    1. K Prakash Post author

      Hi Matthew,

      I agree with your point. This code is an example to showcase functional correctness. The approach you describe is valid for the HANA database. But during the migration, until we have identified the reports we want to optimize, we generally don't change the code in the way you mention. It is, however, our duty to ensure that after the HANA DB migration all existing reports produce the same results as before. The BINARY SEARCH is used on the internal table on the assumption that it is sorted (since it was read from a cluster table). After migration the source table becomes a transparent table and the sorting is no longer guaranteed. Hence, wherever such a table read uses binary search, we should make sure the internal table is sorted by its key fields or the necessary attributes.

      NOTE: Once you start optimizing, this code might become invalid, as we may push the logic down to the database.

      Thanks

      Prakash K

