Michael Keller

thoughts about the ABAP SQL Test Double Framework

Dear community, in a previous blog I told you about my first experiences with the ABAP SQL Test Double Framework. It allows the creation of database table test doubles that contain defined records for unit tests. I’m still enthusiastic, and I consider this technology an important component for ensuring code quality. Here are some thoughts I’ve had since then.

Uniform language

In general, I think it makes sense to use the same language among colleagues. This makes it easier to understand what a conversation partner means. Especially for complex test data, it can be a good idea to assign a synonym to special data records. For me, the behavior of a method with one, two, and many data records is often of interest. Here are some proposals:

  1. one/two/many
  2. golden/silver/bronze
  3. strawberry/cherries/grapes

I thought about using the primary key to identify a special record in a conversation. That would work for classic database tables (“let’s talk about purchase order 0045008022“), but it would be no fun if a GUID is the key.

Provide Test Double entries

In my experience, test data should be defined together with colleagues. This creates different data constellations. I often exchange ideas with consultant colleagues: we look at database tables and their content together and talk about possible test cases. It would be great to have a tool with which you can mark a data record in transaction SE16, for example, and that generates the ABAP INSERT statement with all the fields and values of this “real” record. Think of a database table like EKKO or VBAK: there are a lot of fields. Is there something like this available?

Unified way

In order to provide special test data in a uniform way, there should be a provider class per use case. As a result, the provision of the test data records is standardized, carried out at only one point in the application logic and can be reused. Using the example from my former blog, it could look like this.

CLASS zcl_system_clients_asql_td DEFINITION
                                 CREATE PUBLIC.

  PUBLIC SECTION.
    METHODS constructor.

    METHODS provide_test_double_records
      RETURNING
        VALUE(result) TYPE zcl_system_clients=>clients.

  PRIVATE SECTION.
    DATA test_double_records TYPE zcl_system_clients=>clients.

    METHODS add_golden_record.
    METHODS add_silver_record.
    METHODS add_bronze_record.

ENDCLASS.
CLASS zcl_system_clients_asql_td IMPLEMENTATION.
  METHOD constructor.
    add_golden_record( ).
    add_silver_record( ).
    add_bronze_record( ).
  ENDMETHOD.

  METHOD provide_test_double_records.
    result = test_double_records.
  ENDMETHOD.

  METHOD add_golden_record.
    INSERT VALUE #( mandt = '000' mtext = 'golden record' ) INTO TABLE test_double_records.
  ENDMETHOD.

  METHOD add_silver_record.
    INSERT VALUE #( mandt = '001' mtext = 'silver record' ) INTO TABLE test_double_records.
  ENDMETHOD.

  METHOD add_bronze_record.
    INSERT VALUE #( mandt = '999' mtext = 'bronze record' ) INTO TABLE test_double_records.
  ENDMETHOD.
ENDCLASS.
If you add a GET-method per record, every developer can check what data is provided for golden/silver/bronze records. Below is the refactored test class.

CLASS ltc_system_clients DEFINITION FINAL  " local test class, name assumed
               FOR TESTING
               DURATION SHORT
               RISK LEVEL HARMLESS.

  PRIVATE SECTION.
    METHODS get_client_000 FOR TESTING.
    METHODS get_client_001 FOR TESTING.
    METHODS get_client_999 FOR TESTING.

    CLASS-DATA osql_test_environment TYPE REF TO if_osql_test_environment.
    CLASS-DATA clients_test_data_provider TYPE REF TO zcl_system_clients_asql_td.
    CLASS-METHODS class_setup.
    CLASS-METHODS class_teardown.

ENDCLASS.

CLASS ltc_system_clients IMPLEMENTATION.
  METHOD class_setup.
    osql_test_environment = cl_osql_test_environment=>create( VALUE #( ( 'T000' ) ) ).
    clients_test_data_provider = NEW zcl_system_clients_asql_td( ).
    DATA(clients_test_data) = clients_test_data_provider->provide_test_double_records( ).
    osql_test_environment->insert_test_data( clients_test_data ).
  ENDMETHOD.

  METHOD class_teardown.
    osql_test_environment->destroy( ).
  ENDMETHOD.

  METHOD get_client_000.
    DATA(cut) = NEW zcl_system_clients( ).
    DATA(result) = cut->get_client( '000' ).
    cl_aunit_assert=>assert_not_initial( result ).
  ENDMETHOD.

  METHOD get_client_001.
    DATA(cut) = NEW zcl_system_clients( ).
    DATA(result) = cut->get_client( '001' ).
    cl_aunit_assert=>assert_not_initial( result ).
  ENDMETHOD.

  METHOD get_client_999.
    DATA(cut) = NEW zcl_system_clients( ).
    DATA(result) = cut->get_client( '999' ).
    cl_aunit_assert=>assert_not_initial( result ).
  ENDMETHOD.
ENDCLASS.

Those were my thoughts. In practice, I have not yet been able to try any of them. I’m still at the beginning 🙂 What’s your opinion?


Best regards, thanks for reading and please stay healthy



P.S.: Please support the virtual wishing well.

P.P.S.: Not tired of reading blogs? Check this blog by Rainer Winkler.





S Abinath

      Great Info

Rainer Winkler

      Hi Michael,

thanks for your interesting blog and for mentioning me. Please check the link you provided in your P.P.S.: it currently links to a blog which is not from me.

      Cheers, Rainer

Michael Keller (Blog Post Author)

      Hi Rainer, corrected. I'm sorry for that. Thanks for the note and thanks for sharing your experiences in your blogs! 🙂

Suhas Saha

      Hi Michael,

      My 2 cents…

The “Unified Way” approach is what we call, in our team, Test Beds. These test beds mostly contain Customizing data (e.g. companies, tax codes, etc.).

I have the privilege of using the latest version of ADT running on the latest ABAP release. There you can generate the VALUE statement for the “real” data record from the context menu in the ADT SQL console. I’m not sure if this is available for on-premise releases yet. See the release notes.

      BR, Suhas

Michael Keller (Blog Post Author)

      Thanks for the note with the ADT SQL console. I will check that. By the way: access to the latest ADT and ABAP environment - you're really lucky! 🙂

Wouter Peeters


I hadn't heard of 'Test Beds' before, good idea. How do you do unit testing in general? We always add a 'DAO' object which contains all external dependencies such as BAPIs and SQL statements. It is then used in our code by means of dependency injection, so we just replace it with a test double during unit tests. For larger developments or OO objects, we do the same and mock them out during unit tests, e.g. a logger class.
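The DAO approach described here could be sketched like this; all names (`lif_dao`, `lcl_service`) are invented for illustration, and the sketch only shows the injection point, not a full DAO:

```abap
" Hypothetical DAO interface wrapping all external dependencies (SQL, BAPIs).
INTERFACE lif_dao.
  TYPES carriers TYPE STANDARD TABLE OF scarr WITH DEFAULT KEY.
  METHODS read_carriers
    RETURNING VALUE(result) TYPE carriers.
ENDINTERFACE.

" Production code depends only on the interface; a unit test injects a
" test double implementing lif_dao instead of the real database access.
CLASS lcl_service DEFINITION CREATE PUBLIC.
  PUBLIC SECTION.
    METHODS constructor
      IMPORTING dao TYPE REF TO lif_dao.
    METHODS carrier_count
      RETURNING VALUE(result) TYPE i.
  PRIVATE SECTION.
    DATA dao TYPE REF TO lif_dao.
ENDCLASS.

CLASS lcl_service IMPLEMENTATION.
  METHOD constructor.
    me->dao = dao.            " constructor injection
  ENDMETHOD.

  METHOD carrier_count.
    result = lines( dao->read_carriers( ) ).
  ENDMETHOD.
ENDCLASS.
```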

      We like the ABAP SQL Test Double framework, but this won't replace our usage of DAO, so we don't see any added value yet. It will just come down to what is the fastest to mock. But some general Test Beds might seem useful!

      Regards, Wouter

Sandra Rossi

      Nice, thank you for the tip!

Example of the VALUE statement generated for the SCARR table with only 3 columns selected and a filter on CARRID LIKE ‘A%’:

VALUE #( ( CARRID = 'AA' CARRNAME = 'American Airlines' CURRCODE = 'USD' )
         ( CARRID = 'AC' CARRNAME = 'Air Canada' CURRCODE = 'CAD' )
         ( CARRID = 'AF' CARRNAME = 'Air France' CURRCODE = 'EUR' )
         ( CARRID = 'AZ' CARRNAME = 'Alitalia' CURRCODE = 'EUR' )
         ( CARRID = 'AB' CARRNAME = 'Air Berlin' CURRCODE = 'EUR' ) ).
Thales Batista

Hi Michael, it's a good thing to have the SQL Test Double Framework as another tool for testing, but I'll put on my devil's advocate suit and say that this falls into the same category as Test Seams: only meaningful for legacy code and when you really don't have time (or are being lazy) to write (learn) modern code (classes/interfaces).

I'm not saying not to learn it; it is useful. Sometimes it is the only way to "replay" some complicated scenarios without regenerating the whole scenario again (we shouldn't update standard tables, and the SQL Test Double is a neat tool to work around that golden rule without breaking it). But this becomes a dreadful task when you have to double a not-so-small data set, and eventually you'll surrender to mock classes: you'll have your required output without coding all the doubled joins to obtain it. I never measured it, but I think a direct mock class will have fewer LOCs than all the SQL Test Double preparation when you have a non-trivial data model that spans multiple tables (99.9% of SAP data models).

The sad thing about SAP (standard) code is that there are some shiny gems hidden there with great potential, but (probably) not fully acknowledged by SAP itself.

The CRM module almost fulfilled your wish to have an easy way to extract data for testing: there is an entire package dedicated to mocking master data. They provide a proxy class for each master data object, and when you want to mock it, you first "record" the real call (which generates an XML file) and then replay it in your test suite (feeding this XML into the proxy class makes it run in "mock mode", returning exactly the extracted data at runtime). You could almost call it a "Function Module Double Framework" (those classes wrap function module and BAPI calls).
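The record-and-replay idea could be sketched roughly as follows; this is not the actual CRM package, just an illustration with invented names (`lcl_master_data_proxy`, `replay_with`):

```abap
" Rough sketch of a record/replay proxy: in production it forwards to the
" real function module; fed with recorded data it switches to mock mode
" and replays that data instead.
CLASS lcl_master_data_proxy DEFINITION CREATE PUBLIC.
  PUBLIC SECTION.
    TYPES partners TYPE STANDARD TABLE OF but000 WITH DEFAULT KEY.
    METHODS replay_with
      IMPORTING recorded_data TYPE partners.
    METHODS read_partners
      RETURNING VALUE(result) TYPE partners.
  PRIVATE SECTION.
    DATA mock_mode TYPE abap_bool.
    DATA recorded  TYPE partners.
ENDCLASS.

CLASS lcl_master_data_proxy IMPLEMENTATION.
  METHOD replay_with.
    mock_mode = abap_true.
    recorded  = recorded_data.
  ENDMETHOD.

  METHOD read_partners.
    IF mock_mode = abap_true.
      result = recorded.        " replay the recorded call
    ELSE.
      " the real call (e.g. a BAPI) would go here in production
    ENDIF.
  ENDMETHOD.
ENDCLASS.
```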

There are some flaws, but I think if they had put the effort into improving this concept instead of creating both double frameworks, we would have something better for testing code now, especially when integrating into standard code like BAdI calls. We wouldn't be coding SELECT statements anymore (you can live without them in CRM pretty well), always relying on standard functions/classes that could easily be switched to mock mode when running a test suite. As a side benefit, your code is less likely to break on major SAP changes (as many companies learned the hard way, with syntax errors, when migrating from Business Suite to S/4HANA), because SAP almost never changes function module interfaces (even those not released for public use). This point is even proven by the technical series from Jerry Wang about how SAP redesigned the CRM internal layer to handle the new S/4 tables. Simply put: whoever relied on function calls instead of direct database SELECTs had a smooth transition with no syntax errors during the process.

After all that ranting, I still see a solid future for the SQL Test Double (at least the concept itself), but I won't tell you now. Take your seat and wait for a (still far-off) future blog.

Sad fact: 'CRM Mock framework', mainly about how to record mock data, was supposed to be my first blog post, but I gave up as it is still not as useful as it could be for daily tests.

Shai Sinai

      Isn't CRM dead? 🙂

Thales Batista

As an evolving Business Suite product, probably yes (yet they are still providing new features requested by customers through Customer Connection), but its spirit still lives in Solution Manager 7.2 (guaranteed until 2030, and maybe even longer if they stick with it to support S/4 until 2040).

Great things last forever!

Wouter Peeters

      Interesting, do you use Unit Testing for all your CRM developments?

Thales Batista

I’d be lying if I said yes (because unit tests for everything is a pitfall, but I would guess you already know that), and I am still behind my quota of what I consider an acceptable lines-of-code-per-unit-test ratio. Not as an excuse, but development is just one of my daily activities, and they have more of a fixed deadline than a fixed amount of working hours, so sometimes, to meet the dates, I have to pick only the things I consider really important not to go forward without unit testing (biased by the Extreme Programming maxim “Test Everything That Could Possibly Break”).

That doesn’t mean the code is wild and untestable. I write all of it to be test-friendly (it could be used as a playground by anyone wanting to learn how to write unit test classes), just as I would if I were writing the unit tests. I also use ASSERT statements, an aggressive but quicker way to validate things. And one valuable thing that unit testing and TDD teach you is to write small and decoupled units that are plainly simple to read and understand (aka Clean ABAP; I already did that whenever my workplace allowed it, since my background in other programming languages made me want to set the ABAP server on fire whenever I saw Hungarian notation combined with cryptic acronyms derived from table column names). This eventually dismisses the need to unit test them.

Michael Keller (Blog Post Author)

Good note regarding the effort to generate a mock object that delivers certain data and the doubling of different database tables. I would like to think about this a little and share it via a blog. Then we could discuss a particular design. It could take some time. Summer should be visiting Germany this week ... 😉

Thales Batista

(I already put the suit back in the wardrobe, no more ranting.) An SQL Test Double data generator like SE16, as you proposed, is something with the potential to go forward. We are very familiar with it, and functional colleagues like to explain things directly with table data (instead of the related function modules). Even though my main development is on a system that doesn't have the SQL Test Double available, removing the pain of generating data would make it a good addition to the development toolbox once it becomes available.

I remembered something today about how one might programmatically discover all tables of a specific SAP data model, but I don't even know if it is actively maintained or already obsolete: the data model from Business Engineering. I used it in my early adventures across ECC to locate some tables and relate them, yet I never saw it display relevant information for the CRM module.

Something like this would be a nice addition to help "dump" all relevant database tables in one call (like passing an order number and flagging "I want that and that related data" to dump them too). That would be useful for code under test that uses a quite complex standard object, but this is for a much later version.

Justin Loranger

      I too have made great use of the SQL Test Double Framework for a while now.

However, as noted previously, it can get tedious generating the test double records if you have many tables or large datasets to work with. It is also especially challenging when working with CDS views.

      That said, I am still working on shifting my own development to incorporate unit tests and TDD.

      Thanks for the blog!


Michael Keller (Blog Post Author)

I understand the effort to reconstruct certain data records and to link them. However, I first have to gain practical experience. Sometimes I write ABAP applications that also have to work on releases lower than SAP NetWeaver 7.51. A mock object always works; the ABAP SQL Test Double Framework unfortunately only works from 7.51 on (as far as I remember).

Alexandr Razinkin

If you need the Open SQL Test Double Framework but your release is below 7.51, you may look at the library zsql_test_double_framework.

The lib also lets you include SQL syntax in unit tests, like the SQL Test Double Framework itself.

I wrote it for myself and shared it on GitHub to make it available for others. There is documentation on GitHub.