
Despite all of the advances in Web service and proxy technologies over the past few years, I still find that many customers prefer the tried-and-true ALE/IDoc technology stack. Consequently, a frequent administrative headache is the upload of IDoc metadata using Transaction IDX2. In this blog, I will demonstrate a simple report program that can be used to automate this task.

h4. What is IDoc metadata, anyway?

If you haven't worked with the PI IDoc adapter before, then a brief introduction is in order. As you know, all messages that flow through the PI Integration Engine pipeline are encoded using XML. Thus, besides providing raw connectivity, adapters such as the IDoc adapter must also perform the necessary transformations of messages so that they can be routed through the pipeline. In the case of the IDoc adapter, the raw IDoc data (you can see how the IDoc data is encoded by looking at the signature of function module IDOC_INBOUND_ASYNCHRONOUS) must be transformed into XML. Since the raw IDoc data does not carry information about segment field names and the like, this metadata must be imported at configuration time so that the IDoc adapter can perform the XML transformation efficiently.
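To make the transformation concrete, here is roughly what a material master IDoc (basic type MATMAS05) looks like once the IDoc adapter has rendered it as IDoc-XML. The segments and fields are abbreviated and the values are made up; it is the segment/field names (E1MARAM, MATNR, etc.) that the adapter can only produce if the metadata has been imported:

```xml
<MATMAS05>
  <IDOC BEGIN="1">
    <EDI_DC40 SEGMENT="1">
      <TABNAM>EDI_DC40</TABNAM>
      <IDOCTYP>MATMAS05</IDOCTYP>
      <MESTYP>MATMAS</MESTYP>
    </EDI_DC40>
    <E1MARAM SEGMENT="1">
      <MATNR>000000000000001234</MATNR>
      <MTART>FERT</MTART>
    </E1MARAM>
  </IDOC>
</MATMAS05>
```

Without the metadata, the adapter has only the flat segment data from the control/data records and cannot map it onto named fields like these.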

From a configuration perspective, all of this happens in two transactions:

1. In Transaction IDX1, you create an IDoc adapter port, which essentially provides the IDoc adapter with an RFC destination it can use to introspect the IDoc metadata from the backend SAP ALE system.
2. In Transaction IDX2, you import IDoc types using the aforementioned IDoc adapter port. Here, you can import standard IDoc types, custom IDoc types, or even extended types.

If you're dealing with a handful of IDocs, then perhaps this isn't such a concern. However, if you're dealing with tens or hundreds of IDocs and a multitude of PI systems, then this process can become tedious in a hurry.

h4. Automating the Upload Process

Now, technically speaking, the IDoc adapter is smart enough to utilize the IDoc port definition to load and cache IDoc metadata on the fly. However, what it won't do is detect changes to custom IDocs/extensions. Furthermore, if you have scenarios during cutover which block RFC communications, not having the IDoc metadata in place can lead to unexpected results. The report below can be used to automate the initial upload or to execute a kill-and-fill that pulls in the latest and greatest changes. In reading through the comments, you can see that it essentially takes two inputs: the IDoc adapter port defined in IDX1 and a CSV file from your frontend workstation that defines the IDoc types to import. Here, you just need to create a two-column CSV file containing the IDoc type in column 1 and the extension type (if any) in column 2.
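For example, a CSV file covering two standard basic types plus a hypothetical customer master extension (ZDEBMASX is just an illustrative name) would look like this:

```text
MATMAS05,
CREMAS05,
DEBMAS06,ZDEBMASX
```

Rows without an extension simply leave column 2 empty.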

REPORT zidx_idoc_load_metadata.

*&---------------------------------------------------------------------*
*& Local Class Definitions                                             *
*&---------------------------------------------------------------------*
CLASS lcl_report DEFINITION CREATE PRIVATE.
  PUBLIC SECTION.
    CLASS-METHODS:
      " Used in the selection screen definition:
      get_frontend_filename CHANGING ch_file TYPE string,

      " Public static method for running the report:
      execute IMPORTING im_idoc_types_file TYPE string
                        im_idoc_port       TYPE idx_port.

  PRIVATE SECTION.
    " Class-Local Type Declarations:
    TYPES: BEGIN OF ty_idoc_type,
             idoc_type TYPE string,
             ext_type  TYPE string,
           END OF ty_idoc_type,

           ty_idoc_type_tab TYPE STANDARD TABLE OF ty_idoc_type
             WITH DEFAULT KEY.

    " Instance Attribute Declarations:
    DATA: idoc_port  TYPE idx_port,
          idoc_types TYPE ty_idoc_type_tab.

    " Private helper methods:
    METHODS:
      constructor IMPORTING im_idoc_port TYPE idx_port,
      upload_idoc_types IMPORTING im_idoc_types_file TYPE string
                        RAISING   cx_sy_file_io,
      import_idoc_metadata,
      remove_idoc_metadata IMPORTING im_idoc_type TYPE string.
ENDCLASS.


CLASS lcl_report IMPLEMENTATION.
  METHOD get_frontend_filename.
    " Local Data Declarations:
    DATA: lt_files       TYPE filetable,
          lv_retcode     TYPE i,
          lv_user_action TYPE i.
    FIELD-SYMBOLS:
      <ls_file> LIKE LINE OF lt_files.

    " Let the user browse for the CSV file on the frontend workstation:
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
      EXPORTING
        file_filter = 'CSV Files (*.csv)|*.csv'
      CHANGING
        file_table  = lt_files
        rc          = lv_retcode
        user_action = lv_user_action
      EXCEPTIONS
        OTHERS      = 1.

    IF sy-subrc EQ 0 AND
       lv_user_action EQ cl_gui_frontend_services=>action_ok.
      READ TABLE lt_files INDEX 1 ASSIGNING <ls_file>.
      IF sy-subrc EQ 0.
        ch_file = <ls_file>-filename.
      ENDIF.
    ENDIF.
  ENDMETHOD.                 " METHOD get_frontend_filename

  METHOD execute.
    " Local Data Declarations:
    DATA: lo_report TYPE REF TO lcl_report,
          lo_error  TYPE REF TO cx_sy_file_io,
          lv_error  TYPE string.

    " Create an instance of the report driver class:
    CREATE OBJECT lo_report
      EXPORTING
        im_idoc_port = im_idoc_port.

    TRY.
        " Parse the CSV file and then import the IDoc metadata:
        lo_report->upload_idoc_types( im_idoc_types_file ).
        lo_report->import_idoc_metadata( ).
      CATCH cx_sy_file_io INTO lo_error.
        lv_error = lo_error->get_text( ).
        MESSAGE lv_error TYPE 'E'.
    ENDTRY.
  ENDMETHOD.                 " METHOD execute

  METHOD constructor.
    " Remember the IDoc adapter port defined in IDX1:
    idoc_port = im_idoc_port.
  ENDMETHOD.                 " METHOD constructor

  METHOD upload_idoc_types.
    " Local Data Declarations:
    DATA: lt_lines TYPE STANDARD TABLE OF string,
          ls_type  TYPE ty_idoc_type.
    FIELD-SYMBOLS:
      <lv_line> LIKE LINE OF lt_lines.

    " Upload the CSV file from the frontend workstation:
    CALL METHOD cl_gui_frontend_services=>gui_upload
      EXPORTING
        filename = im_idoc_types_file
      CHANGING
        data_tab = lt_lines
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc NE 0.
      RAISE EXCEPTION TYPE cx_sy_file_io.
    ENDIF.

    " Column 1 => IDoc type; column 2 => extension type (if any):
    LOOP AT lt_lines ASSIGNING <lv_line>.
      CLEAR ls_type.
      SPLIT <lv_line> AT ',' INTO ls_type-idoc_type ls_type-ext_type.
      APPEND ls_type TO idoc_types.
    ENDLOOP.
  ENDMETHOD.                 " METHOD upload_idoc_types

  METHOD import_idoc_metadata.
    FIELD-SYMBOLS:
      <ls_type> LIKE LINE OF idoc_types.

    " Kill-and-fill: remove any stale metadata, then re-import it via
    " the IDoc adapter port defined in IDX1.
    " (NOTE: the IDX_* report/parameter names below may vary by release;
    "  check them in your system before use.)
    LOOP AT idoc_types ASSIGNING <ls_type>.
      remove_idoc_metadata( <ls_type>-idoc_type ).
      SUBMIT idx_select_idoctyp_without_is
        WITH i_port  = idoc_port
        WITH idoctyp = <ls_type>-idoc_type
        AND RETURN.

      " Import the extension type (if any) in the same fashion:
      IF <ls_type>-ext_type IS NOT INITIAL.
        remove_idoc_metadata( <ls_type>-ext_type ).
        SUBMIT idx_select_idoctyp_without_is
          WITH i_port  = idoc_port
          WITH idoctyp = <ls_type>-ext_type
          AND RETURN.
      ENDIF.
    ENDLOOP.
  ENDMETHOD.                 " METHOD import_idoc_metadata

  METHOD remove_idoc_metadata.
    " Delete any previously imported metadata for the IDoc type:
    SUBMIT idx_delete_idoctyp_without_is
      WITH i_port  = idoc_port
      WITH idoctyp = im_idoc_type
      AND RETURN.
  ENDMETHOD.                 " METHOD remove_idoc_metadata
ENDCLASS.


*&---------------------------------------------------------------------*
*& Selection Screen Definition                                         *
*&---------------------------------------------------------------------*
PARAMETERS:
  p_idxprt TYPE idx_port OBLIGATORY,
  p_ifile  TYPE string LOWER CASE OBLIGATORY.


*&---------------------------------------------------------------------*
*& AT SELECTION-SCREEN Event Module                                    *
*&---------------------------------------------------------------------*
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_ifile.
  CALL METHOD lcl_report=>get_frontend_filename
    CHANGING
      ch_file = p_ifile.


*&---------------------------------------------------------------------*
*& START-OF-SELECTION Event Module                                     *
*&---------------------------------------------------------------------*
START-OF-SELECTION.
  CALL METHOD lcl_report=>execute
    EXPORTING
      im_idoc_port       = p_idxprt
      im_idoc_types_file = p_ifile.

h4. Final Thoughts

I hope you'll find this simple report program useful. Please feel free to try it out, modify it, or do with it what you will. If you have any questions, don't hesitate to contact me. Also, if you are interested in learning more about SAP NetWeaver PI development, then I would encourage you to check out my new book, SAP NetWeaver Process Integration: A Developer's Guide. And if you're more of an e-book kind of person, be on the lookout for the Kindle release of this book coming in the next few days.

3 Comments

    1. Michael Nicholls
      From my understanding, the first time an IDOC arrives at the IDOC adapter, the metadata is loaded from the appropriate system via the IDX1 port definition.
        1. James Wood Post author
          Yes, you are both correct. I should have been more explicit in my description of the utility of this program. A core use case for this program is in dealing with changes to custom IDocs. Though it can be used to upload metadata the first time, it also supports kill-and-fill.

          Another off-the-beaten-path use case is in cutover scenarios in which connectivity to the backend SAP system is disabled while upstream system connectivity remains in effect. In this scenario, I usually lock the communication user used in the RFC destination in the backend SAP system such that IDocs remain queued up in the tRFC/qRFC layers. If the IDoc metadata is not there, and communication is blocked, then the IDocs error out further upstream in the PI IE pipeline.

          I have updated this blog to reflect these points, and appreciate your feedback.

