
How-to… Load a File into SAP NetWeaver BI Integrated Planning (Part 1)

Update (August 2014):

Version 3 of the how-to guide and solution has been released. Please see the blog at

Note: The old version 2 is not compatible with SAP BW 7.4 or higher. If you are planning to upgrade to 7.4, you will have to install version 3!


Hello Everyone,

It has finally arrived. The brand new how-to paper for loading files into BI Integrated Planning is available!


Version 2.1 of the solution has been released. Please see How-to… Load a File into SAP NetWeaver BI Integrated Planning (Part 2) for download.

Business Scenario

If you are using SAP NetWeaver BI Integrated Planning you quite often have the requirement to load external data into your planning application. Typically data is loaded into SAP NetWeaver BI using the data staging functionality. However, you are looking for an easier way to upload data from a file that can be performed by business planners.

The how-to paper describes a solution that allows business planners to load a file directly into their planning application using a web browser. The following screen shows the user interface. The planner is prompted for the file name and simply chooses “Upload”. The system will load the file and validate its content. If there are no errors the planner can save the changes. The newly loaded data is immediately available in planning applications and queries.

Screen Shots

The first screen shot shows the starting point for the upload. The variables – in this case “Planning Version” – are rendered based on the variables you define in the filter. Typically you would include some organizational criteria like cost center, too. Then simply enter the selection criteria, pick a file name to load, and click “Upload”.

Upload Begin

Now the system loads the file from the PC and processes it with a planning function. This includes various consistency checks and optional conversions. All characteristic relationships are also processed at this time (validation and derivation). In the log you can see how many records were loaded.

Upload Middle

Finally, you can choose “Save” to keep the changes or “Undo” the load in case you want to start over.

Upload Finish

Additional Features
  • Various file formats: tab-delimited, fixed length, or XML
  • Various code pages: Unicode, UTF-8, UTF-16, and other code pages
  • Option to overwrite or append to existing data, including a delta mode
  • Text files with or without header line
  • Conversion of numbers and dates according to user settings
  • Conversion of characteristic values according to conversion routines (for example ALPHA conversion or external to internal material numbers)
  • Conversion of amounts according to currency format configuration
  • Flexible definition of fields and field order in upload file
  • Support for virus scanning (if installed with SAP NetWeaver)

After the initial implementation there are no program changes required in order to use these features. All options can be configured using the Planning Modeler.

Your Feedback

I hope you enjoy the new solution. In case of problems, please shoot me an e-mail. If you have requests for enhancements or ideas for additional features please post them as comments to this blog.

Bug Fixes

Previously published program errors have been fixed in version 2.1 of the solution. Please see How-to… Load a File into SAP NetWeaver BI Integrated Planning (Part 2) for additional information.

  • Hi Marc,
    I want to close the request after the Save button is hit.

    Do you see any reason why I should not add function module ‘RSAPO_CLOSE_TRANS_REQUEST’ to the ONACTIONON_SAVE method, or would it be better to trigger an event to start a process chain to close the request?

    * Save data
      lr_data_area->save( ).

    * Close request
      IF sy-subrc = 0.
        i_infocube = 'ZFCT_R01'.
        " call function 'RSAPO_CLOSE_TRANS_REQUEST' here
      ENDIF.

    * Data saved successfully. You may close the window now.
      add_message( i_msgty = 'S' i_message = '035' ).


    • Almost, Sandy.

        data: lr_olap_area type ref to cl_rsr_olap_area.
        lr_olap_area = lr_data_area->get_olap_area( ).
        IF lr_olap_area->n_save_error = rs_c_false.
          " Close request here…
          " Data saved successfully. You may close the window now.
          add_message( i_msgty = 'S' i_message = '035' ).
        ENDIF.

      Please keep in mind that users might create many requests that way, and if two users try to do it at the same time, you might run into deadlock issues.
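      Putting the pieces of this thread together, a hedged sketch of the save handler might look like the following. The InfoCube name 'ZFCT_R01' comes from the question above; the interface of 'RSAPO_CLOSE_TRANS_REQUEST' should be verified in your release, and the exact surrounding code depends on your version of the how-to solution:

      ```abap
      DATA: lr_olap_area TYPE REF TO cl_rsr_olap_area.

      " Save data first
      lr_data_area->save( ).

      " Only close the request if the save was successful
      lr_olap_area = lr_data_area->get_olap_area( ).
      IF lr_olap_area->n_save_error = rs_c_false.
        " Close the open request of the real-time InfoCube
        CALL FUNCTION 'RSAPO_CLOSE_TRANS_REQUEST'
          EXPORTING
            i_infocube = 'ZFCT_R01'.
        " Data saved successfully. You may close the window now.
        add_message( i_msgty = 'S' i_message = '035' ).
      ENDIF.
      ```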


  • Hi Marc,

    We are using the version 2.2 for the file upload and on SAPKW70016.

    When the data is loaded and the user hits Save, the values are saved successfully. Perfect.
    But we see a dump in the backend in ST22 for every save the user does.

    Dump analysis:
    Information on where terminated:
    Termination occurred in the ABAP program “CL_WDR_MAIN_TASK==============CP” –
    The main program was “SAPMHTTP “.

    In the source code you have the termination point in line 6
    of the (Include) program “CL_WDR_MAIN_TASK==============CM00W”.

    Source Code:
        2   data: l_active_window     type ref to cl_wdr_window,
        3         l_resume_plug       type wdy_rr_iobound_plug.
        5 * get resume plug (there can only be one resume plug for window)
    >>>>>   l_resume_plug   = wdr_task=>application->application_window->view_manager->window_info->ge
        7   if l_resume_plug is not initial.
        8     is_suspended = abap_true.
        9   else.
       10     is_suspended = abap_false.
       11   endif.
       13   l_active_window = wdr_task=>application->application_window->get_active_modal_window( ).
       14   l_active_window->phase_model->set_appl_suspend_requested( abap_true ).
       15 *    if l_active_window->application_state_change->data_saved = abap_true.
       16 *      keep_session = abap_false.
       17 *    else.
       18   keep_session = abap_true.
       19 *    endif.
       20 endmethod.

    We will be upgrading to EHP 1, Stack 3 in next few weeks.

    Could you please let me know:
    1) if importing version 2.4 would solve this problem, or
    2) if upgrading to EHP1 would solve this issue, or
    3) whether to apply any SAP Notes / correction instructions.

    This is not a show stopper as of now since the users can still save the plan data. It’s only the dumps in the backend that bother me.

    Thanks in advance.

    • Hi Syam,

      I have not seen this before. It will most likely be solved with your upgrade. However, I don’t understand why you would upgrade to EhP1 SP 3 when SP 5 is already out. I highly recommend upgrading to the most recent SP.


  • Hi Marc Bernard,
    I saw that in your class ZCL_RSPLF_FILE_UPLOAD there are static attributes.
    My question is: if more than one user uploads data at about the same time, wouldn’t the static attributes in the class always be updated by the latest uploader?
    If this is the case, it is going to create problems, because the planning function settings such as Overwrite, Skip Header, and Convert Fields are all declared as static attributes.
    • Hi Alfonso,

      ABAP memory (i.e. static variables) is user specific and not shared. It would only be a problem if a user tries to run several file uploads in the same session and at the same time (which is very hard to do).


  • Hi Marc,
    you have helped us on previous occasions, so I would like to resort to your good advice once more.
    We are using the flat file load for a large number of applications, but now we have the requirement to capture in a table a log of the files loaded using the IP flat file load (including some key fields from the file like month, company code…).
    In principle this should be quite easy, as we have most of the data available at the end of EXECUTE or FINISH_EXECUTION, including the file name, and we can write this to a table.
    The problem is that the user might not save the loaded data, but the log table would already have been updated with a record which is not true.
    So basically we would like to save this log table when the user presses “Save” on the Web Dynpro screen. Would this be possible?
    • Hello Carles,

      you are right, the exit function is not a good place. Also consider that you might include other planning functions after the upload in the planning sequence. The best place is to change the ONACTIONON_SAVE method of the Web Dynpro view FILE_UPLOAD_VIEW. With the wd_assist object you can get the file name, type, and even the data (wd_assist->n_r_data).

      I will consider putting some kind of audit log into a future version.


  • Hi Marc,

    I have a characteristic field called Comment; it doesn’t have any master data or texts. When I try to upload the file, the upload throws the message “Invalid characteristic values” (RSPLF 017). This error message also appears when trying to upload characteristic values which are not available in the relevant master data tables. Please advise.


  • Hi,

    I am using the planning upload template (Marc’s zrsplf_file_upload), and I use a hierarchy node variable as parameter so that the lock is done only on hierarchy node level.

    When I try to use a cost element hierarchy node variable I get the message below.

    Characteristic restrictions “Costelmnt Hier Node …” for characteristic “Cost Element 0COSTELMNT” are complex or contain errors; direct entry for selections is no longer possible.

    It is a warning message, but when I execute the HTTP link to upload the file, I can fill the other filter variables (values on characteristics, not hierarchy) but I don’t see the hierarchy node variable to fill.

    So did I do something wrong, or is it impossible to use a hierarchy node variable in the filter of the planning upload template?

    Thanks a lot.

  • Hello Marc,

    I followed your how-to guide to create the file upload planning function. When I try to test the planning sequence, this error remains and I can’t find a solution: File Upload: Turn on “Log on with Attached SAPGUI” setting.

    Thank you for your help.

  • Hello everyone!
    I have a question about reading data by the planning function. There is an overwriting mode, and after uploading a file with 2 records we get this message:

    28 records read, 0 generated, 2 changed, 26 deleted

    I cannot understand which data we read from our cube. Is there any dependence between our filter, aggregation level, file structure and the records that are already stored in the cube? And which data will be removed from the cube after pressing the Save button?

    The question arises because I cannot find exactly 28 records in the cube using only the filter or only the file structure… could you explain these relations to me, please?

    • Hello Elena,

      the upload works exactly like any other planning function. The data is read based on the filter, which is applied to the aggregation level. You can replicate the selection using transaction LISTCUBE (be sure to select exactly the fields of the aggregation level and enter the same filter criteria; also turn on the option “DB aggregation”). The file structure or file data does not impact the selection at all.

      The overwrite mode replaces all existing records with the upload data. In your case, the two records already exist with different key figure values, so these are “changed”. The other 26 records are “deleted”.


      • Marc, thanks a lot! But can I ask one more question about it?
        For example, there is a characteristic “Object” in my aggregation level. In the filter there is no restriction on it.
        In the cube we have several records that match our filter and aggregation level and contain three different values of “Object” – GTSAM_01, GHPHG, GT_GS.
        In my file I’m trying to upload one record with the GTSAM_01 value of this characteristic. Will the other two records (GHPHG and GT_GS) be deleted from the cube, or will the function compare the whole characteristic set and not delete records containing values that don’t match the file?
        • It’s described in chapter 4.5 of the how-to guide. Overwrite mode = X will delete the records for GHPHG and GT_GS. Mode = D will keep them.


    • Hello Orcamento,

      yes, the upload works with 7.3.

      If you want to install it to a new 7.3 system, you have to select the option “Ignore Invalid Component Version” when importing the transport request.


  • Hi Marc,

    We are trying to implement the file upload using BEx Analyzer and we are struggling.

    Can we post our problem here, or do you prefer by email.

    If so please send your email to

    Your help is very appreciated.

    Thanks in advance,

  • Hello,

    we are trying to use the functionality on BW7.3 and are facing a problem when importing the transport:

    “The software component MDM_TECH is not installed on the system.”

    SAP NetWeaver Master Data Management is not licensed on the client’s system. All other components exist and could be handled with the option “Ignore Invalid Component Version”.

    Is there a way for us to use this solution?



    • Hello Franz,

      I have been able to successfully import it into a 7.3 system that did not include MDM_TECH using the “Ignore Invalid Component Version” option. Sounds like this is not working for you. Please send me the log file(s) via email.



  • Hello Marc,

    thank you. With the option “Ignore Invalid Component Version” we could do it, also without MDM_TECH. And it works now on a 7.3 system.

    Another question: do you have any experience with integrating the solution into BO Analysis for Office instead of BEx? We are using BOA 1.2 SP7 at the moment.



    • Good question, unfortunately I have not had the time to look into this much but I doubt that there’s an easy solution since the option to “logon with SAPGUI” doesn’t exist in AO. All you can do is include a URL link in the workbook that will open the File Upload Web Dynpro.



  • Hi Marc,

    I am working on the SAP IP file upload functionality and would like to load a file with 2 million records.
    I saw your answers in some of the discussions but could not get a straight answer on whether there is any file size limitation for uploads through the SAP IP upload function.
    I appreciate your response.


    • Hi Shirish,

      the file size is limited by the amount of memory available for the upload function on the BW application server. The file is loaded completely before it is processed with the planning function and therefore has to fit into memory completely (actually a couple of times, depending on the upload options). As mentioned in the paper, the upload is therefore not meant for mass data. You can give it a try with that much data, but if it fails with out-of-memory errors, there’s nothing I can do about it and you will have to use the regular file upload in the BW Administrator Workbench.

      SAP Customer Solution Adoption (CSA)

      • Thanks Marc,

        Recently we updated the GUI patch to SAP-SAP_GUI_Patch_3-7.20-B. After applying this patch we are not able to load IP files with the upload functionality. The system shows an error:

        ‘Query !!1ZSEM_C18 could not be opened’ (where ZSEM_C18 is our transactional cube) with message no. BRAIN635.

        Before this it was working fine.

        Is there any GUI patch compatibility issue with real-time InfoCubes?

        Could you please guide how this could be resolved?

        After further investigation we observed that even a query on this aggregation level gives the same error, with an additional error in program RSR_BAD_CODING: the type G_S_D004 is unknown.

        The transactional cube contains two key figures of data type Number with exception aggregation “Last Value”.

        If I remove these key figures from the cube, the solution works without error.

        Could it be that the Web Dynpro definition does not comply with the special aggregation methods?


  • Hi Marc,

    I really love this feature. I got it all to work, but now I am struggling with the fact that I have some constant values set in my planning cube. I get the message:

    The characteristic Item category group[ZZITEMCAT] has to have a fixed value
    3 in InfoProvider ZTI1_SPPL

    Message no. BRAIN155


    InfoProvider ZTI1_SPPL , the characteristic Item category group[ZZITEMCAT] is
    uniquely set to the value 3 (COB_PRO-CHACONST = 3). A change data record with
    Item category group[ZZITEMCAT] = is thus not permitted.

    There is most likely an error in the input readiness controls.

    This field is forced into my aggregation level by the system, but I cannot set a value in the filter (as there is the constant), and when I load my file (without those fields) I get the message.

    Any Ideas?


    • Hello “Team BW”,

      this is indeed a new issue. I have developed a solution in the next version of the file upload which I can send you.


      SAP Customer Solution Adoption (CSA)

  • Hi Marc,

    the file upload stopped working in 7.4. I assume the problem is due to the change from CHAR 60 to SSTRING 1333 (as described in Note 1823174).

    When opening the upload dialog, the following happens (ST22):

    The current ABAP program “CL_WDR_CLIENT_COMPONENT=======CP” had to be terminated because it came across a statement that unfortunately cannot be executed. In include “/1BCWDY/B_VMPBH76O4E2HX2VLEPZF”, in line 1544 of program “/1BCWDY/VMPBHU4QWZTF0R7WUM20==CP”, the following syntax error occurred: “LOW_EXT” must be a character-type field (data type C, N, D, or T).

    The error occurs in line 108 of method ONACTIONON_UPLOAD in FILE_UPLOAD_VIEW:

    WRITE <member> TO ls_range-low_ext.

    LOW_EXT is now of type SSTRING 1333.

    Maybe you already have a new version which incorporates this change?



  • Hi Guys,

    I implemented this as per how to document.

    When I select Upload File, it gives the error “Please enter a valid value for characteristic XXXXX”.

    Please advise me on this

  • Hi Marc,

    A client of ours recently migrated to SAP BW on HANA and are using integrated planning functionality. We are getting the following error while using file upload function.

    Errors occurred while executing Planning sequence to upload scan file Test.

    Message no. RSPLS411

    BW lock server: No server with enqueue process.

    Can you please advise how to trace the functionality while loading file? Your quick response is much appreciated.

    The solution worked in Dev and QA environments. We are on BW 731 SP11.

    Thanks & Regards


    • Hello Govind,

      please check the settings in transaction RSPLAN > Goto > Manage Lock Server. Also the system must have an ENQ process configured in the system profile (you should see this process in SM50).

      SAP Strategic Customer Engagements (SCE)

  • Hi Marc,

    I have a problem with the file upload prompt in a Web template. But it works in a BEx Query workbook. Not sure if it’s a limitation or some settings need to be done for the Web template. Can you advise? Thanks.


    Upload File Planning Function4.jpg

    • Hi Yuen,

      you can’t use the same planning function for BEx workbooks and web templates. For BEx workbooks you set the filename parameter to PROMPT (or a fixed filename). For web templates you have to leave the filename parameter empty. Just create two functions (and two sequences).


      SAP Strategic Customer Engagements (SCE)

  • Hi Marc,

    I am trying to find documentation or explanations on the transformation routines (LZRSPLF_FILE_UPLOADUXX) to understand how they work.

    I want to see if it is possible to add messages based on the ABAP code written in the LZRSPLF_FILE_UPLOADUXX includes…




    • Hello Gerald,

      the code to parse the file data is based on the standard function module GUI_UPLOAD. You can get some documentation for this function via transaction SE37.
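
      For reference, a minimal call of GUI_UPLOAD that reads a tab-delimited text file into a string table looks roughly like the following sketch. The file path is a placeholder, and the actual how-to code wraps this differently:

      ```abap
      DATA: lt_data    TYPE TABLE OF string,
            l_filename TYPE string VALUE 'C:\temp\plan_data.txt'.

      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename            = l_filename
          filetype            = 'ASC'      " plain text file
          has_field_separator = 'X'        " fields are tab-delimited
        TABLES
          data_tab            = lt_data    " one table line per file line
        EXCEPTIONS
          file_open_error     = 1
          file_read_error     = 2
          OTHERS              = 3.
      IF sy-subrc <> 0.
        MESSAGE 'File upload failed' TYPE 'E'.
      ENDIF.
      ```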

      I do not see any option to add messages, however. If you have a specific case, send me an example via email.

      Product Management SAP EDW (BW/HANA)

  • Hello Marc,

    we’re using the file upload via Web Dynpro. In addition we use the BADI RSR_VARIABLE_F4_RESTRICT_BADI to restrict the F4 help. The F4 restriction works well in Excel or RSPLAN, but not in the Web Dynpro.

    Is there some service which needs to be switched on? The BADI just does not seem to get called.



    • Hi Michael,

      unfortunately, the BADI is not supported in the upload Web Dynpro. The Web Dynpro uses the default search help for each InfoObject, which is independent of the variables, but the BADI depends on the variable. Therefore it’s not possible to simply add a call to the BADI.

      I will put this on the to-do list to find some workaround (like a custom search help) but I can’t promise anything.

      Product Management SAP EDW (BW/HANA)

  • Hello Marc,

    I updated to the newest version of your upload/download function.

    But it seems that there is an issue when downloading and uploading floating point key figures.
    In our planning application we need to plan values lower than one cent. Because of this I had to use floating point key figures in the planning.

    When I now download data, the key figures are exported in scientific notation

    (e.g. 5.9999999999999998E-02).

    I am not able to upload these key figures using the upload function.

    Can your planning function type handle floating point key figures?

    I tried both settings for Convert Amounts, but neither works.

    Is there some other setting to be done to get this data downloaded and uploaded in the right way?

    Thanks a lot for your support.



    • Hi Timo

      floating point key figures can be tricky (for example, they behave differently on different platforms). Anyway, I will have to test and debug the upload to find a solution that works for everybody. In the meantime, if you have some ABAP skills, you might be able to code a workaround. It would have to be in class ZCL_RSPLF_FILE_DOWNLOAD, method EXECUTE, in the IF l_s_iobj_pro-datatp = n_c_curr branch. Add ELSEIF l_s_iobj_pro-datatp = 'FLTP' with your own logic.

      Product Management SAP HANA DW

    • Hi Timo,

      you need “Convert Fields” to be ON. This will trigger formatting according to the external display. To use the decimal places setting as defined in the key figure, I added the following else branch in the location I indicated before:

                SELECT SINGLE kyfdecim FROM rsdkyf INTO l_kyfdecim
                  WHERE kyfnm = l_iobjnm AND objvers = rs_c_objvers-active.
                IF sy-subrc = 0 AND l_kyfdecim BETWEEN '0' AND '9'.
                  l_decimals = l_kyfdecim.
                ELSE.
                  l_decimals = 0.
                ENDIF.
      Product Management SAP HANA DW

  • Hi Marc,

    After installing the new version of the loading program we have a problem with:

    • Conversion of numbers and dates according to user settings.

    Any idea what format is expected for the 0FISCPER InfoObject? I tried just about everything (009.2016, 2016009, 92016) and nothing works.

    Best Regards


  • Hi Marc,

    We have a problem: when we upload the file from CSV, it says the value for a characteristic is not valid. From the comments above, you said the value in the CSV file must exist in the corresponding InfoObject’s master data table, otherwise it throws an error. Is it designed like this? You also sent the solution privately to others who encountered the same problem; can you please share the solution in this post so that everyone can see it? Otherwise I will have to report an OSS message to SAP again. Thanks for your help on this case.

    • Hi AMS,

      this is a how-to solution and not covered by standard support. An OSS message won’t help.

      In general, master data must exist for all planning functions to work properly. Please send me an email, if you want some code that you can use at your own risk.

      Product Management SAP DW