lopatinvyu
Employee
SAP Analytics Cloud (SAC) provides extensive capabilities for integration with various SAP and non-SAP solutions. These capabilities can be used for scheduled or manual upload of master and transaction data into planning models, including loading from flat files.

However, some use cases require transaction data to be loaded from flat files into an SAC planning model by end users rather than by administrators of the solution. A typical example is a bulk upload of financial data by financial planners, in addition to manual data entry in SAC's table widget. These users usually do not have extended authorizations for model data management, since that could lead to uncontrolled master / transaction data changes.

In this blog we will see how this can be solved using standard SAC capabilities. With this approach end users do not need access to planning model data management. Keep in mind that this flat file upload solution is intended for limited sets of data.

The solution logic

To build the solution with standard SAC components only, we will use a dataset and an analytic application.

First, in step 1, the end user loads a flat file (csv or xls) into a new or existing dataset object. A dataset has a standard capability of loading data from files and does not require master data and / or dimension maintenance. Its interface is user friendly and requires neither specific data modelling knowledge nor extended authorizations (as is the case with a planning model). In addition, data can also be transformed in this step.

Once the data is loaded into the dataset, in step 2, we use an analytic application to review the data and copy it from the dataset into the required planning model.


 

Let's have a look at the technical logic of this solution in detail.

Step 1. Loading data from a flat file into a dataset

A dataset is used for the initial data upload into SAC. It can be created by end users themselves or pre-created for them. End users can also create several datasets, for example for different versions of data.


Importing / Re-importing data into Dataset


In addition, a dataset has the following useful characteristics in the context of our task:

  • Can be created via drag & drop of a csv or xls file onto the SAC interface.

  • No master data or dimension / measure maintenance is needed.

  • Can contain data transformation / conversion rules (to enable logic similar to transformation / conversion files in BPC Standard).

  • Re-usable: data can be re-loaded and all previously created transformation rules will be applied.



Transformation rules in Dataset


As a result of this step, we have the data from the flat file loaded into the dataset.

Step 2. Loading data from the dataset into the planning model with an analytic application

We now have the data from the flat file loaded into the dataset and transformed. Next, we need to load this data into our planning model. We will use the analytic application for that.

The interface of the analytic application consists of 2 parts, logically split into 2 tabs using the Tab Strip widget:

Tab 1. Opening dataset and preview


Open dataset and preview data tab


In this interface we have a data preview table and a button that opens the dataset selection dialog.

Once we click the Open Dataset button, the Open Dataset / Model dialog appears. In this dialog we select the source dataset containing the data uploaded in step 1. Once that is done, the data is shown in the preview table and the relevant filters are applied to the result table in Tab 2 (so that the result table shows only the part of the planning model relevant for the data upload).

Tab 2. Copying data from preview dataset table to planning model table

When we click the Next button, the result preview interface is shown. Here we click the Load button, after which the data from the preview table tblPreview is copied into the result table tblResult, record by record. We can then review the data, edit it in the table, and Publish or Revert it using the standard SAC interface.
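
The Next button on Tab 1 only needs to switch the active tab. A minimal onClick sketch, in which the widget name TabStrip_1 and the tab key Tab_Result are assumptions to be replaced with the actual names used in the application:

// Switch to the result tab (widget and key names are assumptions)
TabStrip_1.setSelectedKey("Tab_Result");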


Result preview and Publish / Revert tab


The result of step 2 is data published into the planning model.

Now let's have a look at the code behind these interfaces. Keep in mind that this code is for illustration purposes, hence some parameters are hardcoded. If needed, they can be derived dynamically via parameters or based on specific logic / mapping.

Tab 1 - Button "Open Dataset.." (btnOpenDataset), onClick event:
// Open dataset / model dialog to allow user to select the required source dataset
tblPreview.openSelectModelDialog();

// Making all dataset dimensions visible in the table
var dimList = tblPreview.getDataSource().getDimensions();
for (var i = 0; i < dimList.length; i++)
{
    tblPreview.addDimensionToRows(dimList[i], i);
}

// Making Date hierarchy flat, as by default it opens YQM
tblPreview.getDataSource().setHierarchy("Date","@FlatHierarchy");

// Showing preview table with dataset data
tblPreview.setVisible(true);

// Populating values from dataset into arrays for applying them as filters for the target table
var rsData = tblPreview.getDataSource().getResultSet();

var accMembers = ArrayUtils.create(Type.string);
var ccMembers = ArrayUtils.create(Type.string);
var dsMembers = ArrayUtils.create(Type.string);
var dateMembers = ArrayUtils.create(Type.string);
var baMembers = ArrayUtils.create(Type.string);

// Going through all records of dataset and assigning values to arrays
for (var k = 0; k < rsData.length; k++)
{
    accMembers[k] = "[Account].[parentId].&[" + rsData[k]["Account"].id + "]";
    ccMembers[k] = "[Cost_center].[CC_H].&[" + rsData[k]["Cost_Center"].id + "]";
    dsMembers[k] = "Input";
    dateMembers[k] = rsData[k]["Date"].id.substring(0,4) + rsData[k]["Date"].id.substring(5,7);
    baMembers[k] = rsData[k]["Business_Area"].id;
}

// Setting filters of Result table on the second page to show only relevant PoV
tblResult.getDataSource().setDimensionFilter("Account",accMembers);
tblResult.getDataSource().setDimensionFilter("Cost_center",ccMembers);
tblResult.getDataSource().setDimensionFilter("Data_source",dsMembers);
tblResult.getDataSource().setDimensionFilter("Date",dateMembers);
tblResult.getDataSource().setDimensionFilter("Business_Area",baMembers);

// Showing result table with applied filters
tblResult.setVisible(true);




 

Tab 2 - Button "Load" (btnLoad), onClick event:
// Copy lines & submit

// Get set of data from preview table with loaded dataset
var rsData = tblPreview.getDataSource().getResultSet();

// One by one copy values from source table into result (target) table
for (var i = 0; i < tblPreview.getRowCount(); i++)
{
    // Logic to read records from the source table and enter them into the target table

    // Reading values of Account and Cost center and adjusting them to include hierarchy elements
    // In case needed, the code can be enhanced with dynamic hierarchy derivation
    var Account = "[Account].[parentId].&[" + rsData[i]["Account"].id + "]";
    var CostCenter = "[Cost_center].[CC_H].&[" + rsData[i]["Cost_Center"].id + "]";

    // Writing values into the target table with the planning model
    tblResult.getPlanning().setUserInput({
        "Account": Account,
        "Business_Area": rsData[i]["Business_Area"].id, // Business Area dimension
        "Data_source": "Input", // Hardcoded for simplicity. In case needed, the logic is the same as for Account and Cost center
        "Version": "public.Actual", // Hardcoded for simplicity. In case needed, the logic is the same as for Account and Cost center
        "Date": rsData[i]["Date"].id.substring(0,4) + rsData[i]["Date"].id.substring(5,7), // Date dimension
        "Cost_center": CostCenter,
        [Alias.MeasureDimension]: "SignedData"}, rsData[i][Alias.MeasureDimension].rawValue); // Value
}

// Submitting the data to show it in table
tblResult.getPlanning().submitData();
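
As mentioned above, the hardcoded parameters can also be derived dynamically. A minimal sketch for the Version member, assuming a (hypothetical) Dropdown widget named ddVersion is added to Tab 2 and filled with version IDs:

// Read the target version from a dropdown instead of hardcoding "public.Actual"
// (the widget name ddVersion is an assumption)
var targetVersion = ddVersion.getSelectedKey();
// targetVersion is then passed as the "Version" property in the setUserInput() call above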

 

Additional extensions and limitations

The logic and code above illustrate the basic scenario. To make the application more robust or more advanced, the following pieces of logic can be added:

1. Checking master data when opening the dataset, and showing the missing master data members and/or adding them. Example code for checking Accounts / Cost centers can look as follows and needs to be added to the btnOpenDataset onClick event:
// Forming the Accounts array
var arrDestAcctMembers = ArrayUtils.create(Type.MemberInfo);
arrDestAcctMembers = tblResult.getDataSource().getMembers("Account");
var strDestAcctMembers = ArrayUtils.create(Type.string);
for (k = 0; k < arrDestAcctMembers.length; k++)
{
    strDestAcctMembers[k] = arrDestAcctMembers[k].id;
}

// Forming the Cost Centers array
var arrDestCCtMembers = ArrayUtils.create(Type.MemberInfo);
arrDestCCtMembers = tblResult.getDataSource().getMembers("Cost_center");
var strDestCCtMembers = ArrayUtils.create(Type.string);
for (k = 0; k < arrDestCCtMembers.length; k++)
{
    strDestCCtMembers[k] = arrDestCCtMembers[k].id;
}

// Account / Cost center existence check
for (k = 0; k < rsData.length; k++)
{
    if (strDestAcctMembers.indexOf("[Account].[parentId].&[" + rsData[k]["Account"].id + "]") === -1)
    {
        lbLog.addItem("A" + k.toString(), "Account " + rsData[k]["Account"].id + " does not exist in target model");
    }

    if (strDestCCtMembers.indexOf("[Cost_center].[CC_H].&[" + rsData[k]["Cost_Center"].id + "]") === -1)
    {
        lbLog.addItem("C" + k.toString(), "Cost center " + rsData[k]["Cost_Center"].id + " does not exist in target model");
    }
}

lbLog.addItem("done", "Dataset check completed");
lbLog.setVisible(true);

The listbox lbLog is used here to show the log of the checked master data members.

2. The described approach works via the frontend, hence there are data volume limitations. The above solution is suitable for datasets of up to 500 records without any additional actions. If the number of records exceeds 500, we need to edit the drill limitation of the preview table on the first tab:


Editing drill limitation


If the number of records exceeds ~1.5k, the logic should be adjusted to copy the data from tblPreview to tblResult in batches (a sketch is shown after this list).

3. In addition to the master data check, extended validation rules can be added, e.g. to allow loading only selected combinations of master data members (see the sketch after this list).

4. Adding filters on Tab 1 can also make it possible to copy only a selected subset of the data.
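
The batching mentioned in point 2 could roughly look as follows in the onClick event of the Load button. This is a minimal sketch; the batch size of 500 is an assumption and should be adjusted to the model and data volume:

// Copy records in batches and submit after each batch (batch size is an assumption)
var batchSize = 500;
var recordsInBatch = 0;
var rsData = tblPreview.getDataSource().getResultSet();

for (var i = 0; i < rsData.length; i++)
{
    var Account = "[Account].[parentId].&[" + rsData[i]["Account"].id + "]";
    var CostCenter = "[Cost_center].[CC_H].&[" + rsData[i]["Cost_Center"].id + "]";

    tblResult.getPlanning().setUserInput({
        "Account": Account,
        "Business_Area": rsData[i]["Business_Area"].id,
        "Data_source": "Input",
        "Version": "public.Actual",
        "Date": rsData[i]["Date"].id.substring(0,4) + rsData[i]["Date"].id.substring(5,7),
        "Cost_center": CostCenter,
        [Alias.MeasureDimension]: "SignedData"}, rsData[i][Alias.MeasureDimension].rawValue);

    recordsInBatch = recordsInBatch + 1;

    // Submit after every full batch and once more after the last record
    if (recordsInBatch === batchSize || i === rsData.length - 1)
    {
        tblResult.getPlanning().submitData();
        recordsInBatch = 0;
    }
}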
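
A simple illustration of the extended validation from point 3, added to the check in the btnOpenDataset onClick event. The allowed Business Area values "BA10" and "BA20" are hypothetical and only show the pattern:

// Hypothetical rule: only the listed Business Areas are allowed for upload
var allowedBAs = ArrayUtils.create(Type.string);
allowedBAs[0] = "BA10";
allowedBAs[1] = "BA20";

for (var m = 0; m < rsData.length; m++)
{
    if (allowedBAs.indexOf(rsData[m]["Business_Area"].id) === -1)
    {
        lbLog.addItem("B" + m.toString(), "Business Area " + rsData[m]["Business_Area"].id + " is not allowed for upload");
    }
}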

Conclusion

This approach shows how the combined functionality of a dataset and an analytic application can be used to create a solution for flat file upload by end users, without giving them access to planning model data management and without involving additional solutions or components.

Additionally, it can easily be extended with extra logic based on specific business requirements.

 