SAP BW on HANA – Lookup Semantic Partitioning Object Dynamically
Applies to:
SAP NetWeaver Business Intelligence 7.3 or above. It also applies to SAP BW powered by SAP HANA. For more information, visit the Business Intelligence homepage.
Summary
This document describes a packaged, reusable solution for dynamic data selection from Semantic Partitioning Objects based on DSOs and on InfoCubes.
Author(s): Carlos Basto
Company: Accenture
Created on: 28 November 2013
Author Bio
Carlos Basto is a Consultant at Accenture. He has been involved in SAP Business Warehouse, SAP BusinessObjects and SAP HANA consulting and support projects.
Table of Contents
1. Introducing the Solution
2. Analysis and Reporting
3. A Tough Task with Semantic Partitioning Objects: Looking Up Data
4. Solution: Main Class
5. Solution: Methods
5.1 Method: CONSTRUCTOR
5.2 Method: GET_IDPART
5.3 Method: GET_PART_TAB_NAME
5.4 Method: READ
6. Calling the READ Method in a Transformation
6.1 Calling the method to read data from a Semantic Partitioning Object based on a DSO
6.2 Calling the method to read data from a Semantic Partitioning Object based on an InfoCube
7. Related Content
1. Introducing the Solution
A semantically partitioned object is an InfoProvider that consists of several InfoCubes or DataStore objects with the same structure.
Semantic partitioning is a property of the InfoProvider. You specify this property when creating the InfoProvider. Semantic partitioning divides the InfoProvider into several small, equally sized units (partitions).
A semantically partitioned object offers the following advantages compared to standard InfoCubes or standard DataStore objects:
· Better performance with mass data:
The larger the data volume, the longer the runtimes required for standard DataStore objects and standard InfoCubes. Semantic partitioning means that the data sets are distributed over several data containers. This means that runtimes are kept short even if the data volume is large.
· Close data connection:
Error handling is better. With a standard object, if a request for one region ends with an error, the entire InfoProvider is unavailable for analysis and reporting. With a semantically partitioned object, the separation of the regions into different partitions means that only the region that caused the error is unavailable for data analysis.
· Working with different time zones:
EDW scenarios usually involve several time zones. With a semantically partitioned object, the time zones can be separated by the partitions. Data loading and administrative tasks can therefore be scheduled independently of the time zone.
2. Analysis and Reporting
You can use the semantically partitioned object for reporting and analysis, as you can with any other InfoProvider. You can also choose to only update selected partitions in an InfoCube, for example, or include selected partitions in a MultiProvider and use them for analysis.
3. A Tough Task with Semantic Partitioning Objects: Looking Up Data
The following solution is released for customers and can be used to read data from Semantic Partitioning Objects based on DSOs and on InfoCubes.
Selecting data from an SPO has become a tough task in many companies because of its partition tables: as of release 7.3, SAP BW generates one table per partition of this provider type.
As a consequence, selecting data from such an object requires manually identifying which partition holds the data coming from the source, and several metadata tables have to be read beforehand to identify these partitions correctly in BW code.
This document describes how to implement a custom class and methods that resolve the partitions dynamically through a suitable, reusable approach.
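To make the problem concrete, the snippet below is a minimal sketch (suitable for a quick test report) that lists the partition criteria stored in table RSLPOPARTRANGE and derives the corresponding active table names for a DSO-based SPO, following the /BIC/A<SPO><partition>00 naming convention used later in this document. The SPO name 'SPOXXX' is a placeholder.
DATA: lt_partrange TYPE STANDARD TABLE OF rslpopartrange,
      ls_partrange TYPE rslpopartrange,
      l_parttab    TYPE rsinfoprov.

* Read the partition criteria of the SPO (active version)
SELECT * FROM rslpopartrange
  INTO TABLE lt_partrange
  WHERE lpo     = 'SPOXXX'
    AND objvers = 'A'.

LOOP AT lt_partrange INTO ls_partrange.
* Active table of the generated partition DSO: /BIC/A<SPO><partition>00
  CONCATENATE '/BIC/A' ls_partrange-lpo ls_partrange-idpart '00'
    INTO l_parttab.
  WRITE: / ls_partrange-idpart, ls_partrange-iobjnm,
           ls_partrange-low, ls_partrange-high, l_parttab.
ENDLOOP.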
4. Solution: Main Class
To keep it reusable, the solution is implemented with ABAP Objects. This way, the methods can be reused as many times as needed in other contexts.
The class zcl_rsspo_infoprov should be created as a usual ABAP class with public instantiation.
The next steps describe how to implement this solution.
1) In transaction SE24, maintain the first tab (Properties) accordingly: usual ABAP class, public instantiation.
2) Leave the "Interfaces" and "Friends" tabs blank.
3) In the "Attributes" tab, create the instance attribute o_infoprov (type RSINFOPROV). The constructor method fills it with the name of the SPO passed as its parameter.
4) In the "Methods" tab, create the following four (4) methods (their signatures and implementations are presented in the next sections of this document):
· CONSTRUCTOR
· READ
· GET_IDPART
· GET_PART_TAB_NAME
5) Leave “Events” tab blank.
6) In the "Types" tab, create the following four (4) types:
· TY_PART_LINES
· TY_PART
· L_TABNAME
· L_FACT_VIEW
TYPES:
ty_part TYPE STANDARD TABLE OF RSLPOPARTRANGE.
TYPES:
ty_part_lines TYPE STANDARD TABLE OF RSLPOPART.
TYPES:
l_tabname TYPE RSINFOPROV.
TYPES:
l_fact_view TYPE RSINFOPROV.
7) And finally, leave the "Aliases" tab blank. A sketch of the resulting class definition is shown below, after this list.
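For orientation, here is a minimal sketch of the equivalent source-based class definition, combining the tabs described above (attribute type and visibility are assumptions derived from the constructor and the calling samples later in this document); treat it as a reference, not as generated code.
CLASS zcl_rsspo_infoprov DEFINITION PUBLIC CREATE PUBLIC.
  PUBLIC SECTION.
    TYPES:
      ty_part       TYPE STANDARD TABLE OF rslpopartrange WITH DEFAULT KEY,
      ty_part_lines TYPE STANDARD TABLE OF rslpopart WITH DEFAULT KEY,
      l_tabname     TYPE rsinfoprov,
      l_fact_view   TYPE rsinfoprov.

    DATA o_infoprov TYPE rsinfoprov.   "name of the SPO, filled by the constructor

    METHODS constructor
      IMPORTING i_infoprov TYPE rsinfoprov.
    METHODS read
      IMPORTING i_lpox        TYPE rslponame
                i_objversx    TYPE rsobjvers
                i_iobjnmx     TYPE rsiobjnm
                i_iobj_valuex TYPE rschavl
                i_t_filter    TYPE rsdri_t_range OPTIONAL
                i_aggr_mode   TYPE string DEFAULT 'SUM'
      EXPORTING VALUE(e_data) TYPE REF TO data.
    METHODS get_idpart
      IMPORTING i_lpo        TYPE rslponame
                i_objvers    TYPE rsobjvers
                i_iobjnm     TYPE rsiobjnm
                i_iobj_value TYPE rschavl
      EXPORTING e_idpart     TYPE rslpo_idpart.
    METHODS get_part_tab_name
      IMPORTING i_lpo       TYPE rslponame
                i_idpart    TYPE rslpo_idpart
      EXPORTING e_tabname   TYPE rsinfoprov
                e_fact_view TYPE rsinfoprov.
ENDCLASS.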
5. Solution: Methods
As described before, four (4) methods should be created; their signatures and implementations are given below.
Follow the next steps to implement these methods.
5.1 Method: CONSTRUCTOR
Below are the steps needed to implement this method; the code is also provided.
Parameters:
methods CONSTRUCTOR
importing
!I_INFOPROV type RSINFOPROV .
Exceptions: leave empty or implement as you want.
Method Implementation:
* <SIGNATURE>---------------------------------------------------------------------------------------+
* | Instance Public Method ZCL_RSSPO_INFOPROV->CONSTRUCTOR
* +---------------------------------------------------------------------------------------------------+
* | [--->] I_INFOPROV                     TYPE        RSINFOPROV
* +----------------------------------------------------------------------------------------</SIGNATURE>
method CONSTRUCTOR.
o_infoprov = i_infoprov.
endmethod.
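As a quick usage note, instantiating the class only stores the SPO name in the attribute o_infoprov; the SPO name 'SPOXXX' below is a placeholder:
DATA: o_rsspo_infoprov TYPE REF TO zcl_rsspo_infoprov.

* Create an instance of the lookup class for a given SPO
CREATE OBJECT o_rsspo_infoprov
  EXPORTING
    i_infoprov = 'SPOXXX'.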
5.2 Method: GET_IDPART
Below are the steps needed to implement this method; the code is also provided.
Parameters:
methods GET_IDPART
importing
!I_LPO type RSLPONAME
!I_OBJVERS type RSOBJVERS
!I_IOBJNM type RSIOBJNM
!I_IOBJ_VALUE type RSCHAVL
exporting
!E_IDPART type RSLPO_IDPART .
Exceptions: leave empty or implement as you want.
Method Implementation:
method GET_IDPART.
* +-------------------------------------------------------------------------------------------------+
* Declaration
* +-------------------------------------------------------------------------------------------------+
* start-of-coding ->
* Internal tables & work areas
* +-------------------------------------------------------------------------------------------------+
  DATA: l_logic_part  TYPE TABLE OF rslpopartrange.
  DATA: ls_logic_part TYPE rslpopartrange.
* +-------------------------------------------------------------------------------------------------+
* Fetch the partition criteria from the standard table
* +-------------------------------------------------------------------------------------------------+
  SELECT *
    FROM rslpopartrange
    INTO TABLE l_logic_part
    WHERE lpo     = i_lpo     AND
          objvers = i_objvers AND
          iobjnm  = i_iobjnm.
  IF sy-subrc <> 0.
    MESSAGE e000(zbw) WITH text-000.
  ENDIF.
* Compare the given InfoObject value against the partition ranges to get the partition ID
* +-------------------------------------------------------------------------------------------------+
  LOOP AT l_logic_part INTO ls_logic_part.
    IF i_iobj_value EQ ls_logic_part-low OR
       i_iobj_value BETWEEN ls_logic_part-low AND ls_logic_part-high.
      e_idpart = ls_logic_part-idpart.
    ELSE.
      CONTINUE.
    ENDIF.
  ENDLOOP.
* +-------------------------------------------------------------------------------------------------+
* end-of-coding <-
* +-------------------------------------------------------------------------------------------------+
endmethod.
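A standalone call of this method could look like the sketch below (SPO name, InfoObject and value are placeholders, and o_rsspo_infoprov is an instance created as shown earlier); the exported e_idpart is the ID of the partition whose range in RSLPOPARTRANGE covers the given value:
DATA: l_idpart TYPE rslpo_idpart.

* Determine which partition holds fiscal period 2013008
CALL METHOD o_rsspo_infoprov->get_idpart
  EXPORTING
    i_lpo        = 'SPOXXX'
    i_objvers    = 'A'
    i_iobjnm     = '0FISCPER'
    i_iobj_value = '2013008'
  IMPORTING
    e_idpart     = l_idpart.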
5.3 Method: GET_PART_TAB_NAME
Below are the steps needed to implement this method; the code is also provided.
Parameters:
methods GET_PART_TAB_NAME
importing
!I_LPO type RSLPONAME
!I_IDPART type RSLPO_IDPART
exporting
!E_TABNAME type RSINFOPROV
!E_FACT_VIEW type RSINFOPROV .
Exceptions: leave empty or implement as you want.
Method Implementation:
method GET_PART_TAB_NAME.
* +-------------------------------------------------------------------------------------------------+
  DATA: l_logo  TYPE TABLE OF rslpocomp,
        ls_logo TYPE rslpocomp.
* +-------------------------------------------------------------------------------------------------+
* Fetch the SPO classification (object type of the partitions)
* +-------------------------------------------------------------------------------------------------+
  SELECT *
    FROM rslpocomp
    INTO TABLE l_logo
    WHERE lpo     = i_lpo AND
          objvers = 'A'   AND
          role    = ''.
* +-------------------------------------------------------------------------------------------------+
* Derive the physical name and assign it to the exporting parameters
* +-------------------------------------------------------------------------------------------------+
  LOOP AT l_logo INTO ls_logo.
    IF ls_logo-tlogo = 'ODSO'.
      " DSO-based: active table of the partition DSO
      CONCATENATE '/BIC/A' i_lpo i_idpart '00' INTO e_tabname.
      EXIT.
    ELSEIF ls_logo-tlogo = 'CUBE'.
      " InfoCube-based: fact table and fact view of the partition cube
      CONCATENATE '/BIC/F' i_lpo i_idpart INTO e_tabname.
      CONCATENATE '/BIC/V' e_tabname+6 'J' INTO e_fact_view.
      EXIT.
    ELSE.
      CONTINUE.
    ENDIF.
  ENDLOOP.
* +-------------------------------------------------------------------------------------------------+
* End of coding
* +-------------------------------------------------------------------------------------------------+
endmethod.
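To illustrate the result, the sketch below resolves the physical names for the hypothetical SPO 'SPOXXX' and the partition ID determined by GET_IDPART; the names in the comments are examples only:
DATA: l_idpart    TYPE rslpo_idpart,   "filled by GET_IDPART as shown above
      l_tabname   TYPE rsinfoprov,
      l_fact_view TYPE rsinfoprov.

CALL METHOD o_rsspo_infoprov->get_part_tab_name
  EXPORTING
    i_lpo       = 'SPOXXX'
    i_idpart    = l_idpart
  IMPORTING
    e_tabname   = l_tabname     "e.g. /BIC/ASPOXXX100 for a DSO-based SPO
    e_fact_view = l_fact_view.  "filled only for an InfoCube-based SPO, e.g. /BIC/VSPOXXX1J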
5.4 Method: READ
Below are the steps needed to implement this method; the code is also provided.
Parameters:
methods READ
importing
!I_LPOX type RSLPONAME
!I_OBJVERSX type RSOBJVERS
!I_IOBJNMX type RSIOBJNM
!I_IOBJ_VALUEX type RSCHAVL
!I_T_FILTER type RSDRI_T_RANGE optional
!I_AGGR_MODE type STRING default ‘SUM’
exporting
value(E_DATA) type ref to DATA .
Exceptions: leave empty or implement as you want.
Method Implementation:
method READ.
* +-------------------------------------------------------------------------------------------------+
* Start Coding ->
** Declaration of dynamic tables, structures and lines
* +-------------------------------------------------------------------------------------------------+
  DATA: lo_table TYPE REF TO data,
        lo_lines TYPE REF TO data.
  DATA: s_r_infoprov TYPE REF TO cl_rsdri_infoprov.
  FIELD-SYMBOLS: <lt_table_structure> TYPE ANY TABLE,
                 <l_fs_table>         TYPE STANDARD TABLE.
* +-------------------------------------------------------------------------------------------------+
** Internal tables & work areas
* +-------------------------------------------------------------------------------------------------+
  DATA: l_t_filter            TYPE TABLE OF rsdri_s_range,
        ls_t_filter           TYPE rsdri_s_range,
        l_idpart              TYPE rslpo_idpart,
        l_dynamic_where_field TYPE string,
        l_tabname             TYPE l_tabname,
        l_fact_view           TYPE l_fact_view.
** Call the SPO table metadata methods
* +-------------------------------------------------------------------------------------------------+
  CALL METHOD get_idpart                      "get the partition ID
    EXPORTING
      i_lpo        = i_lpox
      i_objvers    = i_objversx
      i_iobjnm     = i_iobjnmx
      i_iobj_value = i_iobj_valuex
    IMPORTING
      e_idpart     = l_idpart.
  CALL METHOD get_part_tab_name               "get the partition table name
    EXPORTING
      i_lpo       = i_lpox
      i_idpart    = l_idpart
    IMPORTING
      e_tabname   = l_tabname
      e_fact_view = l_fact_view.
* +-------------------------------------------------------------------------------------------------+
* Choose DataStore object (X) or InfoCube to pass to the logic control
  IF l_tabname(6) = '/BIC/A'.
* +-------------------------------------------------------------------------------------------------+
** Create data references to assign the DSO structure dynamically
* +-------------------------------------------------------------------------------------------------+
    CREATE DATA lo_table TYPE TABLE OF (l_tabname).
    ASSIGN lo_table->* TO <lt_table_structure>.
* +-------------------------------------------------------------------------------------------------+
* Start of assigning the data set logic by dynamic tabname (DSO)
* +------------>
    " Filter treatment
    LOOP AT i_t_filter INTO ls_t_filter.
      TRANSLATE: ls_t_filter-low  TO UPPER CASE,
                 ls_t_filter-high TO UPPER CASE.
      APPEND ls_t_filter TO l_t_filter.
    ENDLOOP.
    " Compose the dynamic WHERE clause from the filter lines
    CONCATENATE LINES OF l_t_filter INTO l_dynamic_where_field
      SEPARATED BY space.
    " Fetch data into the dynamic table
    SELECT *
      FROM (l_tabname)
      INTO TABLE <lt_table_structure>
      WHERE (l_dynamic_where_field).
* Assign the selection result to the exporting data reference
* +----------------------------------------------------------------------
    CREATE DATA e_data TYPE TABLE OF (l_tabname).
    ASSIGN e_data->* TO <l_fs_table>.
    <l_fs_table> = <lt_table_structure>.
* +-------------------------------------------------------------------------------------------------+
* Choose DataStore object or InfoCube (X) to pass to the logic control
  ELSEIF l_tabname(6) = '/BIC/F'.
* +-------------------------------------------------------------------------------------------------+
    TYPE-POOLS: rs, rsdrc.
    " Create the dynamic table based on the fact view
    CREATE DATA lo_table TYPE TABLE OF (l_fact_view).
    ASSIGN lo_table->* TO <lt_table_structure>.
    DATA: g_s_sfc   TYPE rsdri_s_sfc,
          g_th_sfc  TYPE rsdri_th_sfc,
          g_s_sfk   TYPE rsdri_s_sfk,
          g_th_sfk  TYPE rsdri_th_sfk,
          g_s_range TYPE rsdri_s_range,
          g_t_range TYPE rsdri_t_range.
    DATA: l_fields  TYPE TABLE OF dfies,
          ls_fields TYPE dfies.
    DATA: v_num_lines TYPE i,
          v_tabix     TYPE sy-tabix.
** +-------------------------------------------------------------------------------------------------+
*** Get the field list of the fact view (into l_fields)
*
    CALL FUNCTION 'DDIF_FIELDINFO_GET'
      DESTINATION sy-sysid
      EXPORTING
        tabname        = l_fact_view
        langu          = sy-langu
        uclen          = '00'
      TABLES
        dfies_tab      = l_fields
      EXCEPTIONS
        not_found      = 1
        internal_error = 2
        OTHERS         = 3.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
        WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    DESCRIBE TABLE l_fields LINES v_num_lines.
** +-------------------------------------------------------------------------------------------------+
*** Start: filling the characteristic and key figure tables
** +------------>
    CLEAR g_th_sfc.
    CLEAR g_th_sfk.
    LOOP AT l_fields INTO ls_fields.
      v_tabix = sy-tabix.
      IF ls_fields-domname(4) NE 'RSKY'.
        " Characteristic field
        DO v_num_lines TIMES.
          CLEAR g_s_sfc.
          g_s_sfc-chanm    = ls_fields-fieldname.
          g_s_sfc-chaalias = ls_fields-fieldname.
          g_s_sfc-orderby  = 0.
          INSERT g_s_sfc INTO TABLE g_th_sfc.
        ENDDO.
      ELSEIF ls_fields-domname(4) EQ 'RSKY'.
        " Key figure field
        DO v_num_lines TIMES.
          CLEAR g_s_sfk.
          g_s_sfk-kyfnm    = ls_fields-fieldname.
          g_s_sfk-kyfalias = ls_fields-fieldname.
          g_s_sfk-aggr     = i_aggr_mode.       " Default 'SUM'
          INSERT g_s_sfk INTO TABLE g_th_sfk.
        ENDDO.
      ELSE.
        EXIT.
      ENDIF.
    ENDLOOP.
** <------------+
*** End: filling the characteristic and key figure tables
** +-------------------------------------------------------------------------------------------------+
    " Filling the filter table
    LOOP AT i_t_filter INTO g_s_range.
      TRANSLATE: g_s_range-low  TO UPPER CASE,
                 g_s_range-high TO UPPER CASE.
      APPEND g_s_range TO g_t_range.
    ENDLOOP.
** +------------> The reading module is called:
    DATA: g_end_of_data TYPE rs_bool,
          g_first_call  TYPE rs_bool.
** +------------> this variable will be set to TRUE when the last data
** +------------> package is read
    g_end_of_data = rs_c_false.
    g_first_call  = rs_c_true.
    DATA: l_infoprov TYPE rsinfoprov.
    l_infoprov = l_tabname+6.
    " Instantiate the reader once; the instance keeps the read cursor
    " across the packaged calls below
    CREATE OBJECT s_r_infoprov
      EXPORTING
        i_infoprov    = l_infoprov
      EXCEPTIONS
        illegal_input = 1
        OTHERS        = 2.
    WHILE ( g_end_of_data NE 'X' ).
      " Note: each call returns the next package into <lt_table_structure>,
      " overwriting the previous one; collect the packages here if more
      " than the last package is needed
      CALL METHOD s_r_infoprov->read
        EXPORTING
*         i_infoprov        = l_infoprov       " Already passed to the instance
          i_th_sfc          = g_th_sfc
          i_th_sfk          = g_th_sfk
          i_t_range         = g_t_range
          i_reference_date  = sy-datum
          i_packagesize     = 1000
          i_authority_check = rsdrc_c_authchk-read
        IMPORTING
          e_t_data          = <lt_table_structure>
          e_end_of_data     = g_end_of_data
        EXCEPTIONS
          illegal_input          = 1
          illegal_input_sfc      = 2
          illegal_input_sfk      = 3
          illegal_input_range    = 4
          illegal_input_tablesel = 5
          no_authorization       = 6
          illegal_download       = 8
          illegal_tablename      = 9
          OTHERS                 = 11.
      IF sy-subrc <> 0.
        BREAK-POINT.                                        "#EC NOBREAK
        EXIT.
      ENDIF.
    ENDWHILE.
    CREATE DATA e_data TYPE TABLE OF (l_fact_view).
    ASSIGN e_data->* TO <l_fs_table>.
    <l_fs_table> = <lt_table_structure>.
  ELSE.
    MESSAGE e398(00) WITH 'This is not a valid semantic partitioning object name'.
    EXIT.
  ENDIF.
* <------------+
* +-------------------------------------------------------------------------------------------------+
* End of coding <-
* +-------------------------------------------------------------------------------------------------+
endmethod.
6. Calling the READ Method in a Transformation
This method is typically used in transformations for lookup purposes, so let's walk through a sample call and see how it fits into the SAP BW data model.
A start routine is a very common place for lookup implementations, which is why the following samples are based on one.
6.1 Calling the method to read data from a Semantic Partitioning Object based on a DSO
** Declaration of dynamic tables, structures and lines
*+----------------------------------------------------------------------
  DATA: r_data TYPE REF TO data.
  FIELD-SYMBOLS: <fs> TYPE STANDARD TABLE.
*+----------------------------------------------------------------------
** Internal tables & work areas
*+----------------------------------------------------------------------
  DATA: o_rsspo_infoprov TYPE REF TO zcl_rsspo_infoprov.
  DATA: l_s_filter TYPE rsdri_s_range,
        l_t_filter TYPE rsdri_t_range.
  DATA: l_output_table TYPE TABLE OF /BIC/ASPOXXX0000.
*+----------------------------------------------------------------------
* Create the filter (becomes a dynamic WHERE clause in the DSO case)
  CLEAR l_s_filter.
  l_s_filter-chanm = 'FISCPER'.        "Field name without the leading '0'
* l_s_filter-sign = rs_c_range_sign-including.
* The SIGN field MUST NOT be filled, because the DSO logic (dynamic WHERE
* clause) differs from the InfoCube version.
  l_s_filter-compop = '>='.
  l_s_filter-low = `'2013001'`.        "The value must keep its quotes, since
                                       "it is concatenated into the WHERE clause
* If more than one filter line is used, the field HIGH of the preceding
* line MUST contain the logical operator that joins the conditions:
  l_s_filter-high = 'AND'.
  APPEND l_s_filter TO l_t_filter.
  CLEAR: l_s_filter.
  l_s_filter-chanm = 'FISCPER'.        "Field name without the leading '0'
* l_s_filter-sign = rs_c_range_sign-including.
* The SIGN field MUST NOT be filled, because the DSO logic (dynamic WHERE
* clause) differs from the InfoCube version.
  l_s_filter-compop = '<='.
  l_s_filter-low = `'2013006'`.        "The value must keep its quotes, since
                                       "it is concatenated into the WHERE clause
* l_s_filter-high = 'AND'.             "Not needed for the last filter line
  APPEND l_s_filter TO l_t_filter.
  CLEAR: l_s_filter.
* Instantiate the class through its CONSTRUCTOR
*+----------------------------------------------------------------------
*+------------>
  CREATE OBJECT o_rsspo_infoprov
    EXPORTING
      i_infoprov = 'SPOXXX'.
* CALL METHOD
*+----------------------------------------------------------------------
  CALL METHOD o_rsspo_infoprov->read
    EXPORTING
      i_lpox        = 'SPOXXX'         "Name of the SPO
      i_objversx    = 'A'
      i_iobjnmx     = '0FISCPER'       "InfoObject that identifies the partition
      i_iobj_valuex = '2013008'        "Value used to determine the partition
      i_t_filter    = l_t_filter       "Filter table entries
    IMPORTING
      e_data        = r_data.
* Assign the exported data reference to the output table
*+----------------------------------------------------------------------
  ASSIGN r_data->* TO <fs>.
  l_output_table = <fs>.
*+------------> | Do your logic...
*+----------------------------------------------------------------------
* End of coding <-
*+----------------------------------------------------------------------
6.2 Calling the method to read data from a Semantic Partitioning Object based on an InfoCube
** Declaration of dynamic tables, structures and lines
*+----------------------------------------------------------------------
  DATA: r_data TYPE REF TO data.
  FIELD-SYMBOLS: <fs> TYPE STANDARD TABLE.
*+----------------------------------------------------------------------
** Internal tables & work areas
*+----------------------------------------------------------------------
  DATA: o_rsspo_infoprov TYPE REF TO zcl_rsspo_infoprov.
  DATA: l_s_filter TYPE rsdri_s_range,
        l_t_filter TYPE rsdri_t_range.
  DATA: l_output_table TYPE TABLE OF /BIC/VSPOXXX00J.
*+----------------------------------------------------------------------
* Create the range filter (standard RSDRI range logic for the InfoCube case)
  CLEAR l_s_filter.
  l_s_filter-chanm  = '0FISCPER'.
  l_s_filter-sign   = rs_c_range_sign-including.
  l_s_filter-compop = rs_c_range_opt-equal.
  l_s_filter-low    = '2013011'.
  APPEND l_s_filter TO l_t_filter.
  CLEAR: l_s_filter.
* Instantiate the class through its CONSTRUCTOR
*+----------------------------------------------------------------------
*+------------>
  CREATE OBJECT o_rsspo_infoprov
    EXPORTING
      i_infoprov = 'SPOXXX'.
* CALL METHOD
*+----------------------------------------------------------------------
  CALL METHOD o_rsspo_infoprov->read
    EXPORTING
      i_lpox        = 'SPOXXX'         "Name of the SPO
      i_objversx    = 'A'              "Object version (use the active version 'A')
      i_iobjnmx     = '0COMP_CODE'     "InfoObject that identifies the partition
      i_iobj_valuex = 'SBSP'           "Value used to determine the partition
      i_t_filter    = l_t_filter       "Filter table entries
    IMPORTING
      e_data        = r_data.
* Assign the exported data reference to the output table
*+----------------------------------------------------------------------
  ASSIGN r_data->* TO <fs>.
  l_output_table = <fs>.
*+------------> | Do your logic...
*+----------------------------------------------------------------------
* End of coding <-
*+----------------------------------------------------------------------
7. Related Content
Creating a Semantically Partitioned Object
BW on HANA and Very Large Tables
Disclaimer and Liability Notice
This document may discuss sample coding or other information that does not include SAP official interfaces and therefore is not supported by SAP. Changes made based on this information are not supported and can be overwritten during an upgrade.
SAP will not be held liable for any damages caused by using or misusing the information, code or methods suggested in this document, and anyone using these methods does so at his/her own risk.
SAP offers no guarantees and assumes no responsibility or liability of any type with respect to the content of this technical article or code sample, including any liability resulting from incompatibility between the content within this document and the materials and services offered by SAP. You agree that you will not hold, or seek to hold, SAP responsible or liable with respect to the content of this document.