-----------------------------------------------------------------------------------------------------------------------------------
Cross container access
-----------------------------------------------------------------------------------------------------------------------------------
1. When the source and target HDI containers exist on the same HANA server -

A. Create a service key in the source DB and, in the target DB, mention the key and the source HDI service in a service of type -
type: com.sap.xs.hdi-container

B. Create two .hdbrole files - one for the object_owner (role name ending with #, with grant option) and one for the application_user.
C. Create a .hdbgrants file using these roles in the cfg folder.
D. Create a synonym and mention the source object that needs to be consumed in the target.
E. Create a .hdbsynonymconfig file. It uses the same syntax as the synonym, only the grantor is added (see the sketch after this list).
F. Build the project; the source DB objects referenced in the synonyms can now be used in the target DB.
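
A minimal sketch of the synonym artifacts for this case. The synonym name SRC_EMPLOYEE, the source object EMPLOYEE, and the grantor service name source-hdi-container are illustrative assumptions, not names from a real project:

src_employee.hdbsynonym (in the src folder)
{
  "SRC_EMPLOYEE": {
    "target": {
      "object": "EMPLOYEE"
    }
  }
}

src_employee.hdbsynonymconfig (in the cfg folder - same syntax, plus the grantor service that exposes the source container)
{
  "SRC_EMPLOYEE": {
    "target": {
      "object": "EMPLOYEE",
      "grantor": "source-hdi-container"
    }
  }
}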

2. When the source and target HDI containers exist on different HANA servers -

a. Add an external SAP HANA service to the MTA project for the source HDI container.
b. Create a service key in the source DB and, in the target DB, mention the key and the source HDI service in the service. Service type -
  type: org.cloudfoundry.existing-service

 

Note: For an external SAP HANA service you have to provide the parameter key and value as well as the property key and value. The property key takes its value from the target service key.

 

---------------------------------------------------------------------------------------------------------------------------------------
-------------------------Syntax Difference in MTA --------------------------------------------------------

Cross-server case (existing service) -

parameters:
  service-name: pp.purmdb-container        # container name
properties:
  cross-service-service: '${service-name}'
type: org.cloudfoundry.existing-service    # type is different

Earlier case, within the same HANA server (HDI container) -

properties:
  user-container-name: '${service-name}'
type: com.sap.xs.hdi-container

----------------------------------------------------------------------------------------------------------
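
Putting the two styles together, a minimal sketch of how the corresponding resources and the requiring db module could look in mta.yaml. The resource and module names (source-hdi-container, cross-server-source, db) are illustrative:

resources:
  # Source HDI container on the same HANA server
  - name: source-hdi-container
    type: com.sap.xs.hdi-container
    properties:
      user-container-name: '${service-name}'

  # Source container on a different HANA server, bound as an existing service
  - name: cross-server-source
    type: org.cloudfoundry.existing-service
    parameters:
      service-name: pp.purmdb-container
    properties:
      cross-service-service: '${service-name}'

modules:
  - name: db
    type: hdb
    path: db
    requires:
      - name: source-hdi-container
      - name: cross-server-source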

 

c. Create two .hdbrole files - one for the object owner (role name ending with #, with grant option) and one for the application user.
d. Create a .hdbgrants file using these roles (a sketch follows after this list).
e. Create a synonym and mention the source object that needs to be consumed in the target.
f. Create a .hdbsynonymconfig file. It uses the same syntax as the synonym, only the grantor is added.
g. Build the project; the source DB objects referenced in the synonyms can now be used in the target DB.
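
A minimal sketch of such a .hdbgrants file, assuming the remote source is bound through a service named cross-server-source (as in the mta.yaml sketch above) and the roles are the ones shown further below (role1# for the object owner, role for the application user):

cross-server-source.hdbgrants (in the cfg folder)
{
  "cross-server-source": {
    "object_owner": {
      "container_roles": ["role1#"]
    },
    "application_user": {
      "container_roles": ["role"]
    }
  }
}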

 

----------------------------------------------------------------------------------------------------------------------------------

3. When the source and target HDI containers exist on different HANA servers and the source is a classic schema -
Same steps as above, with only a few changes (a sketch of the changed artifacts follows below) -
Change-1:
In the mta.yaml file:

parameters:
  service-name: plb_schema                 # schema name
properties:
  cross-schema-service: '${service-name}'
type: org.cloudfoundry.existing-service

Change-2: In the .hdbsynonym file, "schema": "SFLIGHT" needs to be added along with the object.

Change-3: No .hdbsynonymconfig is needed.

Change-4: In the .hdbgrants file, define schema/classic roles instead of container roles.
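
A minimal sketch of the classic-schema variants, assuming the grants service is named cross-schema-service (from Change-1 above), the source schema is SFLIGHT, and MY_ROLE_WITH_GRANT / MY_ROLE are classic catalog roles created in the source system - all of these names are illustrative:

sflight.hdbsynonym
{
  "SRC_SFLIGHT": {
    "target": {
      "object": "SFLIGHT",
      "schema": "SFLIGHT"
    }
  }
}

cross-schema-service.hdbgrants
{
  "cross-schema-service": {
    "object_owner": {
      "roles": ["MY_ROLE_WITH_GRANT"]
    },
    "application_user": {
      "roles": ["MY_ROLE"]
    }
  }
}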




 

-------------------------------------------------------------------------------------------------------------------------------------
.hdbrole files
-------------------------------------------------------------------------------------------------------------------------------------

 
-------------------------------------------------------------------------------------------------------

role1#.hdbrole
-------------------------------------------------------------------------------------------------------
{
  "role": {
    "name": "role1#",
    "object_privileges": [
      {
        "name": "TRGT_DB",
        "type": "TABLE",
        "privileges_with_grant_option": ["SELECT"]
      }
    ]
  }
}

-------------------------------------------------------------------------------------------------------
role.hdbrole
-------------------------------------------------------------------------------------------------------
{
  "role": {
    "name": "role",
    "schema_privileges": [
      {
        "privileges": ["SELECT METADATA"]
      }
    ]
  }
}


-------------------------------------------------------------------------------------------------------

 

#*************** CDS Artifacts to create tables***************************************


 
CDS Artifacts
-------------------------------------------------------------------------------------------------------
** The CDS artifacts below will create the following database tables ****************************
namespace data.model;

context TireShop {

  entity Tire {
    key id   : Integer;
    Name     : String(100);
    Stock    : Integer;
    DealerID : Integer;
    Dealer   : Association to one TireDealer;  // association to the dealer entity
  };

  entity TireDealer {
    key DealerID : Integer;
    DealerName   : String(111);
  };

};
** The CDS artifacts above correspond to the following database tables ********************************
** Similar to the DDL for creating the two column tables **********************************************


CREATE COLUMN TABLE "data.model::TireShop.Tire" (
  ID       INTEGER,
  NAME     NVARCHAR(100),
  Stock    INTEGER,
  DealerID INTEGER,
  PRIMARY KEY (ID)
);

CREATE COLUMN TABLE "data.model::TireShop.TireDealer" (
  DealerID   INTEGER,
  DealerName NVARCHAR(111),
  PRIMARY KEY (DealerID)
);



 

# Sample .hdbtabledata file to load data into a table from a CSV file (e.g. exported from Excel) *************************************
-------------------------------------------------------------------------------------------------------
HDB Table
.hdbtabledata file
-------------------------------------------------------------------------------------------------------

{
  "format_version": 1,
  "imports": [{
    "target_table": "PLB_PPROJECT.DB_PLB::EMPLOYEE",
    "source_data": {
      "data_type": "CSV",
      "file_name": "PLB_PPROJECT.DB_PLB::EMPLOYEE.CSV",
      "has_header": true,
      "no_data_import": false,
      "delete_existing_foreign_data": false,
      "dialect": "HANA",
      "type_config": {
        "delimiter": ","
      }
    },
    "import_settings": {
      "import_columns": [
        "EMP_ID",
        "EMP_NAME",
        "EMAIL_ID",
        "ADDRESS"
      ],
      "include_filter": []
    },
    "column_mappings": {
      "EMP_ID": 1,
      "EMP_NAME": 2,
      "EMAIL_ID": 3,
      "ADDRESS": 4
    }
  }]
}
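
For reference, the CSV file described above (comma-delimited, with a header row matching the import columns) could look like this - the two data rows are purely illustrative:

PLB_PPROJECT.DB_PLB::EMPLOYEE.CSV
EMP_ID,EMP_NAME,EMAIL_ID,ADDRESS
101,John Doe,john.doe@example.com,Bangalore
102,Jane Roe,jane.roe@example.com,Kolkata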

 

 

 