jacekklatt
Product and Topic Expert
Note: This blog complements the main blog on SAP Business Technology Platform – Software Lifecycle with elements of Continuous Integration, Deliv....
The main blog provides a good starting point and it also includes a video showing an end-to-end scenario using the integration aspects discussed here.
This and the associated blogs are the result of a collaboration with swiftc.

Introduction


In the main blog we described our overall solution, the design and build artefacts involved, and the tooling we chose to orchestrate delivery of those artefacts to the Production environment. Refer to the overview blog for details (including an important analysis of interoperability between tools) - here is a reminder of what our DevOps flow looks like:


Automation flow


In this blog we want to share details of the integration between the tools, artefacts, and solutions (end points) involved. While we are covering some well-travelled ground, we wanted to provide an end-to-end view of how we built our scenario. Having said that, where our approach is identical to one already well described elsewhere, we refer to the existing material.

This blog only deals with the “HOW?” – if you are interested in the “WHY?” (why these tools, why these integrations etc.), please refer to the parent blog.

This blog is divided into sections dealing with the integration of:

  • Development tools and source code management
  • Source code management and automation pipeline
  • Automation pipeline and delivery management
  • Automation pipeline and end point
  • Delivery management and end point

Development tools and source code management


SAP Business Application Studio and GitHub


This integration is well covered in the:

In our case we used GitHub.com as our git repository.

Visual Studio Code for SAP Data Intelligence and GitHub


In building our integration, we used material provided in the:

The video below provides step-by-step instructions on how the integration is set up and what it looks like in practice.



Also refer to the related video in the section below - Project “Piper” and SAP Data Intelligence Cloud - which takes the story further and shows the integration with the build server and automation pipeline.

Source code management and automation pipeline


GitHub and Project “Piper”


The integration of GitHub and the build server (in our case Jenkins with Project “Piper” libraries) is well documented in https://plugins.jenkins.io/git/. You can also see an example of integrating a build server job with a git repository in the video in the section below - Project “Piper” and SAP Data Intelligence Cloud.

Automation pipeline and delivery management


Project “Piper” and SAP Cloud Transport Management


Project “Piper” provides built-in integration with SAP Cloud Transport Management service – the process is well documented in:

SAP Cloud TMS offers several APIs, documented in the SAP API Business Hub, for example (only a sample subset listed):

  • /files/upload
    Uploads a file (application or content archive) to Cloud Transport Management and returns a file ID. You can reference the file ID in Node Export and Node Upload requests.

  • /nodes/export
    Creates a transport request with content specified by the File Upload operation or by an application-specific reference. The export node is identified by its name. The transport request is added to the queues of the follow-on nodes of the export node.

  • /nodes/upload
    Creates a transport request with content specified by the File Upload operation or by an application-specific reference.
    The transport request is added to the queue of the upload node.

  • /nodes/{nodeId}/transportRequests/import
    Imports the transport requests specified in the request body.

  • /nodes/{nodeId}/transportRequests/importAll
    Imports all importable transport requests of a node.


The built-in integration provided with Project “Piper” implements only the tmsUpload method (equivalent of the “/nodes/upload” API above), which uploads the solution package to the import buffer of an SAP Cloud Transport Management service (CTMS) node. The actual import into the target system/tenant must be controlled via CTMS. Alternatively, additional methods could be custom-implemented on the build server using the published APIs.
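
To illustrate what such a custom implementation could look like, here is a minimal sketch using curl. The service URL, OAuth token URL, credentials, node name, and node ID are placeholders that would normally come from the CTMS service key and the CI/CD credential store; the exact paths, request bodies, and response field names should be checked against the API specification on SAP API Business Hub.

#!/bin/bash
# Minimal sketch of calling CTMS APIs from a custom build-server step (placeholder values).

# 1. Obtain an OAuth token using the client-credentials flow
TOKEN=$(curl -s -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=client_credentials" "$TOKEN_URL/oauth/token" \
  | python3 -c "import json,sys; print(json.load(sys.stdin)['access_token'])")

# 2. Upload the MTA archive to CTMS and capture the returned file ID
FILE_ID=$(curl -s -X POST "$TMS_URL/files/upload" \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@gen/archive.mtar" \
  | python3 -c "import json,sys; print(json.load(sys.stdin)['fileId'])")

# 3. Create a transport request in the queue of the upload node
curl -s -X POST "$TMS_URL/nodes/upload" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"nodeName\": \"$NODE_NAME\", \"contentType\": \"MTA\", \"storageType\": \"FILE\", \"entries\": [{\"uri\": \"$FILE_ID\"}]}"

# 4. Optionally, trigger import of all importable requests on a follow-on node
curl -s -X POST "$TMS_URL/nodes/$NODE_ID/transportRequests/importAll" \
  -H "Authorization: Bearer $TOKEN"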


Automation pipeline and end point


Project “Piper” and SAP HANA Cloud or SAP BTP, Cloud Foundry runtime


In our scenario we are using the Cloud Application Programming Model to build and deploy SAP HANA Cloud artefacts and an application for the SAP BTP, Cloud Foundry runtime. Project “Piper” supports build and deployment automation in such a scenario, as per Build and Deploy SAP Cloud Application Programming Model Applications.

In this integration model, instructions for the build server are not preconfigured in the build job (as is the case in our example integration with SAP Data Intelligence – refer to the next section), but rather provided within the CAP project.

These instructions are generated by the @sap/cds-dk command-line interface (CLI) (or taken from an existing project/template), specifically using:
cds add pipeline

which will generate two artefacts - Jenkinsfile and .pipeline/config.yml.
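
For reference, a minimal command sequence that produces these files could look as follows (a sketch; the project folder name is a placeholder, and exact behaviour may differ between @sap/cds-dk versions):

# Install the CAP command-line tooling
npm install -g @sap/cds-dk

# Inside the CAP project, generate the CI/CD artefacts
cd my-cap-project     # placeholder for your project folder
cds add pipeline      # creates Jenkinsfile and .pipeline/config.yml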

Let’s have a look at their content in our project.

Jenkinsfile:
@Library('piper-lib-os') _
node() {
    stage('prepare') {
        cleanWs()
        checkout scm
        setupCommonPipelineEnvironment script: this
    }
    stage('build') {
        mtaBuild script: this
    }
    stage('deploy') {
        cloudFoundryDeploy script: this
    }
    stage('tmsUpload') {
        tmsUpload script: this
    }
}

The stages defined in this file will be executed by the build server, as illustrated in the build log:


Build log - stages


What happens in each stage:

  • prepare

    • cleanWs()
      Clean-up of the working directory to remove any files related to previous executions – code as well as build artefacts (like MTA archive files). This function is provided by the Workspace Cleanup plugin for Jenkins (which needs to be installed).
      Alternatively, as per the example, the following function could be used (as documented):
      deleteDir()

    • checkout scm
      Checkout code from source control; scm is a special variable which instructs the checkout step to clone the specific revision which triggered this Pipeline run.

    • setupCommonPipelineEnvironment
      As per documentation:
      Initializes the commonPipelineEnvironment, which is used throughout the complete pipeline.




This step needs to run at the beginning of a pipeline right after the SCM checkout. Then subsequent pipeline steps consume the information from commonPipelineEnvironment; it does not need to be passed to pipeline steps explicitly.

  • build

    • mtaBuild
      As per documentation:
      Executes the SAP Multitarget Application Archive Builder to create an mtar archive of the application.



  • deploy

    • cloudFoundryDeploy
      As per documentation:
      Deploys an application to a test or production space within Cloud Foundry.
      We recommend to define values of step parameters via config.yml file. In this case, calling the step is reduced to one simple line.



  • tmsUpload

    • tmsUpload
      As per documentation:
      This step allows you to upload an MTA file (multi-target application archive) and multiple MTA extension descriptors into a TMS (SAP Cloud Platform Transport Management Service) landscape for further TMS-controlled distribution through a TMS-configured landscape.

.pipeline/config.yml
steps:
  mtaBuild:
    buildTarget: 'CF'
  cloudFoundryDeploy:
    deployTool: 'mtaDeployPlugin'
    deployType: 'standard'
    cloudFoundry:
      org: 'CF-ORG-XXXXXXXX'
      space: 'CF-SPACE-XXXXXX'
      credentialsId: 'BTP_CREDS'
      database_id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
  tmsUpload:
    credentialsId: 'BTP_TMS'
    nodeName: 'XXXXXXXXX'
    verbose: 'true'

This file provides additional parameters for execution of steps from Jenkinsfile, for example:

  • mtaBuild

    • buildTarget: 'CF'
      Specifies Cloud Foundry as the target platform where the built archive will be deployed.



  • cloudFoundryDeploy:

    • deployTool: 'mtaDeployPlugin'

    • deployType: 'standard'

    • cloudFoundry:

      • org: '<Cloud Foundry org>'

      • space: '<Cloud Foundry space>'

      • credentialsId: '<credentials used to perform cf deploy command>'

      • database_id: <SAP HANA Cloud database/instance identifier>



  • tmsUpload:

    • credentialsId: '<credentials used to call CTMS APIs>'

    • nodeName: '<name of the transport node in CTMS>'

    • verbose: 'true'
      Print more detailed information into the log.
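
Under the hood, deployType 'standard' with the mtaDeployPlugin roughly corresponds to a deployment via the Cloud Foundry CLI and its MultiApps plugin. A rough manual equivalent looks as follows (a sketch; API endpoint, org, space, and archive name are placeholders):

# Log in to the Cloud Foundry API endpoint of the subaccount (placeholder values)
cf login -a https://api.cf.<region>.hana.ondemand.com \
         -o CF-ORG-XXXXXXXX -s CF-SPACE-XXXXXX

# Deploy the MTA archive produced by the mtaBuild stage
# (requires the MultiApps CF CLI plugin; archive name depends on your project)
cf deploy my-project_1.0.0.mtar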






The video provided in the section Project “Piper” and SAP Cloud Transport Management above provides a walkthrough of the integration aspects and the automation process.

Project “Piper” and SAP Data Intelligence Cloud


The process is well described in Continuous Integration and Delivery with SAP Data Intelligence and illustrated in Git Integration and CI/CD Process with SAP Data Intelligence – we have borrowed the diagram below from there:


Git integration and CI/CD process with SAP DI


The 3 key steps in the automation portion are:

  • Step#4: Bundle solution

    • Extract solution name from manifest file
      Read the manifest file provided in the associated git repository and extract the name and version of the DI solution.

    • Prepare solution package
      Create the package by using the OS zip utility on content extracted from the git repository.



  • Step#5: Install solution

    • Identify the current strategy configured for the DI tenant.

    • Check if the solution to be imported already exists in the tenant’s default strategy – if yes, delete it.

    • Upload the solution to the DI tenant.

    • Add the solution to the current default strategy of the DI tenant.



  • Step#6: Test solution

    • Execute defined test pipeline.

    • Collect and check results.




The bundle and install solution steps use the tool described in the System Management Command-Line Client Reference for SAP Data Intelligence (aka vctl), called in the OS shell of the container running the CI/CD server (Jenkins in our case) with the vctl tool deployed. Refer to the blog Project “Piper” – Jenkins on Docker for details on how we did the Docker image packaging and deployment.

We have introduced one improvement in this process – based on new capabilities in the vctl tool we modified the bundle solution step as follows (original script, lines 35-44):
# Zip folder in temp space and move back
tmp_dir=$(mktemp -d -t vsol-XXXXXXXXXX)
mkdir -p "$tmp_dir/content/files/"
cp -R vflow "$tmp_dir/content/files/"
cp manifest.json "$tmp_dir/"
pushd . > /dev/null
cd $tmp_dir && zip -r $PACKAGE_NAME content/ manifest.json
popd > /dev/null
mv $tmp_dir/$PACKAGE_NAME .
echo "Created package '$PACKAGE_NAME'"

The above portion of the script prepares the relevant files of the SAP DI solution (as sourced from the git repository) and puts them into a single ZIP archive using the OS-level zip tool. This approach requires the inclusion of certain installation packages when preparing the CI/CD server image – as per Continuous Integration and Delivery with SAP Data Intelligence:

Dockerized Jenkins (for Testing)
For testing your build jobs, the following Docker container will provide a Jenkins server with all needed dependencies. You will need to have the vctl binary (linux) in the docker build path.
FROM jenkinsci/blueocean

USER root

RUN apk update \
&& apk add --no-cache linux-headers \
&& apk add --no-cache build-base \
&& apk add --no-cache zip \
&& apk add --no-cache python

COPY vctl /root/vctl

RUN chmod 777 /root
ENV PATH="/root:${PATH}"
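
For a quick local test of such an image, standard Docker commands are sufficient (a sketch; the image tag is our own choice and persistent Jenkins storage is omitted):

# Build the image – the vctl linux binary must be present in the Docker build path
docker build -t jenkins-blueocean-vctl .

# Start a throw-away Jenkins instance for testing the build jobs
docker run --rm -p 8080:8080 -p 50000:50000 jenkins-blueocean-vctl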

As stated, the above approach uses the zip utility to prepare the solution package. Conveniently, the vctl tool offers a solution bundle feature:

vctl solution bundle
Synopsis
Bundle a solution located at <source-path> to a zip file.
vctl solution bundle <source-path> [flags]
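
For example, bundling a solution from its root folder into a target archive (the -t flag for the output file is used in the same way in our modified script below):

# Bundle the solution in the current directory into a zip archive
vctl solution bundle . -t my-solution-1.0.0.zip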


We decided to use this feature rather than the OS-level zip utility, which allowed us to change the bundle solution step as follows:
#!/bin/bash
#
# //////////////////////////////////////////////////////////////////////////////
# Preparing package of SAP Data Intelligence Pipeline Solutions
#
# This script bundles files of a solution that are tracked in
# a Git project into a vsolution package.
#
# Note: Solutions tracked in Git by following the "Git Workflow" guide
# have a slightly different structure than the native solution structure.
# This script re-organizes the file structure in a temporary folder before
# packaging the files. See the "Git Workflow" guide for more information.
#
# //////////////////////////////////////////////////////////////////////////////

echo "---------------------"
echo "@@ Bundle Solution @@"
echo "---------------------"

if [ ! -f "manifest.json" ] || [ ! -d "vflow" ]; then
echo "Error: Current path seems not to be a vsolution pipeline project (missing 'manifest.json' file or 'vflow' folder)"
exit 1
fi

BASE_DIR=$(pwd)

# Extract solution name from manifest file.
# Specify the relevant name of the Python executable – as per your CI/CD Docker image
DI_SOLUTION_NAME=$(python3 -c \
"import json; o = json.load(open(\"manifest.json\")); print(o['name'])") \
|| exit 1
DI_SOLUTION_VERSION=$(python3 -c \
"import json; o = json.load(open(\"manifest.json\")); print(o['version'])") \
|| exit 1
DI_SOLUTION="$DI_SOLUTION_NAME-$DI_SOLUTION_VERSION"
PACKAGE_NAME="$DI_SOLUTION.zip"

# Additional logging in the build trace
echo "BASE_DIR:" $BASE_DIR
echo "DI_SOLUTION_NAME:" $DI_SOLUTION_NAME
echo "DI_SOLUTION_VERSION:" $DI_SOLUTION_VERSION
echo "DI_SOLUTION:" $DI_SOLUTION
echo "PACKAGE_NAME:" $PACKAGE_NAME

# Setting temporary directory for solution bundle archive
tmp_dir=$(mktemp -d -t vsol-XXXXXXXXXX)
mkdir -p "$tmp_dir/content/files/"
cp -R vflow "$tmp_dir/content/files/"
cp manifest.json "$tmp_dir/"
pushd . > /dev/null
cd $tmp_dir
# Rather than using zip utility, we use native vctl “solution bundle” feature
vctl solution bundle . -t "$BASE_DIR/$PACKAGE_NAME"
cd "$BASE_DIR"

echo "Created package '$BASE_DIR/$PACKAGE_NAME'."

Also, since we no longer require the OS-level zip utility, we could simplify our Docker image – refer to the blog Project “Piper” – Jenkins on Docker for details.

As a follow-on effect, and also to facilitate checking of the package in case of issues during deployment, we additionally introduced small changes to the install solution step – refer to the code below, which we adapted for our needs:
#!/bin/bash

# Rather than storing credentials in the shell script, use CI/CD credentials
DI_PRD_URL='<HTTPs access point for your SAP DI tenant>'
DI_PRD_TENANT='<your SAP DI tenant name>'
#DI_PRD_USER='<<< specify user to be used for DI operations >>>'
#DI_PRD_PASSWORD='<<< specify password for DI_PRD_USER >>>'

echo "-------------------------"
echo "@@ Installing Solution @@"
echo "-------------------------"

# Prevent bash evaluation of single quoted arguments
# i.e., to prevent evaluation of $ characters in passwords
function CIRUN {
    arr=("$@")
    #echo "${arr[*]}"
    "${arr[@]}"
    return $?
}

if [ ! -f "manifest.json" ]; then
echo "Error: Current path seems not to be a vsolution pipeline project (missing 'manifest.json' file)"
exit 1
fi

# Extract solution name from manifest file.
# Specify the relevant name of the Python executable – as per your CI/CD Docker image
DI_SOLUTION_NAME=$(python3 -c \
"import json; o = json.load(open(\"manifest.json\")); print(o['name'])") \
|| exit 1
DI_SOLUTION_VERSION=$(python3 -c \
"import json; o = json.load(open(\"manifest.json\")); print(o['version'])") \
|| exit 1
DI_SOLUTION="$DI_SOLUTION_NAME-$DI_SOLUTION_VERSION"
PACKAGE_NAME="$DI_SOLUTION.zip"

if [ ! -f "$PACKAGE_NAME" ]; then
echo "Error: No solution bundle for solution '$DI_SOLUTION' exists. Please bundle the solution first!"
exit 1
fi

echo "- Sys login... "

CIRUN vctl login $DI_PRD_URL $DI_PRD_TENANT $DI_PRD_USER -p $DI_PRD_PASSWORD || exit 1

STRATEGY=$(vctl tenant get-strategy $DI_PRD_TENANT | head -n 1 | xargs)
echo "- Tenant '$DI_PRD_TENANT' is using strategy '$STRATEGY'"
echo "- Stopping pipeline modeler..."
vctl scheduler stop pipeline-modeler
#vctl scheduler stop --all

EXISTING_SOLUTION=$(vctl strategy get $STRATEGY -o json | python3 -c "import json, sys; o = json.load(sys.stdin); print('\n'.join([s for s in o[u'layers']]))" | grep $DI_SOLUTION)
if [ "$EXISTING_SOLUTION" != "" ]; then
echo "- Removing existing solution (maybe old version) '$EXISTING_SOLUTION' from strategy '$STRATEGY'..."
vctl strategy remove $STRATEGY $EXISTING_SOLUTION
fi

echo "- Removing solution '$DI_SOLUTION' from repository (if exists)..."
vctl solution get $DI_SOLUTION_NAME $DI_SOLUTION_VERSION && vctl solution delete $DI_SOLUTION_NAME $DI_SOLUTION_VERSION

echo "- Uploading solution '$DI_SOLUTION' to repository..."
vctl solution upload $DI_SOLUTION.zip

echo "- Adding solution '$DI_SOLUTION' to strategy '$STRATEGY'..."
vctl strategy add $STRATEGY $DI_SOLUTION

# Clean up the directory for subsequent pipeline executions.
# You can copy the content elsewhere if you wish to retain the content for checking.
BASE_DIR=$(pwd)
echo "- Cleaning up working directory '$BASE_DIR'... just to be on a safe side."
rm -rf "$BASE_DIR"

Please check the video below detailing how the CI/CD pipeline has been set up and configured.

https://sapvideoa35699dc5.hana.ondemand.com/?entry_id=1_cayouo5n

Delivery management and end point


SAP Cloud Transport Management and SAP HANA Cloud or SAP BTP, Cloud Foundry runtime


In the case of SAP HANA Cloud or SAP BTP, Cloud Foundry runtime, deployment of design-time artefacts is done using Multitarget Application archives (MTAs). SAP Cloud Transport Management supports delivery of MTA entities as standard, and the configuration is well described in the documentation.

In the video published in the section above (Project “Piper” and SAP Cloud Transport Management), I’ve shown elements of the configuration of the SAP CTMS and use of its APIs.