Laszlo Kajan

GitLab MTA Build Pipeline with Focused Build and SCP TMS Integration

tl;dr

  • Use two pipelines linked via a trigger token in GitLab to safeguard secrets such as passwords and service tokens.
  • Almost all needed APIs are available to the pipeline, except for assigning transport requests to, and decoupling them from, work items. Automating these actions with puppeteer turns out to be surprisingly smooth.

Goal

Given:

When:

  • Developer pushes new commit to GitLab

Then:

  • Perform (automated) continuous integration and continuous deployment to development tier
  • Use a standardised, configurable, secure build pipeline
  • Track life cycle, and control transportation of MTAR build artefact using Focused Build

[Figure: Development sub-account, GitLab, TMS and FB]

Logical Pipeline

Background:

  • Control of MTAR transportation by Focused Build is implemented via the SCP Transport Management Service (TMS)
  • Transportation to further tiers is done manually through FB work item (WI) state transitions
  • Focused Build is used for on-premise S/4HANA configuration and development, likewise on a 4-tier landscape

Abbreviations

FB Focused Build for SAP Solution Manager
MBT Cloud MTA Build Tool
MTA Multi-target Application (aka. ‘Multitarget’ Application)
MTAR Multitarget Application Archive
SCP SAP Cloud Platform
SolMan SAP Solution Manager
TMS, TM Service SAP Cloud Platform Transport Management Service
WI Work Item (in Focused Build)

Challenges

  • SolMan and TM Service credentials must be inaccessible to developers
  • Build pipeline must be configurable within controlled limits
  • Continuous integration and deployment pipeline must be fully automated, in spite of the lack of certain Focused Build APIs (as of 10-Nov-2020)

Solution

In order to preserve the confidentiality of SolMan and TMS credentials and provide a standardised, configurable build pipeline, the logical pipeline is implemented using two GitLab pipelines: the ‘trigger’ pipeline, and the ‘standard’ pipeline.

  • The ‘trigger’ pipeline is activated by commits to the SCP application under development, and triggers the ‘standard’ pipeline. This pipeline and its variables are writable by application developers and maintainers, respectively.
  • The ‘standard’ pipeline builds, tests, deploys and registers (i.e. assigns) the build artefact. This pipeline and its variables are writable only by the pipeline developers and maintainers. Application developers can’t view or expose pipeline variables such as SolMan credentials and TMS keys.

[Figure: Logical Pipeline Implementation]

Pipeline Implementation

‘Trigger’ pipeline

Pipeline definition ‘.gitlab-ci.yml’ for MTA project under development:

variables:
  PROJECT_PATH: "$CI_PROJECT_PATH"

stages:
  - trigger

# https://docs.gitlab.com/ee/ci/triggers/README.html#trigger-token
# Note: latest commit of default branch of ${PROJECT_PATH} will be processed.
trigger mbt pipeline:
  stage: trigger
  rules:
    - if: '($CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH || $CI_COMMIT_BRANCH == "mbt-build") && $DEVOPS_MBT_PIPELINE_TRIGGER'
  script:
    - . .gitlab-ci-vars.sh
    - 'test "$WORK_ITEM"'
    - 'echo "WORK_ITEM=$WORK_ITEM"'
    - 'curl -X POST -F "token=${DEVOPS_MBT_PIPELINE_TRIGGER}" -F "ref=master"
      -F "variables[PROJECT_PATH]=${PROJECT_PATH}"
      https://***.com/api/v4/projects/4809/trigger/pipeline'

For ease of access from subsequent scripts, we set the work item number in a separate shell file, ‘.gitlab-ci-vars.sh’:

WORK_ITEM="3200000665"
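The ‘standard’ pipeline does not source this upstream file blindly: its ‘sanity check’ job (shown later) passes it through a sed filter, so that only a plain WORK_ITEM assignment survives. A minimal sketch of that filter (file names are illustrative):

```shell
#!/bin/sh
# Illustrative upstream vars file, containing one well-formed assignment
# and one injected command that must not survive the filter:
cat > upstream-vars.sh <<'EOF'
WORK_ITEM="3200000665"
echo "injected command; must not survive the filter"
EOF
# The 'sanity check' job's filter: keep only 'WORK_ITEM=<alnum>' lines.
sed -ne "/^WORK_ITEM=[\"']\?\([[:alnum:]]\+\)[\"']\?\$/{p;}" \
  upstream-vars.sh > .gitlab-ci-vars.sh
cat .gitlab-ci-vars.sh
```

Only the WORK_ITEM line is printed; sourcing the filtered file is therefore safe even if the upstream file was tampered with.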

Note the ‘DEVOPS_MBT_PIPELINE_TRIGGER’ variable above. This is a GitLab trigger token, obtained for the ‘standard’ pipeline. It is limited to pipeline triggering, and provides the solution to running pipelines without exposing (downstream) secrets or allowing modifications.

The above single-job pipeline will be triggered for every commit on the default, as well as the ‘mbt-build’ branch, as long as the pipeline trigger variable is defined, and will trigger the ‘standard’ pipeline with the GitLab project of the triggering pipeline.

Code to be deployed can be kept either on the default branch or on the ‘mbt-build’ branch. As shown below, if the ‘mbt-build’ branch exists, it – and not the default – is used by the ‘standard’ pipeline.

Note that if the trigger token is compromised, the worst that can happen is an unexpected re-build of the last commit.

Pipeline variables

Application project / Settings / ‘CI / CD’ / Variables:

DEVOPS_MBT_PIPELINE_TRIGGER trigger token for ‘standard’ pipeline

‘Standard’ pipeline

Pipeline definition ‘.gitlab-ci.yml’ for the ‘standard’ pipeline:

variables:
  ASSIGN_TRANSPORT_REQUEST_VERSION: 1.0.0
  DEVOPS_MBT_IMAGE: $CI_REGISTRY/***/scp/supporting-projects/devops-mbt-dockerfile:a0e78409c4d7538cf04f8591a0634db470cda729
  DEVOPS_PUPPETEER_IMAGE: $CI_REGISTRY/***/scp/supporting-projects/devops-puppeteer-dockerfile:f9eb7e51543e8150122de1c214e691dc7f0759f0
  DEVOPS_SCRIPTS_BRANCH: 1.2.1
  DEVOPS_SCRIPTS_PATH: '***/scp/supporting-projects/devops-scripts'
  DOCKER_DRIVER: overlay2

default:
  image: ${DEVOPS_MBT_IMAGE}
  interruptible: false

# https://docs.gitlab.com/ee/ci/yaml/#yaml-anchors-for-script
# Note: we can't move this to an include: aliases don't work across includes:
#   https://docs.gitlab.com/ee/ci/yaml/#anchors
#
# Sync with .gitlab-ci-test.yml
.clone_upstream: &clone_upstream
  - 'git clone -b master --depth 50 https://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}/${PROJECT_PATH}.git default'
  - 'cd default'
  # Is there an 'mbt-build' branch?
  - 'git checkout "mbt-build" || true'
  - 'git branch --show-current'
  - 'git rev-parse HEAD'
  - 'cd ..'

.clone_devops_scripts: &clone_devops_scripts
  - 'git clone -b "${DEVOPS_SCRIPTS_BRANCH}" --depth 50 https://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}/${DEVOPS_SCRIPTS_PATH}.git "$(basename ${DEVOPS_SCRIPTS_PATH})"'
  - 'export PATH="$PWD/$(basename ${DEVOPS_SCRIPTS_PATH}):$PATH"'

.if_ci_pipeline_triggered: &if_ci_pipeline_triggered
  if: '$CI_PIPELINE_TRIGGERED == "true"'

.if_not_ci_pipeline_triggered_then_never: &if_not_ci_pipeline_triggered_then_never
  if: '$CI_PIPELINE_TRIGGERED != "true"'
  when: never

# Triggered variables not showing when using workflow:rules
#   https://gitlab.com/gitlab-org/gitlab/-/issues/220793
#   * CI_PIPELINE_TRIGGERED bug
# kajanl: Especially because this bug keeps us from being able to see the triggering token, we can't use 'workflow'.
#   Copy the rules over to each job.
#workflow:
#  rules:
#    - if: '$CI_PIPELINE_SOURCE == "trigger" || $CI_PIPELINE_SOURCE == "api"'

stages:
  - prepare
  - build
  - test
  - deploy
  - assign

# Jobs
sanity check:
  stage: prepare
  needs: []
  rules:
    - *if_ci_pipeline_triggered
  script:
    # Sanity
    - 'test "${CI_COMMIT_REF_PROTECTED}" = "true"'
    - 'test "${PROJECT_PATH}"'
    - 'test "${SOLMAN_HOST}"'
    - 'test "${SOLMAN_PASS}"'
    - 'test "${SOLMAN_USER}"'
    - 'test "${TRANSPORT_SERVICE_INSTANCE_KEY}"'
    - *clone_upstream
    - sed -ne "/^WORK_ITEM=[\"']\?\([[:alnum:]]\+\)[\"']\?\$/{p;}" default/.gitlab-ci-vars.sh > ./.gitlab-ci-vars.sh
    # Find PROJECT_ID of PROJECT_PATH
    - 'unset PROJECT_ID'
    - 'PROJECT_PATH_URL=$(echo -n "${PROJECT_PATH}" | jq -sRr @uri)'
    - 'PROJECT_RESPONSE=$(curl --silent --show-error --header "PRIVATE-TOKEN: ${READ_API_PATOKEN}" "https://${CI_SERVER_HOST}/api/v4/projects/${PROJECT_PATH_URL}")'
    - 'if ! { PROJECT_ID=$(echo -n "$PROJECT_RESPONSE" | jq -er ".id"); }; then
        echo "$PROJECT_RESPONSE";
        exit 1;
      fi;'
    - 'echo "PROJECT_ID=\"$PROJECT_ID\"" >> .gitlab-ci-vars.sh'
    # .gitlab-ci-vars.sh
    - 'cat .gitlab-ci-vars.sh'
    - '. .gitlab-ci-vars.sh'
    - 'test "${WORK_ITEM}"'
    - 'test "${PROJECT_ID}"'
  artifacts:
    expire_in: 1 week
    paths:
      - default/
      - .gitlab-ci-vars.sh

# Tests Tests Tests Tests Tests Tests Tests Tests Tests Tests Tests Tests
#
# kajanl: Temporary implementation while child pipeline can't be used, see below.
.cd_default: &cd_default
  # kajanl: Unfortunately it's not possible to nest anchors.
  - 'cd default'
  - 'echo $PWD'

# kajanl: The problem here is that CODE_QUALITY_IMAGE from [1] tries to pull in codeclimate/codeclimate-structure,
#   and somehow hits the 'toomanyrequests' error because of Docker-in-Docker, in spite of the runners now being on the
#   enterprise license.
#   [1] https://gitlab.com/gitlab-org/gitlab/-/blob/v13.4.2-ee/lib/gitlab/ci/templates/Jobs/Code-Quality.gitlab-ci.yml
code_quality:
  needs:
    - job: 'sanity check'
  # kajanl: Still valid and needed in our version 13.4:
  dependencies:
    - 'sanity check'
  before_script:
    - *cd_default
    - export SOURCE_CODE="$PWD"
    # GitLab runners hit docker pull limit despite the authentication
    #   https://***.com/t/gitlab-runners-hit-docker-pull-limit-despite-the-authentication/5526
    - mkdir -p $HOME/.docker
    - echo "$DOCKER_AUTH_CONFIG" > $HOME/.docker/config.json
  rules:
    - *if_not_ci_pipeline_triggered_then_never
    - if: '$CODE_QUALITY_DISABLED'
      when: never
    - if: '$CI_COMMIT_TAG || $CI_COMMIT_BRANCH'

eslint-sast:
  needs:
    - job: 'sanity check'
  before_script:
    - *cd_default
    - if ! [ "$(find .
        -name '*.html' -o
        -name '*.js' -o
        -name '*.jsx' -o
        -name '*.ts' -o
        -name '*.tsx')" ]; then
        echo "Nothing to test."; exit 0;
      fi;
  rules:
    - *if_not_ci_pipeline_triggered_then_never
    - if: $SAST_DISABLED
      when: never
    - if: $CI_COMMIT_BRANCH &&
          $SAST_DEFAULT_ANALYZERS =~ /eslint/
      #exists: # kajanl: 'exists' can't be used: it checks existence in the mbt repo.
      #  - '**/*.html'
      #  - '**/*.js'
      #  - '**/*.jsx'
      #  - '**/*.ts'
      #  - '**/*.tsx'

nodejs-scan-sast:
  needs:
    - job: 'sanity check'
  before_script:
    - *cd_default
    - if ! [ "$(find .
        -name 'package.json')" ]; then
        echo "Nothing to test."; exit 0;
      fi;
  rules:
    - *if_not_ci_pipeline_triggered_then_never
    - if: $SAST_DISABLED
      when: never
    - if: $CI_COMMIT_BRANCH &&
          $SAST_DEFAULT_ANALYZERS =~ /nodejs-scan/
      #exists:
      #  - '**/package.json'

# Java
spotbugs-sast:
  needs:
    - job: 'sanity check'
  before_script:
    - *cd_default
    - if ! [ "$(find .
        -name '*.groovy' -o
        -name '*.java' -o
        -name '*.scala')" ]; then
        echo "Nothing to test."; exit 0;
      fi;
  rules:
    - *if_not_ci_pipeline_triggered_then_never
    - if: $SAST_DISABLED
      when: never
    - if: $CI_COMMIT_BRANCH &&
          $SAST_DEFAULT_ANALYZERS =~ /spotbugs/
      #exists:
      #  - '**/*.groovy'
      #  - '**/*.java'
      #  - '**/*.scala'

# kajanl: Switch this imported job off.
secret_detection_default_branch:
  needs:
    - job: 'sanity check'
  before_script:
    - *cd_default
  script:
    - /analyzer run
  rules:
    - when: never

secret_detection:
  needs:
    - job: 'sanity check'
  before_script:
    - *cd_default
  script:
    - /analyzer run
  rules:
    - *if_not_ci_pipeline_triggered_then_never
    - if: $SECRET_DETECTION_DISABLED
      when: never
    - if: $CI_COMMIT_BRANCH

# kajanl:
#   To do this properly, we may have to use the mta.yaml file, and go into each module found.
#   Could a dynamic child pipeline, defined based on the mta.yaml, be the answer?
#   
#   In our current version 13.4.2, test results that become artifacts /can't be accessed/ due to a bug:
#     #201784 Child pipeline artifacts cannot be downloaded via ref-based links
#
#     * Because of this, the tests are copied into the main pipeline here.
#
#test child pipeline:
#  stage: test
#  needs:
#    #- job: 'sanity check'
#    # test collector
#    - job: 'code_quality'
#    - job: 'eslint-sast'
#    - job: 'nodejs-scan-sast'
#    - job: 'secret_detection_default_branch'
#    - job: 'secret_detection'
#  rules:
#    - *if_ci_pipeline_triggered
#  #variables:
#  #  PROJECT_PATH: ${PROJECT_PATH}
#  #trigger:
#  #  include:
#  #    - local: .gitlab-ci-test.yml
#  #  strategy: depend
#
# Tests Tests Tests Tests Tests Tests Tests Tests Tests Tests Tests Tests

check WI in development:
  stage: test
  needs:
    - job: 'sanity check'
  rules:
    - *if_ci_pipeline_triggered
  script:
    - *clone_devops_scripts
    - '. .gitlab-ci-vars.sh'
    - 'work-item-is-in-development "${WORK_ITEM}" "${SOLMAN_HOST}"'

mbt build:
  stage: build
  needs:
    - job: 'sanity check'
  rules:
    - *if_ci_pipeline_triggered
  script:
    # source in upstream project variables
    - '. .gitlab-ci-vars.sh'
    # consider sourcing in some project script
    #- 'npm i -g @ui5/cli'
    - 'cd default'
    - 'mbt build'
# TESTING
#    - exit 1
  artifacts:
    paths:
      - 'default/mta_archives/'

deploy to TMService:
  stage: deploy
  needs:
    - job: 'check WI in development'
    - job: 'mbt build'
    - job: 'sanity check'
    #- job: 'test child pipeline'
  rules:
    - *if_ci_pipeline_triggered
  script:
    - *clone_devops_scripts
    - '. .gitlab-ci-vars.sh'
    - 'echo "TEST_SKIP_TMS_UPLOAD=${TEST_SKIP_TMS_UPLOAD}"'
    - 'if [ "${TEST_SKIP_TMS_UPLOAD}" ]; then
          echo "{\"transportRequestId\": ${TEST_SKIP_TMS_UPLOAD}}" > tms-upload-stdout.json;
        else
          tms-upload "$WORK_ITEM" "default/mta_archives" DCP 1 1 "${PROJECT_ID}" > tms-upload-stdout.json;
        fi;'
  artifacts:
    paths:
      - 'tms-upload-stdout.json'

# kajanl: It seems sometimes a search in SolMan doesn't yet see the new transport request.
#   Consider adding some wait time here.
assign to WI:
  stage: assign
  needs:
    - job: 'deploy to TMService'
    - job: 'sanity check'
  rules:
    - *if_ci_pipeline_triggered
  image: ${DEVOPS_PUPPETEER_IMAGE}
  script:
    - '. .gitlab-ci-vars.sh'
    - echo "@***:registry=https://${CI_SERVER_HOST}/api/v4/packages/npm/" >> ~/.npmrc
    - echo "//${CI_SERVER_HOST}/api/v4/packages/npm/:_authToken=${CI_JOB_TOKEN}" >> ~/.npmrc
    - "(cd / && npm i @***/assign-transport-request@${ASSIGN_TRANSPORT_REQUEST_VERSION});"
    - 'export PATH="/node_modules/.bin:$PATH"'
    - 'TRANSPORT_REQUEST_ID=$(jq ".transportRequestId" tms-upload-stdout.json)'
    - 'test "${TRANSPORT_REQUEST_ID}"'
    - 'echo ${TRANSPORT_REQUEST_ID} ${WORK_ITEM}'
    - 'assign-transport-request --solman-host "${SOLMAN_HOST}" -t "${TRANSPORT_REQUEST_ID}" -n "${WORK_ITEM}"'
  artifacts:
    when: on_failure # consider 'always'
    paths:
      - '**.png' # debugging screenshots

# kajanl: Temporary implementation while child pipeline can't be used, see below.
include:
  # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Auto-DevOps.gitlab-ci.yml
  #  - template: Auto-DevOps.gitlab-ci.yml
  - template: Jobs/Code-Quality.gitlab-ci.yml
  - template: Security/SAST.gitlab-ci.yml
  - template: Security/Secret-Detection.gitlab-ci.yml

This pipeline uses the following resources, also stored in GitLab. The links, for your convenience, are to GitHub clones:

Puppeteer “provides a high-level API to control headless Chrome or Chromium”. It is needed because there is no known API (as of 10-Nov-2020) for the Focused Build work item ‘Assign Transport Request’ action, which links the TMS transport request to the work item.

[Figure: FB WI Assign Transport Request]

Puppeteer is used to provide automation in the absence of an API, via the (primarily human-facing) Web Dynpro interface. The automation is implemented in ‘assign-transport-request/index.js’. This Node.js script is run in the Docker image ‘DEVOPS_PUPPETEER_IMAGE’, which is prepared specifically for this purpose. It is preloaded with the dependencies of the script, to prevent their repeated download during script installation. Also note how the versions of ‘alpine’, Chromium and puppeteer are kept in sync:

  • FROM node:12-alpine3.12
  • RUN apk add --no-cache chromium=86.0.4240.111-r0
  • RUN npm install … "puppeteer@chrome-86"

Download of another instance of Chromium, normally done when installing puppeteer, is prevented by setting ‘PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true’.
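A sketch of such a Dockerfile, under the assumption that alpine’s Chromium binary lives at ‘/usr/bin/chromium-browser’ (versions as above; the actual ‘DEVOPS_PUPPETEER_IMAGE’ may well differ):

```dockerfile
# Illustrative sketch - not the actual DEVOPS_PUPPETEER_IMAGE.
FROM node:12-alpine3.12

# Pin alpine's Chromium to the version matching puppeteer@chrome-86.
RUN apk add --no-cache chromium=86.0.4240.111-r0

# Skip puppeteer's own Chromium download and point it at the system
# binary (assumed path on alpine).
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true \
    PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser

# Preload the script's main dependency, so it is not downloaded again
# when the pipeline job installs 'assign-transport-request'.
RUN npm install --global "puppeteer@chrome-86"
```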

Pipeline variables

Standard pipeline project / Settings / ‘CI / CD’ / Variables:

READ_API_PATOKEN Personal access token with ‘read_api’ scope
SOLMAN_HOST SolMan host name
SOLMAN_PASS Password of SOLMAN_USER
SOLMAN_USER User authorised to assign and decouple transport requests
TRANSPORT_SERVICE_INSTANCE_KEY Service key of the TMS service instance

Pipeline jobs

[Figure: Standard GitLab Pipeline Jobs 1]

[Figure: Standard GitLab Pipeline Jobs 2]

Job ‘test of some sort’ is only a placeholder for tests such as SAST, dependency scanning and secret detection.

Error handling

If the ‘assign to WI’ automation – the assignment of the transport request to the work item – fails, a ‘png’ screenshot of the Chromium page is deposited in the ‘artifact archive’ of the job.

If the wrong work item is set in ‘.gitlab-ci-vars.sh’ of the ‘trigger’ pipeline by mistake, the description of the TMS transport (unfortunately truncated in Focused Build) shows the project ID and commit at fault, e.g. “WI: 3200000665, commit: 8eb18752fe70781cd62b46ca83e156602f96, project: 4396”. This can be used to investigate the offending commit:

  1. GET https://***.com/api/v4/projects/:project_ID/repository/commits/:commit
  2. Follow the ‘web_url’ link

Unfortunately it is not possible (as of 11-Nov-2020) to search GitLab by the commit hash alone. The pipeline variable ‘READ_API_PATOKEN’ – a personal access token with ‘read_api’ scope – is used to query the project API.
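The two steps above can be sketched in shell. The curl call is shown as a comment, since it needs network access and real credentials; the trimmed sample response below is illustrative:

```shell
#!/bin/sh
# Step 1 (placeholder project ID and commit from the transport description):
#   curl --header "PRIVATE-TOKEN: ${READ_API_PATOKEN}" \
#     "https://***.com/api/v4/projects/4396/repository/commits/8eb18752fe70781cd62b46ca83e156602f96" \
#     > commit.json
# Illustrative trimmed response:
cat > commit.json <<'EOF'
{"id": "8eb18752fe70781cd62b46ca83e156602f96",
 "web_url": "https://***.com/group/project/-/commit/8eb18752fe70781cd62b46ca83e156602f96"}
EOF
# Step 2: extract and follow the 'web_url' link.
jq -r '.web_url' commit.json
```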

Discussion

The logical pipeline presented above assigns the transport request of every new commit to the work item. This can easily lead to tens of assigned transports. With import times of at least 3-5 minutes each, moving these transports to subsequent tiers of the landscape can become very tedious.

The solution we favour is to keep only the latest transport request assigned, decoupling the preceding ones. We recognise the risk that doing so may not recreate the development tier faithfully in subsequent tiers, as these will receive only the last transport, not every one. Our developers must anticipate this pipeline behaviour.

The automation of the ‘Decouple Transport Request’ action is yet to be implemented, as is testing such as SAST.

Author and motivation

Laszlo Kajan is a full stack Fiori/SAPUI5 expert, active in the SAPUI5 field since 2015, now diversifying into the area of SCP development.

The motivation behind this blog post is to provide an example of a solution for using GitLab to orchestrate an MTA build pipeline for Focused Build cloud application development.
