Gaurav Abbi

Building CI CD pipelines for Kyma runtime extensions and applications

With more and more developers building extensions and applications using SAP BTP, Kyma runtime, it is only natural to implement CI/CD flows.

As a developer, I would like to ensure:

  • Code quality of my Kyma functions and microservices
  • Checks for code styling and security best practices are enforced
  • Sufficient code coverage to have confidence in the production deployments
  • Fewer surprises and fewer breaking changes

I would also like seamless, automated deployments:

  • Assets such as Docker images are created automatically
  • Functions and microservices are automatically deployed to my desired Kyma environment and namespace
  • All necessary configurations are also applied
  • My source control (e.g., Git) is the source of truth for all my deployments

Below is an example flow using GitHub Actions (which can easily be replaced by any similar service, such as Jenkins).


Figure: Developer CI/CD flow


The initial part is a standard PR flow, where certain checks are performed on a pull request to ensure it can be safely merged to the main branch.

Post merge, the assets (in this case, Docker images) are built for the microservices and pushed to a Docker registry.

Functions are configured using the Kyma GitRepository feature, which automatically pulls the source code and updates the running function.
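As a sketch, a function wired to a Git repository could be declared as below. The repository URL, resource names, directory, and runtime are illustrative, not taken from the sample project:

```yaml
# Hypothetical GitRepository pointing at the function sources
apiVersion: serverless.kyma-project.io/v1alpha1
kind: GitRepository
metadata:
  name: sample-functions
spec:
  url: "https://github.com/example-org/sample-extension.git"
---
# Hypothetical Function built from that repository; Kyma pulls the
# sources from the given reference and updates the running function
apiVersion: serverless.kyma-project.io/v1alpha1
kind: Function
metadata:
  name: order-handler
spec:
  type: git
  source: sample-functions
  reference: main
  baseDir: /functions/order-handler
  runtime: nodejs14
```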


Deploying to Dev

Deployment resources are defined as Helm charts, which are available under k8s-resources.

Deployment to a dev landscape is done via the GitHub workflow deploy-to-dev.

Any dev landscape specific configuration is provided in the values-dev.yaml.

For any confidential data, you can create GitHub secrets and consume them as environment variables in the workflow.
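A minimal sketch of such a step (the secret name COMMERCE_API_KEY and the Helm value commerce.apiKey are illustrative, not part of the sample):

```yaml
# Hypothetical workflow step: a repository secret is exposed as an
# environment variable and passed to Helm, so it never lands in Git
- name: Helm Deployment
  working-directory: k8s-resources
  env:
    COMMERCE_API_KEY: ${{ secrets.COMMERCE_API_KEY }}  # defined in repository settings
  run: >
    helm upgrade k8s-resources . -f ./values-dev.yaml --install
    --set commerce.apiKey="$COMMERCE_API_KEY"
```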

Non-expiring kubeconfigs are obtained by following the instructions in kubeconfig-for-sa.
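Such a kubeconfig is typically backed by a Kubernetes service account. A sketch of the resources involved (names, namespace, and the bound role are illustrative assumptions):

```yaml
# Hypothetical ServiceAccount whose token is used to build the
# non-expiring kubeconfig consumed by the CI pipeline
apiVersion: v1
kind: ServiceAccount
metadata:
  name: github-actions-deployer
  namespace: dev
---
# Grant the service account deploy permissions in the target namespace
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: github-actions-deployer
  namespace: dev
subjects:
  - kind: ServiceAccount
    name: github-actions-deployer
    namespace: dev
roleRef:
  kind: ClusterRole
  name: admin
  apiGroup: rbac.authorization.k8s.io
```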

It uses various actions from Microsoft Azure to set up the Kubernetes environment, such as:

  • Setting the Kubernetes context
  • Setting up Helm

The KUBECONFIG is configured safely as an environment secret in the repository settings.


Below are the steps in the deploy-to-dev workflow.

  # This workflow contains a single job called "build"
  name: deploy-to-dev
  on:
    push:
      branches: [main]
  jobs:
    build:
      # The type of runner that the job will run on
      runs-on: ubuntu-latest
      # Steps represent a sequence of tasks that will be executed as part of the job
      steps:
        # Checks out your repository under $GITHUB_WORKSPACE, so your job can access it
        - uses: actions/checkout@v2
        - uses: azure/k8s-set-context@v1
          with:
            method: kubeconfig
            kubeconfig: ${{ secrets.DEV_KUBECONFIG }}
        - uses: azure/setup-helm@v1
          with:
            version: 'v3.5.1'
        - name: Helm Deployment
          working-directory: k8s-resources
          run: helm upgrade k8s-resources . -f ./values-dev.yaml --install
        - name: Run smoke tests on dev environment
          run: echo "running smoke tests"


The workflow will deploy the workloads as well as all the necessary configurations, such as:

  • API Rules
  • ServiceBindingUsage
  • K8s Deployments, Services
  • and any other required resources.
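For illustration, an APIRule as it might be templated in the chart. The service name, host, port, and access strategy are hypothetical, not the sample's actual values:

```yaml
# Hypothetical APIRule exposing the orders microservice via the Kyma gateway
apiVersion: gateway.kyma-project.io/v1alpha1
kind: APIRule
metadata:
  name: orders-api
spec:
  gateway: kyma-gateway.kyma-system.svc.cluster.local
  service:
    name: orders-service
    port: 8080
    host: orders
  rules:
    - path: /.*
      methods: ["GET", "POST"]
      accessStrategies:
        - handler: noop
```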

The only exception is creating the Service Instances for the required services and events. For services, Secrets also need to be created manually. In future releases, these manual steps will also be obviated.



The corresponding details (secret name and gateway URL variable name) then need to be configured in values-dev.yaml and values-prod.yaml for the dev and prod landscapes respectively.
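A sketch of what such an entry in values-dev.yaml could look like (all keys and values are illustrative, not the sample's actual schema):

```yaml
# Hypothetical excerpt from values-dev.yaml
commerce:
  # Name of the Secret created for the service instance
  secretName: commerce-binding-secret
  # Name of the environment variable holding the gateway URL
  gatewayUrlKey: GATEWAY_URL
```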


Promote to production


The workflow to promote to production is very similar to the one used for deploying to the dev landscape. The only differences are that it uses a different kubeconfig and values-prod.yaml for the Helm installation.
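To illustrate, the production workflow would differ only in these steps. The secret name PROD_KUBECONFIG mirrors the naming used for dev but is an assumption:

```yaml
# Hypothetical excerpt from the promote-to-production workflow:
# only the kubeconfig secret and the values file change
- uses: azure/k8s-set-context@v1
  with:
    method: kubeconfig
    kubeconfig: ${{ secrets.PROD_KUBECONFIG }}
- name: Helm Deployment
  working-directory: k8s-resources
  run: helm upgrade k8s-resources . -f ./values-prod.yaml --install
```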


Deployed Samples

The deployed samples demonstrate a typical extension scenario.

An extension (function) is triggered by an event (order.created) from an SAP system (in this case, a mock SAP Commerce Cloud). The logic then retrieves the order details by making an API call via the API gateway and stores them.
The stored details are then available via an API exposed using an APIRule.





The sample project can be accessed in the SAP-Samples GitHub repository.

Sergei Chevtsov

      Does SAP plan to support this pipeline as part of SAP Continuous Integration and Delivery - SAP Help Portal?

Thorsten Duda

      Hi Sergei,

yes, it is on our roadmap.

      Best regards,