uwevoigt123
Edit on Jan 31, 2020: The source is available here: https://github.com/uvoigt/scpi-ci-tools

Introduction


The development cycle of a cloud application typically has certain common ingredients:

  • We store development artifacts within a code repository like Git.

  • We have an integrated continuous build and delivery pipeline which targets multiple deployment stages.

  • We can use a CLI for the runtime platform which gives us the chance to use common shell-based tooling for some monitoring tasks.


Since SAP Cloud Platform Integration still does not fully support this scenario, the public OData APIs can fill the gap, at least partly.

 

This blog first explains how we defined our development and staging workflow and then shows how I used these APIs by putting them all together in a Bash-based CLI.

Usage and workflow


We have three stages (three subaccounts) on the Cloud Platform. This is useful for restricting access to the required roles and for stage-specific scaling.

Development of integration flows


This takes place in the development environment. It is the only stage where integration flows need to be in the design workspace, because we have to use the integration flow editor there. Each developer may have their own package to keep their work visibly separated.

Start of the development


When starting the development of an integration flow, it is either created from scratch or copied from another package in the ‘Discover’ section.

When copied, the first action is to execute
scpi design download -p <artifact-id>

which downloads the integration flow, creates a Git repository locally and remotely via the Bitbucket API, and finally pushes the newly created repository to the remote.
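Roughly, those steps can be sketched as a small shell function. This is a simplified illustration, not the CLI's actual implementation: the artifact ID, branch, and remote URL are illustrative, and the Bitbucket repository-creation API call is omitted.

```shell
#!/bin/sh
# Sketch of what the download command does behind the scenes, assuming the
# downloaded zip was already extracted into iflow_<artifact-id>.
init_iflow_repo() {
  # $1 = artifact ID, $2 = remote repository URL
  cd "iflow_$1" &&
  git init &&
  git checkout -b develop &&
  git add -A &&
  git commit -m "Initial import of $1" &&
  git remote add origin "$2" &&
  git push --set-upstream origin develop
}

# usage (illustrative):
# init_iflow_repo MyFlow "git@bitbucket.org:myworkspace/iflow_MyFlow.git"
```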

Since the repository already contains a bitbucket-pipelines.yml, it takes only a couple of seconds until the integration flow is deployed within the development environment runtime.
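Below is a minimal sketch of what such a bitbucket-pipelines.yml could look like. The image name, the deploy-config handling, and the exact scpi invocation are assumptions modeled on the descriptions later in this post:

```yaml
# Illustrative only: image name, variable handling and deploy call are assumptions.
image: my.registry.example.com/scpi-cli:latest

pipelines:
  branches:
    develop:
      - step:
          name: Deploy to development
          script:
            - echo "$DEPLOY_CONFIG" | base64 -d > deploy.cfg
            - . ./deploy.cfg && scpi runtime deploy <artifact-id>
    test:
      - step:
          name: Deploy to test
          script:
            - echo "$DEPLOY_CONFIG" | base64 -d > deploy.cfg
            - . ./deploy.cfg && scpi runtime deploy <artifact-id>
```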

Modify an integration flow or fix a bug


The integration flow might have been removed from the design workspace (we do not need to keep it there because it is stored in the Git repository). If that is the case, clone the remote repository or execute
git checkout develop
git pull

and then execute
scpi design create <artifact-id>

Apply the modifications to the integration flow. When done, execute
scpi design download -p <artifact-id>

to push the changes to the remote repository. The integration flow might already have been deployed by the developer for testing purposes in the meantime. Nevertheless, when the push occurs, the pipeline finally deploys it to the development environment.

Bring an integration flow to the test stage


If this is the first time the integration flow is pushed to the test stage, a branch ‘test’ has to be created and prepared. This is done by cloning the repository or executing
git checkout develop
git pull

Then
git checkout -b test

Now the file src/main/resources/parameters.prop has to be modified so that all stage-specific variables, like endpoints to the Commerce Cloud, match those of the test stage.
That file contains all externalized parameters that are defined using the integration flow editor.

Then
git push --set-upstream origin test

Now the pipeline deploys this integration flow to the testing stage for the first time.

 

For subsequent merges into the test branch execute
git merge develop

instead of creating a new branch.

If modifications to the parameters.prop file have been applied in development, this always requires manual merge conflict resolution. The developer should be careful when doing this!
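One way to handle that conflict can be sketched as follows. This is only an illustration, assuming the repository layout described above: take the test-stage version of parameters.prop as the starting point, re-apply the intended changes by hand, then finish the merge (the usual git push, which triggers the pipeline, is omitted here).

```shell
#!/bin/sh
# Sketch: resolve the expected conflict on parameters.prop by keeping the
# test-stage version as the starting point.
resolve_parameters_conflict() {
  git checkout test
  git merge develop || true   # the merge stops with a conflict on the file
  git checkout --ours src/main/resources/parameters.prop
  # edit src/main/resources/parameters.prop here if needed, then:
  git add src/main/resources/parameters.prop
  git commit -m "Merge develop into test"
}
```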

Bring an integration flow to the production stage


This is almost the same as for the test stage, except that we name the branch ‘release-<version>’. We can use the supporting functions of the Git hosting environment, such as branch protection or manual intervention when executing pipelines.

Parts of the CLI and their API counterparts


There is a main entry point called scpi which dispatches to concrete scripts implementing the functionality. We also have an scpi-completion.sh that lets the user press tab twice to get sub-command or argument completion in the shell.

The following arguments are necessary to enable authentication via OAuth2. Once given, they are saved within $HOME/.scpi/config for subsequent invocations.


OAuth authentication


The scripts support two kinds of OAuth flows: one that can be used by human users from their shells, and a second that is meant for the continuous integration pipeline.

OAuth authentication of a human user


The flow contains a redirect URI that is hit after the user has authenticated at the Cloud Platform. It is served by a very simple web server written in C that is launched by the authentication part of the scripts and stopped after the authorization code has been grabbed from the log file.
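The same idea can be sketched in plain shell: a one-shot listener catches the browser's redirect and a small filter extracts the authorization code from the request line. This is a hedged illustration, not the author's C server; the port is arbitrary and nc flags vary between implementations.

```shell
#!/bin/sh
# Extract the OAuth authorization code from the HTTP request line that the
# browser sends to the local redirect URI, e.g.
# "GET /callback?code=abc123&state=xyz HTTP/1.1"
parse_code() {
  sed -n 's/.*[?&]code=\([^& ]*\).*/\1/p'
}

# One-shot listener sketch (flags differ between nc variants):
# code=$(nc -l 8080 | parse_code)
```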


OAuth authentication within the CI


This scenario requires a secret and a password, which are configured within the pipeline execution environment. In my case this is Bitbucket.



Below is how it looks in Bitbucket. I use a custom Docker registry to pull a thin image with the CLI built in, and the variable DEPLOY_CONFIG contains a base64-encoded file with all the necessary secrets and passwords for the stages.



This is a sample content of the deploy config file:

 

#!/bin/sh
# the variable prefix must match the deployment environment
export DEVELOPMENT_ACCOUNT_ID='devAccountId'
export DEVELOPMENT_OAUTH_PREFIX='devOauthPrefix'
export DEVELOPMENT_CLIENT_CREDS=' odata_ci:password'
export TEST_ACCOUNT_ID='testAccountId'
export TEST_OAUTH_PREFIX='testOauthPrefix'
export TEST_CLIENT_CREDS=' odata_ci:password'
export PRODUCTION_ACCOUNT_ID='prodAccountId'
export PRODUCTION_OAUTH_PREFIX='prodOauthPrefix'
export PRODUCTION_CLIENT_CREDS=' odata_ci:password'
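With such a file sourced, the pipeline can fetch a token via the OAuth2 client-credentials grant. The token endpoint differs per tenant, so the URL shape below is only an assumption; adapt it before use.

```shell
#!/bin/sh
# Sketch: build the token URL for a stage from the sourced deploy config.
# The endpoint path is an assumption, not the tenant's actual URL.
token_url() {
  # $1 = stage prefix, e.g. DEVELOPMENT
  eval "prefix=\$${1}_OAUTH_PREFIX"
  echo "https://${prefix}/oauth/token?grant_type=client_credentials"
}

# usage in the pipeline (requires curl and jq):
# eval "creds=\$${STAGE}_CLIENT_CREDS"
# TOKEN=$(curl -s -u "$creds" -X POST "$(token_url "$STAGE")" | jq -r .access_token)
```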

 

scpi design packages


This lists the packages of the design workspace.
Unfortunately, there is no public OData API to get the design time artifacts. Therefore, I used the same API that is used by the Cloud Platform Integration web application. Since OAuth authentication is not possible here, we have to type user name and password.
While listing the packages, their IDs are additionally stored within $HOME/.scpi/packages so that they can be used by the bash completion script. (If those APIs supported OAuth2, this would not be necessary.)
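The completion script can then filter the cached IDs against what the user has typed so far. A minimal sketch, assuming the cache holds one package ID per line:

```shell
#!/bin/sh
# Print the cached package IDs that match the prefix typed so far.
complete_packages() {
  grep "^$1" "$HOME/.scpi/packages" 2>/dev/null
}
```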


scpi design artifacts


This lists the design time artifacts in all packages. A package ID can be specified to list only the artifacts of that package.
While listing the artifacts, their IDs are additionally stored within $HOME/.scpi/artifacts so that they can be used by the bash completion script.

scpi design download


With this command a design time artifact can be downloaded. The corresponding API is GET /IntegrationDesigntimeArtifacts(Id='{Id}',Version='{Version}')/$value

The downloaded zip is extracted and by default copied to the directory $GIT_BASE_DIR/iflow_<artifact_id>. The zip program asks you to confirm overwriting existing files, which we normally confirm because this is a Git repository that gives us all the tooling to compare with previous commits.
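A rough sketch of the download step; the host name and the /api/v1 base path are assumptions, so adapt them to your tenant (and note that unzip -o overwrites without the interactive prompt):

```shell
#!/bin/sh
# Sketch: build the OData URL for fetching a design time artifact's content.
download_url() {
  # $1 = tenant host, $2 = artifact ID, $3 = version
  echo "https://$1/api/v1/IntegrationDesigntimeArtifacts(Id='$2',Version='$3')/\$value"
}

# usage ($TOKEN from the OAuth step; names illustrative):
# curl -s -H "Authorization: Bearer $TOKEN" -o MyFlow.zip \
#   "$(download_url tmn.example.com MyFlow Active)"
# unzip -o MyFlow.zip -d "$GIT_BASE_DIR/iflow_MyFlow"
```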

scpi design create


This can be used to create an artifact within the design workspace (the used API is POST /IntegrationDesigntimeArtifacts).

Options


<Artifact ID> – when pressing tab for completion, all local artifacts (directories starting with iflow_ within the configured workspace root) are presented

<Package ID> – when pressing tab for completion, the packages saved in $HOME/.scpi/packages are presented

[Artifact Name] – if not specified, this is the same as the artifact ID

[Folder] – the directory to read the artifact from. By default, this is $GIT_BASE_DIR/iflow_<artifact_id>

 

With this command, it is possible to only have the integration flows within the design workspace which you currently work on.

scpi design delete


This lets you delete an artifact from the design workspace. It corresponds to the API DELETE /IntegrationDesigntimeArtifacts(Id='{Id}',Version='{Version}')

scpi design deploy


Only there for completeness. This corresponds to the API POST /IntegrationDesigntimeArtifact.DeployIntegrationDesigntimeArtifact

scpi runtime artifacts


This lists all deployed artifacts corresponding to the API GET /IntegrationRuntimeArtifacts


scpi runtime deploy


Triggers the deployment of the artifact uploaded. This corresponds to the API POST /IntegrationRuntimeArtifacts

This enables us to deploy artifacts without having them within the design workspace.

 

Conclusion


This was an overview of how the OData APIs can be used to create a developer workflow that better fits what one might be used to from other cloud environments.

There are more functions included, like displaying message processing logs or invoking endpoints (with another OAuth client involved, because this needs different permissions), but that would exceed the scope of this article.