Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!
nunomcpereira
This blog is part of a series; you can find the first post here: https://blogs.sap.com/2023/02/02/sap-cpi-ci-cd-from-from-zero-to-hero/. This is the agenda we're following:

Code inspection and quality control


One of the key aspects of our interfaces is code quality and consistency. Since we have many interfaces, we need to make sure that all of them follow our development guidelines framework. After some investigation, I found CPILint, an open-source tool by Morten Wittrock that brings linting to CPI.


CPILint checking code compliance with development guidelines


I think the code is great, so special thanks to Morten for providing such a tool. More details about CPILint are available on his GitHub (https://github.com/mwittrock/cpilint) and in his SAP blog post: https://blogs.sap.com/2019/02/01/meet-cpilint/
Although Morten has released version 1.0.4, we're currently using version 1.0.3, and I'm in the process of migrating to 1.0.4. In summary, the tool reads the code of an integration flow and checks it against rules defined in an XML file.

Unfortunately, our company has specific guidelines/rules that the "standard" tool does not cover, so I forked Morten's repository to allow it to register extension rules provided in a separate project. The forked code (https://github.com/nunomcpereira/cpilint) is now able to search the classpath for extension jars containing extra rules. I also had to make some methods public here and there so they could be used by these extra rules.
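As a rough sketch of how such classpath-based extension discovery could work, here is a minimal example using Java's `ServiceLoader`. The `LintRule` interface and `RuleRegistry` class are hypothetical illustrations, not the fork's actual API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical rule contract; CPILint's real rule interface differs in detail.
interface LintRule {
    String id();
}

public class RuleRegistry {
    // Collect built-in rules plus any extension rules found on the classpath.
    public static List<LintRule> discoverRules(List<LintRule> builtIn) {
        List<LintRule> all = new ArrayList<>(builtIn);
        // ServiceLoader picks up implementations declared in a
        // META-INF/services entry inside extension jars on the classpath.
        for (LintRule rule : ServiceLoader.load(LintRule.class)) {
            all.add(rule);
        }
        return all;
    }

    public static void main(String[] args) {
        // With no extension jars on the classpath, only built-in rules remain.
        List<LintRule> rules = discoverRules(List.of());
        System.out.println(rules.size());
    }
}
```

The appeal of this pattern is that the base tool never has to know about the extension project at compile time; dropping an extra jar on the classpath is enough.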


CPILint custom (https://github.com/nunomcpereira/cpilint_custom) was then born to hold the custom rules used only at our company. Here are the extension rules we created and what they do:


  • default-names-not-allowed-rule: Since CPI has no concept of comments for individual components, we want meaningful component names that describe the logic of the iflow, so we check that no CPI component keeps a default name such as "Content Modifier 1", "Content Modifier 2" or "Request Reply 1", or "groovy1" for filenames. I heard from Morten that this is now supported in his version 1.0.4, so I'll follow up on that (example below).

  • unused-parameters-rule: How many times have you defined some external parameters that in the end were not used? CPI provides the "Remove unused parameters" button which would work in a similar fashion as this rule. This rule just asserts that all your defined parameters are being used (example of the externalized parameters screen below).

  • allowed-headers-empty: We have main iflows (reached from outside) and internal iflows communicating via process direct. In both scenarios, an empty "Allowed headers" setting can be a problem, because headers would get lost between process direct calls. For main iflows, there are some headers we do allow to receive, like SapAuthenticatedUserName for instance. Right now, our rule configuration deliberately validates only the communications via process direct, without making headers mandatory on the main iflow, but this is configurable on the rule (example of empty allowed headers below).

  • response-headers-allowed: During development, we hit an issue where a target system returned a header that was invalid for CPI. I don't remember the details, but if I recall correctly, the header exceeded the maximum size CPI can handle. From this error, we learned not to accept * by default in the response headers of our HTTP calls. This rule enforces that (example of response header usage on the HTTP adapter below).

  • undeclared-data-type: During development, we realized we had a property defined in a content modifier without a type specified; in that particular scenario this resulted in a runtime error, since CPI assumed the property was somehow a complex object when we wanted a regular String. This rule therefore checks all your properties and enforces that the type field (which is not mandatory in CPI) is filled in (example of a property without a data type below).
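To illustrate the kind of check the default-name rule performs, here is a hedged sketch. The regular expression and the set of component names are assumptions for illustration, not CPILint's actual implementation:

```java
import java.util.List;
import java.util.regex.Pattern;

public class DefaultNameCheck {
    // Assumed pattern: a known default component name followed by a number,
    // e.g. "Content Modifier 1", or "groovy1" for script filenames.
    private static final Pattern DEFAULT_NAME = Pattern.compile(
        "^(Content Modifier|Request Reply|Router|Groovy Script) \\d+$"
        + "|^groovy\\d+(\\.groovy)?$");

    public static boolean isDefaultName(String name) {
        return DEFAULT_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        List<String> names =
            List.of("Content Modifier 1", "Map payload to IDoc", "groovy1");
        for (String name : names) {
            // Flag components whose names were never changed by the developer.
            System.out.println(name + " -> "
                + (isDefaultName(name) ? "default name, please rename" : "ok"));
        }
    }
}
```

A real rule would walk the iflow's component tree and report each offending component, but the core decision is a name match like this one.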


 

All of these rules can be combined with the regular ones provided by the CPILint base code.

Example of a valid rules.xml file containing both CPILint standard rules and custom ones:
<?xml version="1.0"?>
<cpilint>
    <rules>
        <default-names-not-allowed-rule>
            <exclude>Exception Subprocess</exclude>
        </default-names-not-allowed-rule>
        <iflow-matches-name>
            <naming-pattern>FER_(S2P|F2I|M2C|Common|InterfaceName|MessageMappingsUnitTest|)(.*)(Publish|Subscribe|MAIN)$</naming-pattern>
        </iflow-matches-name>
        <disallowed-scripting-languages>
            <disallow>javascript</disallow>
        </disallowed-scripting-languages>
        <cleartext-basic-auth-not-allowed/>
        <!-- <matching-process-direct-channels-required/> We can't use this since we have cross package references and cpilint runs on a package basis -->
        <disallowed-receiver-adapters>
            <disallow>facebook</disallow>
            <disallow>ftp</disallow>
            <disallow>twitter</disallow>
        </disallowed-receiver-adapters>
        <disallowed-sender-adapters>
            <disallow>ftp</disallow>
        </disallowed-sender-adapters>
        <unencrypted-data-store-write-not-allowed/>
        <unencrypted-endpoints-not-allowed/>
        <csrf-protection-required-with-exclude>
            <exclude>FER_DUMMYVALUE_MAIN</exclude>
            <exclude>FER_DUMMYVALUE_MAIN2</exclude>
        </csrf-protection-required-with-exclude>
        <iflow-description-required/>
        <unused-parameters-rule/>
        <allowed-headers-empty>
            <include>(.*)(Publish|Subscribe)$</include>
        </allowed-headers-empty>
        <response-headers-allowed>
            <exclude>\*</exclude>
        </response-headers-allowed>
        <undeclared-data-type/>
    </rules>
</cpilint>

To run it, you can go into the Jenkins directory where your iflow zip files are located and run:
def call(String packageId, String reponame, boolean changesDone) {
    script {
        dir("./IntegrationContent") {
            def localPackage = packageId.startsWith('Test')
            if (!localPackage) {
                def files = findFiles glob: "**/*.zip"
                boolean exists = files.length > 0

                if (exists) {
                    catchError(buildResult: 'UNSTABLE', stageResult: 'UNSTABLE') {
                        bat 'cpilint -rules %CPILINT_HOME%rules.xml -directory ./'
                    }
                }
            }
        }
    }
}
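Assuming the function above lives in a file such as vars/runCpilint.groovy and the library is registered under an illustrative name like cpi-pipeline-lib, a pipeline could then call it as a step (the names here are assumptions, not our actual repository layout):

```groovy
// Jenkinsfile (illustrative): load the shared library and call the step.
@Library('cpi-pipeline-lib') _

pipeline {
    agent any
    stages {
        stage('Lint') {
            steps {
                // The step name matches the file name under vars/.
                runCpilint('MyPackageId', 'my-repo', true)
            }
        }
    }
}
```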

In our case, this file is stored inside the vars folder of the Jenkins git repository and it's interpreted as a shared library on Jenkins, just for the sake of reusability.


CPILint as a custom jenkins shared library


To register custom shared libraries in Jenkins, go to the Jenkins homepage -> Manage Jenkins -> Configure System


Shared library registration on Jenkins


If any of the rules fail, we end the pipeline for the package with a warning and notify the responsible developer.


CPILint Jenkins ending as warning


I'm currently in discussions with Morten (who is currently working on CPILint 1.0.5), so once it's available, I plan to migrate these custom rules to the new version as well.

On top of these checks we are also checking (outside of this tool):

  • That a documentation link is created at package level (with a predetermined name) pointing to our document management system. For this we rely on the resources.cnt file available inside the package zip file once exported; there you can find references to the URLs maintained in the package:
    import groovy.json.JsonSlurper // needed for parsing resources.cnt

    def packageInfo = readFile(file: 'resources.cnt')
    def slurper = new JsonSlurper().parseText(new String(packageInfo.decodeBase64()))
    def documentationUrl = ''
    def hasCustomTags = false
    slurper.resources.each {
        if (it.resourceType == "Url" && (it.name == "YOURPREDETERMINEDNAME" || it.displayName == "YOURPREDETERMINEDNAME")) {
            try {
                documentationUrl = it.additionalAttributes.url.attributeValues[0]
            }
            catch (Exception e) {
                error("Error while reading documentation: ${e}")
            }
        }
    }


  • Validate that the standard tags defined at package level are filled in (example below for the LineOfBusiness tag):
    if (it.resourceType == "ContentPackage") {
        List<String> raiseErrorTags = []
        def customAttribute = it.additionalAttributes.LineOfBusiness
        if (customAttribute != null) {
            def list = customAttribute.attributeValues
            def hasValues = false
            for (int i = 0; i < list.size(); i++) {
                if (list[i] != "") {
                    hasValues = true
                }
            }
            if (!hasValues) {
                raiseErrorTags.push("LineOfBusiness")
            }
        }
        else {
            raiseErrorTags.push("LineOfBusiness")
        }
    }


  • We have a custom tag, a JIRA reference, that is mandatory to fill in so that we can associate JIRA user stories with CPI packages, so we check that this custom tag is filled with the user story ID.
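The JIRA-reference check essentially boils down to matching the tag value against an issue-key pattern. A minimal sketch, assuming the common PROJECT-123 key format (our actual project keys may differ):

```java
import java.util.regex.Pattern;

public class JiraTagCheck {
    // Assumed JIRA issue key format: an uppercase project key, a dash, digits.
    private static final Pattern JIRA_KEY = Pattern.compile("^[A-Z][A-Z0-9]+-\\d+$");

    public static boolean hasValidJiraReference(String tagValue) {
        // Empty or missing tag values fail the check outright.
        return tagValue != null && JIRA_KEY.matcher(tagValue.trim()).matches();
    }

    public static void main(String[] args) {
        System.out.println(hasValidJiraReference("CPI-1234")); // a well-formed key
        System.out.println(hasValidJiraReference(""));         // empty tag fails
    }
}
```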


Next steps



  • Use SonarQube or CodeNarc to lint the Groovy syntax used.

  • Migrate to CPILint 1.0.4 and then to 1.0.5 once released



Summary


In this topic, we introduced the main tool we use to check for code issues (CPILint), as well as the additional checks done to make sure documentation and tags are properly maintained.

I would invite you to share feedback or thoughts in the comments section. I'm sure there are still improvements or ideas for new rules that would benefit the whole community. You can always get more information about Cloud Integration on the topic page for the product.
