Maximilian Streifeneder

Annotated links: Episode 20 of Hands-on SAP dev with qmacro

This is a searchable description of the content of a live stream recording, specifically “Episode 20 – Continuing with CAP and Java” in the “Hands-on SAP dev with qmacro” series. There are links directly to specific highlights in the video recording. For links to annotations of other episodes, please see the “Catch the replays” section of the series blog post.

This episode, titled “Continuing with CAP and Java”, was streamed live on Fri 19 Apr 2019 and is approximately one hour in length. The stream recording is available on YouTube.

Below is a brief synopsis, and links to specific highlights – use these links to jump directly to particular places of interest in the recording, based on ‘hh:mm:ss’ style timestamps.

Brief synopsis

So far, DJ Adams has been working with the Cloud Application Programming Model in a local development environment; in this episode we start to look at development possibilities on SAP Cloud Platform. We create a new SAP Cloud Platform Business Application (that’s the name of the template in SAP Cloud Platform WebIDE) based on CAP, reusing an already familiar data model more or less unchanged, without spending too much time on data modelling activities. After deploying our DB and service modules, we look into the SAP Cloud Platform Cockpit and locate the corresponding components there.

Finally we implement some Custom Handlers using Java!

00:00:30: DJ Adams introduces me again and explains his somewhat unusual (running) outfit for today’s live stream. He also welcomes our very important viewers – without you and your dedication, it wouldn’t be the same!

00:02:10: Audio issues with the two audio inputs on the live stream – nothing too relevant, unless you are curious for a glance at DJ’s Twitch streaming setup 🙂

00:03:00: CAP Friday – cap/CAP? Yes, that’s the reason why I made DJ wear a cap (which, for once, does not stand for Cloud Application Programming Model) on his head.

00:04:00: A few introductory words about my SAP Community profile and my Twitter handle.

00:05:10: Going over the content we created in the previous Friday stream: We basically started a CAP project with some custom logic written in Java using SAP Cloud Platform WebIDE.

00:05:35: If you are looking for another Twitch streamer: David, alias roberttables, covers all sorts of topics – hardware and software.

00:06:30: Lucia Subatin was at Google Next and has been experimenting with Google Cloud Run and SAP HANA. Interested? Click here for the blog post.

00:07:30: We are always super grateful for any kind of feedback. So, if you do have some, please provide it via the corresponding feedback form.

00:08:10: DJ presents his nice integration of LSP (Language Server Protocol) in Vim.

00:09:45: What have we done so far, and what’s coming today? The general concept of CAP is already familiar to the majority of our live stream viewers, so we will dive into some Java specifics on the basis of our existing data model and the CAP knowledge we already have. It’s mostly a new perspective on concepts we have already learned.

00:11:20: We start by going through the runtime and design-time artifacts we touched on in the last episode. As usual, we begin with the existing folder structure (divided into DB and service modules) and the CDS files it contains, which form the core of our entity and service definitions. After deploying the resulting .hdbcds files again, we focus on the SAP Cloud Platform Cockpit. There we can observe the corresponding HDI container as a so-called service instance, and our service module as a so-called application, along with a lot of further information such as logs, environment variables and much more.

00:20:30: Trying to demonstrate, using the Cloud Foundry command-line interface, what the SAP Cloud Platform Cockpit does at the UI level. Unfortunately, the proxy settings required by my WiFi connection prevent us from doing so.

00:21:40: Double-checking which lines of Java code have been generated, and whether the application is running properly, before we start debugging our Java code.

00:22:50: Before we can debug the Java application, some configuration is needed. With the default “Run > Java Application” settings the debugger does not stop at my breakpoints, so we create an additional run configuration.
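For context, a JVM is generally made debuggable via the standard JDWP agent. As a general illustration of Java remote debugging (not necessarily the exact flag the WebIDE run configuration sets under the hood), the option looks along these lines:

-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000

The additional run configuration takes care of the equivalent for us, so that the debugger can attach to the running application and stop at our breakpoints.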

00:24:40: Side conversation taking place while the Java application is starting in debug mode: Do you use email signatures? Do you have images in there? And are you ashamed of that? 😉

00:25:20: The debugger is attached to our application and we execute OData requests so that we stop at the breakpoint we have just set in our BeforeReadHook method. At that point, we take the opportunity to look at what our method parameters actually carry.
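As a rough sketch of the kind of hook the breakpoint sits in – modeled on the handlers shown further down in this post; the BeforeReadResponse type and the logging line are my assumptions rather than the exact code from the stream:

@BeforeRead(entity = "Orders", serviceName = "CatalogService")
public BeforeReadResponse beforeReadOrders(ReadRequest req, ExtensionHelper helper) {
  // A breakpoint on the next line lets us inspect what the parameters carry,
  // e.g. the requested entity and key information inside 'req'
  System.out.println("BeforeRead called for request: " + req);
  return BeforeReadResponse.setSuccess().response();
}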

 

00:28:00: Even though we already have a Java class with annotated methods for implementing custom logic, we create another full-blown Java class with all kinds of methods for entity operations (CRUD+Q), using the handy SAP Cloud Platform WebIDE wizard. It can be triggered via “srv > New > Entity Operation Hooks”.

00:29:40: Fixing our compilation errors, caused by unresolvable Java classes due to missing import statements, using the Problems view.

00:30:40: We start to implement custom logic for the “AfterRead” operation and manipulate the original data coming from the SAP HANA database. After the data is read from the database, we can execute custom logic – in this case via the Java method annotated with “AfterRead”. To explain this flow once more, we take a look at the official documentation of the execution flow and of how Custom Handlers are registered to hooks. And, just for fun, we demystify the original source of all those 4711s.

 

// Runs after "Orders" entities have been read from the database in the CatalogService
@AfterRead(entity = "Orders", serviceName = "CatalogService")
public ReadResponse afterReadOrders(ReadRequest req, ReadResponseAccessor res, ExtensionHelper helper) {
  // Take the data that was read and add/overwrite the "amount" element with the value 4711
  EntityData ed = res.getEntityData();
  EntityData ex = EntityData.getBuilder(ed).addElement("amount", 4711).buildEntityData("Orders");

  // Return the manipulated entity data as a successful read response
  return ReadResponse.setSuccess().setData(ex).response();
}

00:36:40: Execute and test our recently developed custom logic to overwrite a certain attribute of the requested single entity.

00:40:50: DJ created a Node.js implementation with more sophisticated logic in one of his earlier live stream episodes: reducing the stock of a certain book by the ordered amount, and additionally validating the incoming order request (is an amount provided at all, etc.). In the following minutes, we try to reconstruct that in Java.

First of all, we check whether a valid order amount is provided by the request sender, and return an HTTP 400 if that is not the case.

@BeforeCreate(entity = "Orders", serviceName = "CatalogService")
public BeforeCreateResponse beforeCreateOrders(CreateRequest req, ExtensionHelper helper)
    throws DatasourceException {
  // EntityData data = req.getData();

  // Validation of order-amount:
  EntityData orderItem = req.getData();
  Integer amount = (Integer) orderItem.getElementValue("amount");
  if (amount == null || amount <= 0) {
    return getErrorResponse(400, "it's your fault: order at least one book");
  }

  return BeforeCreateResponse.setSuccess().response();

}

private static BeforeCreateResponse getErrorResponse(int statusCode, String msg) {
  return BeforeCreateResponse
      .setError(ErrorResponse.getBuilder().setStatusCode(statusCode).setMessage(msg).response());
}

 

00:48:20: We test the recent implementation via Postman (a tool used to send HTTP requests and inspect the responses).
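If you prefer code over a GUI client, the same check can be scripted. Here is a minimal, self-contained sketch using java.net.http.HttpClient (Java 11+); the service URL, the OData path and the payload values are assumptions for illustration, not the exact ones from the stream:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OrderRequestSketch {
  public static void main(String[] args) throws Exception {
    // An order without a positive amount should trigger the HTTP 400 from our BeforeCreate hook
    String body = "{\"book_ID\": 421, \"amount\": 0}"; // 421 is a hypothetical book ID

    HttpRequest request = HttpRequest.newBuilder()
        // hypothetical URL – replace with the route of your deployed service module
        .uri(URI.create("https://my-bookshop-srv.cfapps.eu10.hana.ondemand.com/odata/v2/CatalogService/Orders"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());

    // Expecting a 400 with the "order at least one book" message
    System.out.println(response.statusCode() + " " + response.body());
  }
}

Postman does exactly the same thing, just interactively, which is what we use in the stream.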

 

00:51:00: Implementing a validation that checks whether there are still enough books in stock for the incoming order; if not, we return an error saying the order amount cannot be fulfilled. The enhanced Java method now looks like this:

@BeforeCreate(entity = "Orders", serviceName = "CatalogService")
public BeforeCreateResponse beforeCreateOrders(CreateRequest req, ExtensionHelper helper)
    throws DatasourceException {
  // EntityData data = req.getData();
  // //TODO: add your custom logic / validations here...

  // Validation of order-amount:
  EntityData orderItem = req.getData();
  Integer amount = (Integer) orderItem.getElementValue("amount");
  if (amount == null || amount <= 0) {
    return getErrorResponse(400, "it's your fault: order at least one book");
  }

  // Read the ordered book's current stock from the Books entity
  Integer id = (Integer) orderItem.getElementValue("book_ID");
  EntityData stockBook = helper.getHandler().executeRead("Books", Collections.singletonMap("id", id),
      Arrays.asList("id", "stock"));

  int stock = (Integer) stockBook.getElementValue("stock");

  // not enough books in stock
  if (stock < amount) {
    return getErrorResponse(409, "I'm sorry, you are too late. not anymore enough in stock");
  }

  return BeforeCreateResponse.setSuccess().response();
}

 

00:58:10: Re-running the application to check whether our recent implementation is fine. Apparently, it is not… We will take a look at the logs and the corresponding exception in the next Friday live stream 😉
