
SAP Leonardo Machine Learning Foundation - Bring Your Own Model (Part 2)

Introduction

This blog post continues the Deploying an ML Model on SAP Cloud Platform series; if you have not gone through the first blog post yet, I recommend reading it first. In this post I will show how to deploy the model we created in the previous part to Cloud Foundry, and how to run inference against the deployed model using the TensorFlow Serving API.

Figure : Directory structure after training the model

1.1 Prerequisites

You should be familiar with SAP Cloud Foundry and with creating service instances and service keys; if you don't know how to use the ML Foundation services, I recommend going through the SCP tutorial first. For this blog post I assume you already know how to create service instances and keys. You must also have access to a productive sub-account, because this service is not available in trial accounts.

Figure : Service key of the ML Foundation service

I would also suggest going through the first blog post to get a clear picture.

1.2 Dependencies

import os
import requests                                        # REST calls to the ML Foundation APIs
import grpc                                            # secure channel to the model server
import tensorflow as tf
from tensorflow.keras.datasets import fashion_mnist   # optional: sample test images
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

Please install and import all of these dependencies. In a local environment, type pip install <Dependency-Name>; for an Anaconda environment, use conda install <Dependency-Name>. A combined install command is shown after the list.

  1. grpcio
  2. tensorflow
  3. tensorflow-serving-api
  4. requests
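
For a plain pip setup, all four can be installed in one go (pin versions if your TensorFlow release requires it):

pip install grpcio tensorflow tensorflow-serving-api requests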

Deployment

 

2.1 Getting a Token

def get_request_header():
    XSUAA_BASE_URL = "<YOUR XSUAA_BASE_URL>"
    CLIENT_ID = "<YOUR CLIENT_ID>"
    CLIENT_SECRET = "<CLIENT_SECRET>"

    # OAuth2 client-credentials flow against the XSUAA token endpoint.
    response = requests.post(url=XSUAA_BASE_URL + '/oauth/token',
                             data={'grant_type': 'client_credentials',
                                   'client_id': CLIENT_ID,
                                   'client_secret': CLIENT_SECRET})
    access_token = response.json()["access_token"]
    return {'Authorization': 'Bearer {}'.format(access_token), 'Accept': 'application/json'}

This utility function fetches a bearer token for authorization. Open your ML Foundation service key; at the bottom you will find the URL property. Append '/oauth/token' to that URL and fill in your CLIENT_ID and CLIENT_SECRET from the same service key. With the token fetched, we are good to go.
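
If you want to verify the credentials in isolation first, a minimal sanity check (assuming XSUAA_BASE_URL, CLIENT_ID and CLIENT_SECRET are set as above) could look like this:

# A sketch: raise_for_status() turns a 401 (wrong client id/secret) into an
# exception here, instead of failing later with a KeyError on "access_token".
response = requests.post(url=XSUAA_BASE_URL + '/oauth/token',
                         data={'grant_type': 'client_credentials',
                               'client_id': CLIENT_ID,
                               'client_secret': CLIENT_SECRET})
response.raise_for_status()
print(response.json()['expires_in'])  # token lifetime in seconds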

2.2 Upload Model

def upload_model():
    MODEL_REPO_URL = "<YOUR MODEL_REPO_URL>/api/v2/models/fashionClassifier/versions"
    headers = get_request_header()
    # The archive name must match the model name in the URL, or the upload fails.
    with open('fashionClassifier.zip', 'rb') as file:
        files = {'file': file}
        response = requests.post(MODEL_REPO_URL, files=files, headers=headers)
    print(response.json())

Now that we have the OAuth token, take MODEL_REPO_URL from the service key. Zip the saved model and pass the path of the archive in the file parameter. If the request is successful, you will see modelName and fileName in the response; note that the model name and the file name must be the same, or the service will throw an error. If you are unsure how to create the archive, see the sketch below.
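
A one-liner with Python's standard library is enough; here 'fashionClassifier' is assumed to be the SavedModel export directory from the first part, so adjust it to your own directory name:

import shutil

# Creates fashionClassifier.zip from the contents of the export directory.
# Remember: the archive name must match the model name used in the upload URL.
shutil.make_archive('fashionClassifier', 'zip', root_dir='fashionClassifier')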

2.3 Deploy Model

def deploy_model():
    DEPLOYMENT_API_URL = "<YOUR DEPLOYMENT_API_URL>/api/v2/modelServers/"
    headers = get_request_header()
    data = {
        "specs": {
            "models": [
                {
                    "modelName": "fashionClassifier",
                    "modelVersion": 1
                }
            ],
            "modelRuntimeId": "tf-1.8",     # TensorFlow runtime for the model server
            "resourcePlanId": "standard"    # compute resources allocated to it
        }
    }
    response = requests.post(DEPLOYMENT_API_URL, json=data, headers=headers)
    print(response.json())

With the model successfully uploaded to Cloud Foundry, take DEPLOYMENT_API_URL from the service key, get a request token, and choose the model specifications, i.e. the model runtime and resource plan, according to your needs. Several model runtimes and resource plans are available for deploying a model; you can check them out here: Link. Once the model is deployed successfully on CF, it starts a TensorFlow Serving model container.

2.4 Model Status

def model_status():
    DEPLOYMENT_API_URL = "<DEPLOYMENT_API_URL>/api/v2/modelServers"
    headers = get_request_header()
    payload = {'modelName': 'fashionClassifier'}
    response = requests.get(DEPLOYMENT_API_URL, params=payload, headers=headers)
    print(response.json())
    return response.json()

Specify DEPLOYMENT_API_URL and create a payload with modelName as a parameter, then send the request. In the response, check the model status; once it is 'SUCCEEDED' you will also receive the model server details. The 'endpoints' property contains details like 'host', 'port' and 'caCrt', which we will use with gRPC to classify images against the model running on the remote server. Since deployment takes a few minutes, it is handy to poll for this status, as in the sketch below.
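
A minimal polling sketch that reuses model_status(); the top-level 'status' field on each model server entry is an assumption based on the response described above, so verify it against the JSON your tenant actually returns:

import time

def wait_until_deployed(timeout=600, interval=15):
    # Poll the deployment API until the model server reports SUCCEEDED,
    # or give up after `timeout` seconds.
    deadline = time.time() + timeout
    while time.time() < deadline:
        servers = model_status().get('modelServers', [])
        if servers and servers[0].get('status') == 'SUCCEEDED':
            return True
        time.sleep(interval)
    return False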

2.5 Inference

def apply_inference():
    metadata = []
    headers = get_request_header()
    model_details = model_status()
    # The gRPC metadata carries the same bearer token used for the REST calls.
    metadata.append(('authorization', headers['Authorization']))

    MODEL_NAME = model_details['modelServers'][0]['specs']['models'][0]['modelName']
    MODEL_SERVER_HOST = model_details['modelServers'][0]['endpoints'][0]['host']
    MODEL_SERVER_PORT = int(model_details['modelServers'][0]['endpoints'][0]['port'])
    ROOT_CERT = model_details['modelServers'][0]['endpoints'][0]['caCrt']

    # Open a TLS-secured channel to the TensorFlow Serving container.
    credentials = grpc.ssl_channel_credentials(
        root_certificates=ROOT_CERT.encode())
    channel = grpc.secure_channel('{}:{}'.format(
        MODEL_SERVER_HOST, MODEL_SERVER_PORT), credentials)
    stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

    # Preprocess the sample image the same way as during training:
    # grayscale, float32 in [0, 1], resized to 28x28.
    with open('image.jpg', 'rb') as f:
        data = f.read()
    decode_img = tf.image.decode_jpeg(data, channels=1)
    img = tf.image.convert_image_dtype(decode_img, tf.float32)
    img = tf.image.resize(img, [28, 28])
    img = tf.reshape(img, [-1, 28, 28, 1])

    request = predict_pb2.PredictRequest()
    request.model_spec.name = MODEL_NAME
    request.model_spec.signature_name = 'serving_default'
    request.inputs['input_1'].CopyFrom(
        tf.compat.v1.make_tensor_proto(img, shape=[1, 28, 28, 1]))
    # The second argument to Predict is the request timeout in seconds.
    print(stub.Predict(request, 100, metadata=tuple(metadata)))

if __name__ == "__main__":
    upload_model()
    deploy_model()
    # Deployment is asynchronous: wait until model_status() reports
    # 'SUCCEEDED' (see the polling sketch above) before running inference.
    apply_inference()

Let's understand two terms here:

gRPC : A modern, open-source remote procedure call (RPC) framework, which allows us to call our model on the remote server through RPC. For details, visit this link.

TensorFlow Serving : A flexible, high-performance serving system for machine learning models, designed for production environments. For details, visit this link.

We have successfully deployed the model on CF. Get the model server details ('MODEL_NAME', 'MODEL_SERVER_HOST', 'MODEL_SERVER_PORT' and 'ROOT_CERT') by calling model_status. Open any sample image, reshape it and convert it into a tensor. In signature_name, give the signature of the model that was specified during training, and use the input name (input_1); if anything is unclear, please check out my first blog post. Send the request and you will see the probabilities for each class in the response. If you prefer a readable label over raw probabilities, see the sketch below.
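
A minimal sketch for decoding the PredictResponse; the class order assumes the standard Fashion-MNIST labels, and the output tensor is looked up by key rather than hard-coded, so verify both against your own model's signature:

import numpy as np

# Standard Fashion-MNIST class order (an assumption; verify against Part 1).
CLASS_NAMES = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']

def decode_prediction(response):
    # Take the first (here, the only) output tensor of the serving signature.
    output_key = list(response.outputs.keys())[0]
    probs = np.array(response.outputs[output_key].float_val)
    return CLASS_NAMES[int(np.argmax(probs))], float(probs.max())

For the sample image below, this should yield 'Ankle boot'.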

Figure : Sample image (label: ankle boot)

Figure : Predictions

 

Additional Notes

Here is the latest news regarding the integration of ML Foundation with SAP Data Hub:

In close collaboration with the Data Hub Team, we combined ML Foundation with Data Hub and added new capabilities for managing ML models end-to-end. We delivered our new joint product SAP Data Intelligence as generally available (GA) in July 2019. SAP Data Intelligence is our comprehensive solution to deliver data-driven innovation and intelligence across the enterprise, unifying scalable enterprise AI and intelligent information management.
Consequently, we have removed ML Foundation from the subscription price list. New sales via Cloud Platform Enterprise Agreement (CPEA) are possible at least until the end of 2019 as a transition period. Existing customers will receive support and can continue to use ML Foundation until the end of their contract term.

https://jam4.sapjam.com/groups/PRO7Ysv9nskZpmoqalOi7T/overview_page/wyuvZ5AITgmBSgQlkpW7DM
