Machine learning has been getting a lot of buzz lately, which grabbed my attention. After getting my hands dirty with various machine learning concepts and libraries, I became well acquainted with how they are used. In this blog I will give an overview of a machine learning model, developed in TensorFlow and integrated into SAP HANA using the SAP HANA External Machine Learning library (SAP EML), that can classify any image into one of 24 apparel categories.

For a deeper understanding of the SAP EML architecture, go through the blog below by Ian Henry:

https://blogs.sap.com/2017/12/15/bringing-machine-learning-tensorflow-to-the-enterprise-with-sap-han...

 

Overview


Below are the steps I followed:

1. Installation of SAP HANA Express (server + applications) edition and configuration for SAP EML.

2. Development of a neural network model in Kaggle that can classify any image into 24 apparel categories.

3. Installation of a TensorFlow Model Serving machine and deployment of the developed model.

4. Python-HANA connector for test data upload into a HANA table.

5. SQL script in HANA to call the TensorFlow model.

 

1. Installation of SAP HANA Express (server + applications) edition and configuration for SAP EML:


I installed the SAP HANA Express (server + applications) edition, which is offered as a virtual machine on Google Cloud Platform.

Below is the link to SAP's installation steps for GCP:

https://developers.sap.com/tutorials/hxe-gcp-getting-started-launcher.html

While installing, make sure you select the server + applications edition, since only the applications edition gives access to the XS apps such as HANA Cockpit administration and Web IDE.

Below are some commands that can come in handy; make sure you are logged in as the OS user on the HANA system for these commands to work.

i.   HDB start --> starts the HANA DB.

ii.  HDB stop --> stops the HANA DB.

iii. xs login --> logs in to the XS environment.

iv.  xs apps --> lists all XS apps (Web IDE, Cockpit, etc.) with their Fiori URLs.

v.   HDB info --> shows the status of the HANA DB.

Once the HANA DB is started, we need to connect to the system database (SYSTEMDB) as user SYSTEM to check whether SAP EML is installed. Run the SQL script below in the SQL console.
--------------------------------
-- run against systemdb database
--------------------------------

-- check EML AFL component is installed
SELECT * FROM "SYS"."AFL_AREAS" WHERE AREA_NAME = 'EML';
SELECT * FROM "SYS"."AFL_PACKAGES" WHERE AREA_NAME = 'EML';
SELECT * FROM "SYS"."AFL_FUNCTIONS" WHERE AREA_NAME = 'EML';

-- check tenant database exists and is started
SELECT * FROM SYS.M_DATABASES;

Check that the tenant database is up and running, as we will be using the tenant DB going forward.


 

Now log in to the tenant DB as user SYSTEM to add the script server and to create a new user MYEML, which we will use to work with the SAP HANA tables. Run the SQL below in the console.
------------------------------
-- run against tenant database
------------------------------

-- check script server
SELECT * FROM SYS.M_SERVICES;
-- add script server to tenant database --scriptserver is required for EML
ALTER DATABASE HXE ADD 'scriptserver';
-- create user
CREATE USER MYEML PASSWORD Testpassword1;

-- authorize EML administration
GRANT MONITORING TO MYEML;
GRANT CREATE REMOTE SOURCE TO MYEML;
GRANT SELECT, UPDATE, INSERT, DELETE ON _SYS_AFL.EML_MODEL_CONFIGURATION TO MYEML;

-- authorize creation & removal of EML procedures
GRANT AFLPM_CREATOR_ERASER_EXECUTE TO MYEML;

-- authorize execution of EML procedures
GRANT AFL__SYS_AFL_EML_EXECUTE TO MYEML;

Log in to the SAP HANA tenant DB with the new user MYEML and run the SQL below to create a table that can hold images in bytes format.
-------Table for Image Data----------
CREATE TABLE APPAREL_IMAGES (
ID INTEGER,
IMAGE BLOB
);

 

2. Development of a machine learning neural network model in Kaggle that can classify any image into 24 apparel categories


I developed a deep neural network model that can classify any image into one of the 24 apparel categories below.


I used Kaggle since it provides free GPUs for training the model. Below is my Kaggle Jupyter notebook with the neural network model development:

https://www.kaggle.com/nandikishore/image-classification-model-with-image-byte-input

Dataset used for training the model: https://www.kaggle.com/trolukovich/apparel-images-dataset
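The full architecture and training code are in the notebook linked above; purely as an illustration of the kind of model involved (the layer sizes here are placeholders, not the ones actually used), a Keras CNN for this task takes a 200x200 RGB image and ends in a 24-way softmax.
import tensorflow as tf

# Illustrative sketch only -- the real architecture and hyperparameters are in the Kaggle notebook
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(200, 200, 3)),        # 200x200 RGB input, matching the serving signature
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(24, activation="softmax"),    # one probability per apparel category
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])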

The key part here: a TensorFlow Keras image-processing model normally expects an input image tensor of rank 3, but SAP HANA supports tensors only up to rank 2. I therefore changed the model signature to accept a string of image bytes, which works with SAP HANA since the table "APPAREL_IMAGES" we created earlier holds the image data as bytes.

Below is the Python code to change the model's signature.
import tensorflow as tf

class ConvertModel(tf.keras.Model):
    """Wraps the trained model behind a signature that accepts raw image bytes."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec([None], dtype=tf.string)])
    def img_serve(self, images):
        # Decode each JPEG byte string into the 200x200x3 float tensor the model expects
        def input_to_actual_shape(img):
            img = tf.io.decode_jpeg(img, channels=3)
            img = tf.image.convert_image_dtype(img, tf.float32)
            img = tf.image.resize_with_pad(img, 200, 200)
            return img

        img = tf.map_fn(input_to_actual_shape, images, dtype=tf.float32)
        op = self.model(img)
        return {"OUTPUT": op}

All of the above Python code can also be found in my Kaggle Jupyter notebook, where it is explained with a test image. Download the model into a folder, which we will upload to the TensorFlow Serving machine for deployment; make sure the model is downloaded with the folder structure below.
CustomMulticlass/1/ containing the saved_model.pb file plus the assets and variables folders.
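For reference, here is a minimal sketch of how the converted model can be exported with this signature (I am assuming trained_model is the Keras model trained in the notebook; the exact export call in the notebook may differ):
import tensorflow as tf

# Wrap the trained Keras model and export it so TensorFlow Serving exposes img_serve as the default signature
converted = ConvertModel(trained_model)
tf.saved_model.save(
    converted,
    "CustomMulticlass/1",
    signatures={"serving_default": converted.img_serve},
)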

3. Installation of a TensorFlow Model Serving machine and deployment of the developed model:


I installed an Ubuntu virtual machine on Google Cloud Platform to serve the TensorFlow model created above, and opened port 7000 for REST API calls and port 7500 for gRPC calls from HANA. The ports can be opened by setting firewall rules in GCP.


Firewall Rules


SSH into the Ubuntu machine and upload the model created above to the VM.


Run the command below to deploy the model.
tensorflow_model_server --model_name=Apparel \
  --model_base_path="<absolute path of the uploaded model folder>" \
  --port=7500 --rest_api_port=7000

The model is now ready for gRPC calls from HANA to fetch prediction output. The model name is given as "Apparel", which will be used in SAP HANA when calling this model.
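As an optional sanity check (the hostname below is a placeholder for the VM's external IP address), the TensorFlow Serving REST endpoint on port 7000 should report the model version as AVAILABLE:
import requests

# Assumption: replace <vm-external-ip> with the Ubuntu VM's address
resp = requests.get("http://<vm-external-ip>:7000/v1/models/Apparel")
print(resp.json())  # expect a model_version_status entry with state "AVAILABLE"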

 

4. Python-HANA connector for test data upload into the HANA table:


In order to call the TensorFlow model from HANA, let us upload some image data in bytes into the HANA table "APPAREL_IMAGES". I uploaded two images into the HANA table using the Python library hdbcli.

I used an IBM Watson Studio Jupyter notebook for this purpose: I first uploaded the images into my IBM Cloud Object Storage bucket, read them in the notebook, converted them into bytes, and inserted them into the HANA table.

Below is the link for my Notebook which contains the code:

Python-HANA DataUpload Notebook Code
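The core of that upload, as a minimal sketch with hdbcli (the connection details and file names below are placeholders; the complete version is in the notebook):
from hdbcli import dbapi

# Placeholder connection details -- use your tenant DB host/port and the MYEML user
# (39015 is the default tenant SQL port on SAP HANA Express; adjust for your system)
conn = dbapi.connect(address="<hana-host>", port=39015, user="MYEML", password="<password>")
cursor = conn.cursor()

# Read each image file and insert its raw bytes into the BLOB column
for idx, path in enumerate(["blue_shirt.jpg", "black_shoes.jpg"], start=1):
    with open(path, "rb") as f:
        img_bytes = f.read()
    cursor.execute("INSERT INTO APPAREL_IMAGES (ID, IMAGE) VALUES (?, ?)", (idx, img_bytes))

conn.commit()
cursor.close()
conn.close()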

I uploaded just two images, a blue shirt and a black shoe, into the HANA table.


HANA TABLE


 

5. SQL script in HANA to call the TensorFlow model:


Using Web IDE, connect to SAP HANA as user MYEML and run the SQL code below to test the images in the HANA table.

First, check whether SAP HANA can connect to the TensorFlow Serving machine.
-- clean up
DROP REMOTE SOURCE "TensorFlowServing";
DROP TABLE "PARAMETERS";
DELETE FROM "_SYS_AFL"."EML_MODEL_CONFIGURATION" WHERE "Parameter"='RemoteSource' and "Value"='TensorFlowServing';


CREATE REMOTE SOURCE "TensorFlowServing" ADAPTER "grpc" CONFIGURATION 'server=<IP address of the TensorFlow Serving machine>;port=7500';

-- register model with Name "Apparel"
INSERT INTO "_SYS_AFL"."EML_MODEL_CONFIGURATION" VALUES ('Apparel', 'RemoteSource', 'TensorFlowServing');

SELECT * FROM "_SYS_AFL"."EML_MODEL_CONFIGURATION";

-- create parameters table (used in subsequent calls)
CREATE TABLE "PARAMETERS" ("Parameter" VARCHAR(100), "Value" VARCHAR(100));

-- apply registered models
CALL "_SYS_AFL"."EML_CTL_PROC" ('UpdateModelConfiguration', "PARAMETERS", ?);

-- verify model is up and running on remote source
INSERT INTO "PARAMETERS" VALUES ('Model', 'Apparel');
CALL "_SYS_AFL"."EML_CHECKDESTINATION_PROC" ("PARAMETERS", ?);

If the connection to the TensorFlow Serving machine is successful, we get the message below.


Now run the SQL script below to get predictions classifying each image into one of the 24 apparel categories.
-- clean up
DROP TYPE "T_PARAMS";
DROP TYPE "T_DATA";
DROP TYPE "T_RESULTS";
DROP TABLE "SIGNATURE";
DROP TABLE "PARAMS";
DROP TABLE "RESULTS";
DROP VIEW "V_DATA";
CALL "SYS"."AFLLANG_WRAPPER_PROCEDURE_DROP" ('MYEML', 'APPAREL');

CREATE TYPE "T_PARAMS" AS TABLE ("Parameter" VARCHAR(100), "Value" VARCHAR(100));
CREATE TYPE "T_DATA" AS TABLE ("images" BLOB);
CREATE TYPE "T_RESULTS" AS TABLE ("black_dress" FLOAT,"black_pants" FLOAT,
"black_shirt" FLOAT,"black_shoes" FLOAT,"black_shorts" FLOAT,"blue_dress" FLOAT,
"blue_pants" FLOAT,"blue_shirt" FLOAT,"blue_shoes" FLOAT,"blue_shorts" FLOAT,
"brown_pants" FLOAT,"brown_shoes" FLOAT,"brown_shorts" FLOAT,"green_pants" FLOAT,
"green_shirt" FLOAT,"green_shoes" FLOAT, "green_shorts" FLOAT,"red_dress" FLOAT,
"red_pants" FLOAT,"red_shoes" FLOAT,"white_dress" FLOAT,"white_pants" FLOAT, "white_shoes" FLOAT,"white_shorts" FLOAT);

CREATE TYPE "T_FINAL" AS TABLE ("Image.Sno" integer,
outcome VARCHAR(20),accu FLOAT);
CREATE COLUMN TABLE "SIGNATURE" ("POSITION" INTEGER,
"SCHEMA_NAME" NVARCHAR(256), "TYPE_NAME" NVARCHAR(256),
"PARAMETER_TYPE" VARCHAR(7));

INSERT INTO "SIGNATURE" VALUES (1, 'MYEML', 'T_PARAMS', 'IN');
INSERT INTO "SIGNATURE" VALUES (2, 'MYEML', 'T_DATA', 'IN');
INSERT INTO "SIGNATURE" VALUES (3, 'MYEML', 'T_RESULTS', 'OUT');
CALL "SYS"."AFLLANG_WRAPPER_PROCEDURE_CREATE" ('EML', 'PREDICT',
'MYEML', 'APPAREL', "SIGNATURE");

-- create tables
CREATE TABLE "PARAMS" LIKE "T_PARAMS";
CREATE TABLE "RESULTS" LIKE "T_RESULTS";
CREATE TABLE "DATA" LIKE "T_DATA";
CREATE TABLE "OUTPUT" LIKE "T_FINAL";

DROP VIEW "V_IMAGE_DATA";
CREATE VIEW "V_IMAGE_DATA" AS
SELECT "IMAGE" AS "images"
FROM "APPAREL_IMAGES" ;

TRUNCATE TABLE "PARAMS";
INSERT INTO "PARAMS" VALUES ('Model', 'Apparel');

---Call the Tensorflow serving with the wrapper procedure APPAREL
CALL "APPAREL" ("PARAMS", "V_IMAGE_DATA", "RESULTS") WITH OVERVIEW;

SELECT * FROM "RESULTS";

We get the predictions as a confidence percentage for each classification, as below.


 

It can be hard to interpret the output by looking at percentages across 24 columns; we need to pick the column with the highest confidence, and the image then falls into that category. So I wrote a SQL script block that produces the final output for the image predictions.
DO
BEGIN
DECLARE V_I INTEGER;
DECLARE V_OP VARCHAR(20);
DECLARE V_ACCU FLOAT;

LT_GREATEST = SELECT GREATEST("black_dress","black_pants","black_shirt","black_shoes","black_shorts","blue_dress","blue_pants","blue_shirt",
"blue_shoes","blue_shorts","brown_pants","brown_shoes","brown_shorts","green_pants","green_shirt","green_shoes",
"green_shorts","red_dress","red_pants","red_shoes","white_dress","white_pants","white_shoes","white_shorts") AS GREATEST
FROM "RESULTS";
LT_RESULTS = SELECT * FROM "RESULTS";

FOR V_I IN 1..RECORD_COUNT(:LT_RESULTS) DO
IF :LT_RESULTS."black_dress"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'black_dress';
V_ACCU = :LT_RESULTS."black_dress"[V_I];
ELSEIF :LT_RESULTS."black_pants"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'black_pants';
V_ACCU = :LT_RESULTS."black_pants"[V_I];
ELSEIF :LT_RESULTS."black_shirt"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'black_shirt';
V_ACCU = :LT_RESULTS."black_shirt"[V_I];
ELSEIF :LT_RESULTS."black_shoes"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'black_shoes';
V_ACCU = :LT_RESULTS."black_shoes"[V_I];
ELSEIF :LT_RESULTS."black_shorts"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'black_shorts';
V_ACCU = :LT_RESULTS."black_shorts"[V_I];
ELSEIF :LT_RESULTS."blue_dress"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'blue_dress';
V_ACCU = :LT_RESULTS."blue_dress"[V_I];
ELSEIF :LT_RESULTS."blue_pants"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'blue_pants';
V_ACCU = :LT_RESULTS."blue_pants"[V_I];
ELSEIF :LT_RESULTS."blue_shirt"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'blue_shirt';
V_ACCU = :LT_RESULTS."blue_shirt"[V_I];
ELSEIF :LT_RESULTS."blue_shoes"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'blue_shoes';
V_ACCU = :LT_RESULTS."blue_shoes"[V_I];
ELSEIF :LT_RESULTS."blue_shorts"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'blue_shorts';
V_ACCU = :LT_RESULTS."blue_shorts"[V_I];
ELSEIF :LT_RESULTS."brown_pants"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'brown_pants';
V_ACCU = :LT_RESULTS."brown_pants"[V_I];
ELSEIF :LT_RESULTS."brown_shoes"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'brown_shoes';
V_ACCU = :LT_RESULTS."brown_shoes"[V_I];
ELSEIF :LT_RESULTS."brown_shorts"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'brown_shorts';
V_ACCU = :LT_RESULTS."brown_shorts"[V_I];
ELSEIF :LT_RESULTS."green_pants"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'green_pants';
V_ACCU = :LT_RESULTS."green_pants"[V_I];
ELSEIF :LT_RESULTS."green_shirt"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'green_shirt';
V_ACCU = :LT_RESULTS."green_shirt"[V_I];
ELSEIF :LT_RESULTS."green_shoes"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'green_shoes';
V_ACCU = :LT_RESULTS."green_shoes"[V_I];
ELSEIF :LT_RESULTS."green_shorts"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'green_shorts';
V_ACCU = :LT_RESULTS."green_shorts"[V_I];
ELSEIF :LT_RESULTS."red_dress"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'red_dress';
V_ACCU = :LT_RESULTS."red_dress"[V_I];
ELSEIF :LT_RESULTS."red_pants"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'red_pants';
V_ACCU = :LT_RESULTS."red_pants"[V_I];
ELSEIF :LT_RESULTS."red_shoes"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'red_shoes';
V_ACCU = :LT_RESULTS."red_shoes"[V_I];
ELSEIF :LT_RESULTS."white_dress"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'white_dress';
V_ACCU = :LT_RESULTS."white_dress"[V_I];
ELSEIF :LT_RESULTS."white_pants"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'white_pants';
V_ACCU = :LT_RESULTS."white_pants"[V_I];
ELSEIF :LT_RESULTS."white_shoes"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'white_shoes';
V_ACCU = :LT_RESULTS."white_shoes"[V_I];
ELSEIF :LT_RESULTS."white_shorts"[V_I] = :LT_GREATEST.GREATEST[V_I] THEN
V_OP = 'white_shorts';
V_ACCU = :LT_RESULTS."white_shorts"[V_I];
END IF;

INSERT INTO "OUTPUT" VALUES (V_I,V_OP,V_ACCU);
END FOR;
END;

SELECT * FROM "OUTPUT";

 

Now if we check the output, the model has correctly classified the images into their respective categories, as below.


For the blue shirt, the model's prediction confidence is 86%, and for the black shoe it is 99%.

 

SAP EML is a very good feature with which we can easily integrate a machine learning model and call it from inside SAP HANA itself.

 

References:

https://help.sap.com/viewer/42668af650f84f9384a3337bcd373692/2.0.05/en-US/b2a37c7ecec2416bbf1889b2f2...

https://github.com/saphanaacademy/EML

 