Fumito Hiratsuka

How to build an iOS app that displays SAP S/4HANA data as an AR 3D object (2/2)

This is the continuation of my previous post. There are three steps in total, and the first two are covered there.

I built an iOS app that displays SAP S/4HANA data as an AR 3D object, and I'd like to share the detailed steps I took.

(Demo animation: ar.gif)

Architecture

Procedure

1. Implement GetEntitySet method of OData on SAP S/4HANA
2. Generate an Xcode Project with SCP SDK for iOS Assistant
3. Implement AR in the Xcode project generated in Step 2

This post covers Step 3. Please refer to the previous article for Steps 1 and 2.

3. Implement AR with Xcode

In the previous article, an Xcode project connected to SAP S/4HANA was generated automatically by the SCP SDK for iOS Assistant. This time, I will implement the AR functionality by updating that generated project.

Add a Storyboard

First, create a folder named “App” under the main directory of the project. All newly implemented resources will go in this folder.
Next, add a storyboard: right-click the App folder, select [New File], and add a storyboard named “MyStoryboard”.

Click the [Library] button in the upper right and drag a view controller onto the storyboard.

From the [Library] button, drag an [ARKit SceneKit View] onto the view controller in the storyboard.

Resize the placed Scene View to fill the view controller. Next, add constraints from the button at the bottom right so that the view does not shift depending on the device's screen size. Set the distance to each edge to 0 and click [Add 4 Constraints]. (The same constraints can also be set in code, as sketched below.)
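If you prefer to pin the scene view in code rather than in Interface Builder, the equivalent four zero-distance constraints look like this (a sketch, e.g. in viewDidLoad, assuming the ARSCNView is exposed as the outlet named sceneView that is connected later in this post):

    sceneView.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        sceneView.topAnchor.constraint(equalTo: view.topAnchor),
        sceneView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
        sceneView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
        sceneView.trailingAnchor.constraint(equalTo: view.trailingAnchor)
    ])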


Add a Scene file

Right-click on the App folder, select [SpriteKit Scene] from [New File] and add it. In this case, I will name it “Scene.sks”.

Add Swift files

Right-click on the App folder > select [New File] > [Cocoa Touch Class] to add it.

Enter the name “Scene” and select the class “SKScene”.
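No custom SpriteKit logic is required for this app, so Scene.swift can stay minimal, essentially the template output:

Scene.swift
    import SpriteKit

    class Scene: SKScene {
        //No custom behavior is needed here; the class backs the Scene.sks file
    }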

Next, add a Swift file for the view controller. Right-click on the App folder > select [New File] > [Cocoa Touch Class] to add it.

Enter the name “MyViewController” and select the class “UIViewController”.

In the storyboard, set the view controller's class to “MyViewController” to link them.

Prepare images

Using ARKit's image recognition (Recognizing Images), I will implement an action that places a 3D object in AR space when the camera detects a known image.

First, prepare the image files. Place the images to be recognized in the Assets folder: select [New AR Resource Group] from the [+] button to create a resource group.
I use the material codes from S/4HANA as the image names so the recognized image can be linked to the data. Also remember to set the physical size of the object to be recognized, since ARKit uses it during detection.

Update plist

In order to use the camera, set “Privacy - Camera Usage Description” (the raw key NSCameraUsageDescription) in the Info.plist. Its value is displayed in the permission prompt the first time the app accesses the camera. Without this entry, the app will crash.
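ARKit triggers the permission prompt automatically, but the current authorization state can also be inspected up front if needed; a small optional sketch (not required for this app):

    import AVFoundation

    //Optional: check the camera authorization state before starting the AR session
    let status = AVCaptureDevice.authorizationStatus(for: .video)
    if status == .denied || status == .restricted {
        //Camera access was refused earlier; guide the user to the Settings app
    }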

Change the initial Storyboard

In the default code, “Main.storyboard” is specified as the initial display, so change it to the storyboard created in this post.

Enter a Storyboard ID (“App”) in the Identity inspector of the created view controller.
Next, modify AppDelegate.swift. Comment out the default code as below and instantiate the storyboard created above, using the Storyboard ID “App”.

AppDelegate.swift
    private func setRootViewController() {
        DispatchQueue.main.async {

            //Default code → commented out
            //let splitViewController = UIStoryboard(name: "Main", bundle: Bundle.main).instantiateViewController(withIdentifier: "MainSplitViewController") as! UISplitViewController
            //splitViewController.delegate = self
            //splitViewController.modalPresentationStyle = .currentContext
            //splitViewController.preferredDisplayMode = .allVisible

            //Specify the storyboard created this time
            let viewController = UIStoryboard(name: "MyStoryboard", bundle: Bundle.main).instantiateViewController(withIdentifier: "App")
            self.window!.rootViewController = viewController

        }
    }

Implement the session start method

Now I will implement the main view controller (MyViewController.swift).
First, connect the Scene View from the storyboard as an outlet.

Import the frameworks used in this implementation.


MyViewController.swift
import ARKit
import SAPFiori
import SAPOData
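
With the outlet connected, the class declaration looks like the following. Note that MyViewController must also adopt ARSCNViewDelegate so that the renderer callback implemented below gets called:

MyViewController.swift
    class MyViewController: UIViewController, ARSCNViewDelegate {

        //Outlet connected from the storyboard
        @IBOutlet weak var sceneView: ARSCNView!

        //The methods from the following sections go here

    }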

Next, implement the startSession method.

MyViewController.swift
    //Method: start the AR session
    func startSession(){

        //Show a toast message
        FUIToastMessage.show(
            message: "Please scan a product",
            icon: FUIIconLibrary.map.legend.zoomExtent.withRenderingMode(.alwaysTemplate),
            inView: sceneView,
            withDuration: 3.0,
            maxNumberOfLines: 1)

        //Configure the sceneView
        sceneView.delegate = self
        sceneView.showsStatistics = false
        let scene = SCNScene()
        sceneView.scene = scene

        //Load the reference images for image detection
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages

        //Start the sceneView session
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

    }
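
If the view controller can ever disappear (for example, when more screens are added later), it is good practice to pause the session so the camera is released; a minimal sketch:

MyViewController.swift
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        //Pause the AR session while the view is not visible
        sceneView.session.pause()
    }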

Next, implement the data acquisition method.

MyViewController.swift

    //Variable holding the fetched stock information
    var stockInfos = [StockInfo]()

    //Method: fetch the stock information
    func fetchStockInfoSet(_ completionHandler: @escaping () -> Void) {

        //The AppDelegate holds the generated OData service proxy
        let appDelegate = UIApplication.shared.delegate as! AppDelegate

        //Build the query
        let query = DataQuery().selectAll()

        //Execute the query
        appDelegate.ytest001SRVEntities!.fetchStockInfoSet(matching: query) { stockInfoSet, error in
            if error == nil {
                print("Successfully fetched the stock information!")
                self.stockInfos = stockInfoSet!
            } else {
                print("Failed to fetch the stock information!")
            }
            completionHandler()
        }
    }
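
DataQuery from SAPOData can also narrow the result set instead of selecting everything. For example, only the two properties used later could be requested (a sketch, assuming the generated proxy exposes the static properties StockInfo.matnr and StockInfo.quan):

    //Hypothetical refinement: fetch only the material number and the stock quantity
    let query = DataQuery().select(StockInfo.matnr, StockInfo.quan)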

In the initial processing (viewDidLoad), fetch the stock information and then start the session. Because startSession is called from the completion handler, stockInfos is already populated by the time the first image is recognized.

MyViewController.swift
    override func viewDidLoad() {
        super.viewDidLoad()
        //Fetch the stock information
        self.fetchStockInfoSet(){
            //Start the AR session
            self.startSession()
        }
    }

Implement the image recognition process

As preparation, provide red and blue images for the 3D objects. These are the textures displayed on the surface of the 3D sphere.

Implement the image recognition logic in the ARSCNView delegate method.

MyViewController.swift
    //Method: called when an image is recognized
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        let node = SCNNode()
        if let imageAnchor = anchor as? ARImageAnchor {
            //Create the node for the recognized image
            let markNode = getMoonNode(imageName: imageAnchor.referenceImage.name!)
            node.name = imageAnchor.referenceImage.name
            node.addChildNode(markNode)
        }
        return node
    }

    //Method: create the node
    func getMoonNode(imageName: String) -> SCNNode {
        //Create the 3D object (sphere)
        let sphere = SCNSphere(radius: 0.02)
        let material = SCNMaterial()
        //Look up the stock quantity by material code (= the image name)
        let intStock = stockInfos.filter { $0.matnr == imageName }[0].quan
        if (intStock?.intValue())! < 5 {
            //If the store stock is fewer than 5, create a red sphere
            material.diffuse.contents = UIImage(named: "moon_red.jpg")
        } else {
            //If the store stock is 5 or more, create a blue sphere
            material.diffuse.contents = UIImage(named: "moon_blue.jpg")
        }
        sphere.materials = [material]
        let node = SCNNode()
        //Place the sphere slightly in front of the detected image
        node.position = SCNVector3(0.0, 0.0, 0.03)
        node.geometry = sphere
        return node
    }
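
The sphere could also be annotated with the actual quantity. As a possible extension (not part of the app above), a small 3D text node built with SceneKit's SCNText could be attached next to the sphere:

MyViewController.swift
    //Hypothetical extension: a 3D text label showing the stock quantity
    func getQuantityLabelNode(quantity: Int) -> SCNNode {
        let text = SCNText(string: "\(quantity)", extrusionDepth: 0.5)
        text.font = UIFont.systemFont(ofSize: 8)
        let node = SCNNode(geometry: text)
        //SCNText geometry is sized in scene units, so scale it down to centimeters
        node.scale = SCNVector3(0.001, 0.001, 0.001)
        node.position = SCNVector3(0.03, 0.0, 0.03)
        return node
    }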

Check

Click the Run button in the upper left to build and run the project. Note that ARKit requires a physical device rather than the simulator.

I can confirm that the AR object appears when an image is recognized, as shown below (using my iPhone).
(Demo animation: daffunda2.GIF)

Comparing the result with the stock information in S/4HANA, I can confirm that blue spheres are displayed for items with a stock of five or more, and red spheres for items with fewer than five.
(Screenshot: stock information in S/4HANA)

Summary

I created a simple application that displays S/4HANA information in AR using the SCP SDK for iOS. The SDK makes it easy to adopt the latest iOS UI/UX technology. Mobilization and the accompanying UI/UX transformation are very important factors in promoting digital transformation. It would be interesting to think about what kinds of business change AR can be applied to.

2 Comments

Nabheet Madan

Wow very nice Fumito Hiratsuka, I was thinking of doing something with the AR Kit for some time.. I must say great Stuff.

Ignacio Soriano Hernandez

Fantastic content Fumito Hiratsuka!

This is an awesome sample of how digital transformation can be used in the industry. Please keep up the good work.

Thanks from Germany

Ignacio