CRM and CX Blogs by SAP
Stay up-to-date on the latest developments and product news about intelligent customer experience and CRM technologies through blog posts from SAP experts.
joris_quenee
Product and Topic Expert

Introduction

IT is an endless succession of reshaping.
Our parents knew the mainframe: one big machine that centralised all the computing power and served many dumb terminals. Then the personal computer replaced it by decentralizing power and data.

Our generation was born with the internet, data servers and now the cloud, which looks a lot like what our parents knew under the name mainframe...

From this point, IT has been trying to move back to a decentralized and robust approach through new technologies: blockchain, microservices, and headless architectures.

However, in parallel, companies are proposing to re-centralize business into serverless, no-code, SaaS and narrow AI.

And now AGI is coming (but is not yet available): another potentially decentralized solution where each person can unleash their own capabilities. Today, though, this new technology faces two barriers: the capacity to learn from only a few samples, and power efficiency.

 

Big misunderstanding

AI is not smart (the current AI of 2024 as we know it), but it is still impressive. A better term would be bluffing.

When Neil Alden Armstrong went to the moon in 1969, his calculator was a simple slide rule.
Compared to what we can hold in our hands today, he would have been amazed.
But in the end, it is still just a calculator, like the slide rule.

The point is that current AI still needs a human to be useful. A calculator will never solve even a simple elementary mathematical problem by itself.

Here lies the misunderstanding: an LLM (large language model) is bluffing, and it has been designed for that. It is a mumbo-jumbo generator. It calculates the most probable next word from a given text. In other words, it is a (very complex) answer-standardizing machine: no deep analysis, no initiative, no internal goal of its own, no self-evaluation, no internal deep model reshaping, no judgment of the user's skill, and no short-term memory based on abstract understanding. It is probably the most advanced interactive Wikipedia machine. A poorly asked question or input will lead to misleading or inaccurate output.
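To make the "most probable next word" point concrete, here is a minimal, purely illustrative Python sketch; the tiny vocabulary and the scores are invented for the example and do not come from any real model.

```python
import math

# Toy illustration of the core mechanism: given a context, score every
# candidate next token and keep the most probable one. The vocabulary and
# scores below are invented for the example; a real model derives them from
# billions of learned parameters.
context = "the cat sat on the"
candidate_scores = {"mat": 4.2, "roof": 2.9, "moon": 0.7, "idea": -1.3}

def softmax(scores):
    # Turn raw scores into a probability distribution.
    exps = {token: math.exp(s) for token, s in scores.items()}
    total = sum(exps.values())
    return {token: value / total for token, value in exps.items()}

probabilities = softmax(candidate_scores)
next_token = max(probabilities, key=probabilities.get)
print(f"'{context}' -> '{next_token}' (p = {probabilities[next_token]:.2f})")
# No goal, no judgment, no memory: just the statistically likeliest continuation.
```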

Another thing: you can read on the internet that prompt engineering requires no skill, degree or qualification. In software development it is actually the complete opposite. When generative AI creates code on demand, it takes strong, senior software engineering skills to review, correct and fine-tune the output. Generative AI outputs can sometimes be incorrect (hallucinations), which is a challenge, and a trap, for novice or junior users.
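As a purely hypothetical illustration of why review skills matter, the sketch below shows the kind of plausible-looking but subtly wrong code a generator could produce, next to a corrected version; the snippet is invented for this post, not taken from any real model output.

```python
# Hypothetical illustration (not real model output): an AI-suggested helper
# that looks plausible but silently drops the last chunk when the list length
# is not a multiple of the chunk size - exactly the kind of slip a senior
# reviewer is expected to catch.
def chunk_suggested(items, size):
    return [items[i:i + size] for i in range(0, len(items) - size, size)]

# Reviewed and corrected version.
def chunk_reviewed(items, size):
    return [items[i:i + size] for i in range(0, len(items), size)]

print(chunk_suggested([1, 2, 3, 4, 5], 2))  # [[1, 2], [3, 4]]  <- the 5 is lost
print(chunk_reviewed([1, 2, 3, 4, 5], 2))   # [[1, 2], [3, 4], [5]]
```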

 

Use Case

Let's take an example to understand AI's limits more deeply. Imagine you run a web shop and the search results on your product catalog are not great. Of course, you don't know why, and you just want to add intelligence to the engine.

A naive requirement would be: when the user is satisfied with a search result, the AI should improve the search query by boosting/promoting the most relevant product.

Great, but how do we determine what a relevant/satisfying result is? The first product selected after a search? The product the user spent longer on than the others? An ordered product? And how do we link an ordered product to a search query in a complex user journey?

And how do we boost/promote the most relevant product: by adding keywords? Categories? And what will the potential side effects be on the global search?
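To make these questions tangible, here is a deliberately naive Python sketch; the signal weights, the dwell-time threshold and the keyword-boosting strategy are hypothetical choices that a human would still have to define and validate.

```python
from dataclasses import dataclass

# A deliberately simple sketch of the questions above: which behavioural
# signals count as "satisfaction", and how a product might be boosted.
# The weights, the threshold and the boosting mechanism are hypothetical
# choices that a human still has to make - no AI decides them for you.

@dataclass
class Interaction:
    clicked_first: bool    # product was the first result clicked after the search
    dwell_seconds: float   # time spent on the product page
    ordered: bool          # product was eventually ordered

def relevance_score(event: Interaction) -> float:
    # Arbitrary example weights: an order counts most, a long visit counts
    # more than a quick first click.
    score = 0.0
    if event.clicked_first:
        score += 1.0
    if event.dwell_seconds > 30:
        score += 2.0
    if event.ordered:
        score += 5.0
    return score

def boost_terms(query: str, product_keywords: list[str], score: float) -> dict:
    # One possible strategy (with side effects on the global search): attach
    # the product's keywords to the query profile, weighted by the score.
    return {"query": query, "boosted_keywords": {kw: score for kw in product_keywords}}

event = Interaction(clicked_first=True, dwell_seconds=45, ordered=True)
print(boost_terms("running shoes", ["trail", "lightweight"], relevance_score(event)))
```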

For now, AI is an additional technology that can help to classify, standardize, optimize and interact. AI is a good tool for NP-complete problems. But determining the criteria for solving a problem still needs to be done by a human. You need to find the best question to ask to solve your issue.

 

Higher complexity increases energy consumption

Our world and our economy are constrained by energy availability. Adding complexity to a system consequently requires additional energy consumption.

AI is an overlay built on top of an existing solution. It is therefore necessary to fully understand the issues and the goal before constructing such a system.

Keep in mind that over-automation can ultimately be far less productive and efficient than simplifying the business process itself.

To give you an idea of current AI power consumption, one request (inference) is roughly equivalent to 300 Wh, whereas an internet search is roughly equivalent to 0.3 Wh. For comparison, the human brain consumes the electrical equivalent of around 20 Wh over 10 hours. AI usage can therefore be 1000x more expensive than a deterministic, computable process.
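A quick back-of-the-envelope check of that ratio, using the same rough figures quoted above:

```python
# Back-of-the-envelope check of the ratio quoted above, using the same rough
# figures (orders of magnitude, not measurements).
ai_inference_wh = 300.0   # one generative-AI request
web_search_wh = 0.3       # one classic internet search

ratio = ai_inference_wh / web_search_wh
print(f"One AI request ~ {ratio:.0f}x the energy of one web search")  # ~1000x
```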

 

Prerequisites

Before applying any AI technology, you should have a clear view of your business process. Think first about how it can be solved with classic technology / deterministic programming (far more computable than AI). Then, if a part of your problem turns out to be NP-complete, look for the AI technology that fits your needs: regression, classification, machine learning, deep learning, LLMs, and so on.
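As a minimal sketch of this "deterministic first, AI second" approach (the rule table and the fallback classifier are placeholders invented for the example):

```python
# Illustrative "deterministic first, AI second" pattern. The rule table and
# the fallback classifier are placeholders; the point is that cheap,
# predictable logic handles the bulk of the cases and only the genuinely
# ambiguous remainder is delegated to a (costlier) model.
EXACT_RULES = {
    "sku-123": "shoes",
    "sku-456": "jackets",
}

def classify_with_model(description: str) -> str:
    # Placeholder for an ML/LLM call; only reached when the rules cannot decide.
    return "uncategorized"

def categorize(sku: str, description: str) -> str:
    if sku in EXACT_RULES:                    # deterministic, essentially free
        return EXACT_RULES[sku]
    return classify_with_model(description)  # expensive path, used sparingly

print(categorize("sku-123", "trail running shoe"))  # resolved by a rule
print(categorize("sku-999", "waterproof shell"))    # falls back to the model
```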

If you want to explore the different types of AI and their usage in detail, you can follow the openSAP training on Generative AI at SAP.

 

Run AI with SAP

As you now understand, AI must be applied to mature, standardized solutions where the problems to solve are clearly identified. Fortunately, SAP offers several ready-to-use solutions with embedded AI technology.

You can also build your own AI model on SAP BTP:

ai-foundation-on-sap-btp-q4-2023-release-highlights

 

Conclusion

AI is a very complex topic, and it requires deep technology and business understanding. To be efficient, it should be applied in the right context to solve a specific use case.

SAP Expert Services can advise you on when and how AI technology should be integrated in your company.