Connecting the dots – Using SAP Business Technology Platform to make GPT a better product
Great technology does not always result in a great product
I know this blog series addresses one of the most hyped and polemic topics of our time (if not the most). But before we enter into any philosophical discussion about whether our jobs are at risk, or any GDPR concerns, let me be clear that this is not the focus. However, some context is needed to better understand the deep dive we are about to do.
Here in Brazil there was a revolutionary carmaker called Gurgel. In 1975, when Elon Musk was 4 years old, still playing in his backyard, and interest in electric cars was rising due to the oil crisis, this carmaker created one of the first electric vehicle models. The Gurgel Itaipu had a range of 80 km on a single battery charge. A pretty impressive feat considering it happened almost half a century ago. This was cutting-edge technology at the time. However, this great technology did not meet customers' needs. The range was insufficient for daily use, the battery took a long time to charge (9 hours, to be precise) and, as you may see below, let's just say it did not look as nice as a Ferrari. It was a bad product.
You might be asking yourself: OK, but what does this have to do with GPT and LLM technologies? The similarity is that, as of today, we are being presented with a great technology that has the capability of changing the way we work and society itself. However, we need to find ways to integrate this technology with other existing technologies and, even more importantly, put it to work executing and automating tasks for us. Otherwise, GPT is only an overpriced deluxe chatbot and… a bad product. Before you argue with me, I'm just agreeing with a statement that Sam Altman, OpenAI's CEO, made during an interview.
Using SAP Business Technology Platform to enhance GPT and OpenAI with custom context
Before starting the development for this scenario, I was planning to write a single blog post. But a single post would be way too long, as this scenario is quite complex. Therefore, I broke it down into the posts below:
- Using SAP Data Intelligence to generate embeddings with custom data in OpenAI
- Using SAP Data Intelligence and python to embed custom data in OpenAI prompts
- Using API Management to expose OpenAI's completion APIs for applications (Coming soon…)
You can find the overall architecture for this scenario below:
Another important disclaimer: this blog series is just an example to provide insights into the possibility of embedding existing business context into LLM technologies. It uses the SFLIGHT database as the data source, but you will see that, with a little bit of prompt and embedding engineering, it is adaptable to other data sources as well, while maintaining data governance with the features SAP Data Intelligence provides us. However, data governance is not the focus of this blog series.
The goal of this blog series is to give you insights into how to use business contextual data (provided and governed by the IT department, a capability enabled by SAP Data Intelligence) to provide users with valuable information that will help them in their daily tasks.
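To make the idea of "embedding business context into prompts" a bit more concrete before the deep dive, here is a minimal sketch of the retrieval pattern this series builds on. The vectors and flight records below are toy stand-ins (in the real scenario the embeddings would come from OpenAI's embeddings API, fed with SFLIGHT data governed by SAP Data Intelligence); only the similarity ranking and prompt assembly are illustrated.

```python
import numpy as np

# Toy stand-ins for embedding vectors. In the actual scenario these would be
# produced by an embeddings API from SFLIGHT records (illustrative data only).
documents = {
    "Flight LH400 FRA-JFK departs 10:10, price 666.00 EUR": np.array([0.9, 0.1, 0.0]),
    "Flight AA017 SFO-NRT departs 13:30, price 422.94 USD": np.array([0.1, 0.9, 0.0]),
}

def cosine_similarity(a, b):
    """Standard cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve_context(question_embedding, top_k=1):
    """Rank the stored documents by similarity to the question embedding."""
    ranked = sorted(
        documents.items(),
        key=lambda kv: cosine_similarity(question_embedding, kv[1]),
        reverse=True,
    )
    return [text for text, _ in ranked[:top_k]]

# A question whose (stand-in) embedding is close to the first document.
question_embedding = np.array([0.85, 0.15, 0.05])
context = retrieve_context(question_embedding)

# The retrieved business context is prepended to the user's question, so the
# completion model answers from governed data instead of guessing.
prompt = (
    f"Answer using only this context:\n{context[0]}\n\n"
    "Question: When does flight LH400 depart?"
)
print(prompt)
```

This is the essence of what the series implements at scale: generate embeddings for your business data, find the records most similar to the user's question, and inject them into the prompt sent to the completion API.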
That being said, I wish you a happy reading!