Bare Said

Data Processing is All Around Us

Long before video streaming was so widespread, and before we had SAP HANA, I was a computer science student in the Department of Databases and Information Systems at Saarland University. During my studies at the end of the 1990s, we were looking into how to optimize video streaming for maximum throughput with the maximum number of concurrent users. For my diploma thesis, I researched the best way to distribute data for video streaming. For example, if I have a video that I want to stream, is it better to store five complete copies of the video, or is it better to break it down into 100 two-minute segments and distribute those segments multiple times across multiple disks? We had different algorithms to determine access, how to schedule and collect the video, and when to start streaming to get the maximum throughput to the highest number of users possible. I remember the big new thing back then for us was getting access to a computer in Switzerland that had eight different processors. We've definitely come a long way since then. Today, whether it's in our private lives or enterprise software, data processing is all around us.
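To make the distribution question concrete, here is a small illustrative sketch (not from the original thesis) of one of the two options: striping fixed-length video segments round-robin across disks, with each segment replicated, so that concurrent streams spread their reads over many disks. All names here are hypothetical.

```python
# Illustrative sketch: round-robin striping of video segments across disks.
# Each segment is stored on `replicas` distinct disks so concurrent readers
# can be served from different spindles.

def stripe_segments(num_segments, num_disks, replicas):
    """Assign each segment to `replicas` distinct disks, round-robin."""
    placement = {}
    for seg in range(num_segments):
        placement[seg] = [(seg + r) % num_disks for r in range(replicas)]
    return placement

# 100 two-minute segments over 10 disks, each segment stored twice:
layout = stripe_segments(100, 10, 2)
print(layout[0])  # segment 0 lives on disks [0, 1]
print(layout[9])  # segment 9 lives on disks [9, 0]
```

A scheduler could then pick, for each active stream, whichever replica disk currently has the shortest queue, which is roughly the throughput question the thesis explored.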

After my studies, when I joined SAP in 1999, I started in CRM development, and it took me almost two decades to get back to the core of data processing. I joined the SAP HANA team in November 2019. But I have to be honest, I have admired the SAP HANA team since they got started 10 years ago, and I am so proud to be part of this team of world-class database experts. In my role as Head of Database, it's my job to make sure that SAP HANA, as well as our entire database portfolio, continues to meet and exceed the high expectations our customers have for us, and to make sure that we are delivering the most innovative products on the market to meet the demands of the future.

One of my favorite innovations in SAP HANA is SAP HANA native storage extension. SAP HANA can only process as much data as fits in memory: if you have 1TB of memory in SAP HANA, you can only process 1TB of data. Now imagine that you have 2TB of data but an SAP HANA system with 1TB of memory, and you probably don't always need to process the entire 2TB; maybe you only need to process 10GB at a certain point in time. Native storage extension makes this possible. You can use it to specify how to store hot and warm data, which can help reduce your memory footprint and, in turn, reduce costs. Since hot and warm data are stored in the same logical table, the implementation is transparent to developers, so there is no need to change the application or operational procedures, including backups, monitoring, or disk layout. This allows for simplified data management and easy, dynamic balancing of the price-performance ratio.
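The core idea behind keeping warm data out of memory is a fixed-size buffer cache: only a budget of pages is held in memory, and touching a page that is not resident loads it from disk, evicting the least recently used page. The toy sketch below only illustrates that general idea; all names are hypothetical, and HANA's actual implementation of native storage extension differs.

```python
from collections import OrderedDict

# Toy sketch of a buffer cache for warm data: a fixed budget of pages is
# held in memory; accessing a non-resident page "loads" it from disk,
# evicting the least recently used page. Purely illustrative.

class BufferCache:
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.pages = OrderedDict()  # page_id -> payload, in LRU order

    def access(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)      # cache hit: mark as recently used
            return "hit"
        if len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)       # evict least recently used page
        self.pages[page_id] = f"data-{page_id}"  # simulate loading from disk
        return "miss"

cache = BufferCache(capacity_pages=3)
print([cache.access(p) for p in [1, 2, 3, 1, 4, 2]])
# → ['miss', 'miss', 'miss', 'hit', 'miss', 'miss']
```

With such a cache, a 2TB table can be queried on a machine whose memory budget is far smaller, as long as the working set at any point in time fits in the cache, which is exactly the 10GB-out-of-2TB scenario described above.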

Looking back, we've seen incredible innovation and customer adoption. We now have over 31,000 customers on SAP HANA who rely on it to provide the data management foundation for the Intelligent Enterprise and to run the world's mission-critical business processes. The in-memory, hybrid transactional/analytical database totally disrupted the market. Now, we're taking it up one more level with SAP HANA Cloud. The transition to the cloud is not without challenges, but we want to enable our customers to make the move at their own pace.

We added new features to SAP HANA 2.0 SPS 05 to help our on-premises customers on their own journey to the cloud, for example, data replication with SAP HANA Cloud to facilitate hybrid scenarios. However, we also know that some customers already want to take advantage of the benefits of the cloud, so we are also putting a strong focus on cloud qualities while providing all the features our customers have come to know and love about SAP HANA.

The past 10 years with SAP HANA have been an incredible journey with our customers. It has been amazing to see how far we have come since the beginning with just an idea and a prototype. I cannot wait to see what the next 10 years with SAP HANA and SAP HANA Cloud will bring, and I know they will be just as successful as the last.

Comments
      Manfred Klein

      Sorry, Bare Said, to address you with this, but a question has nagged at me ever since I first heard of HANA.

      You said that, before native storage extension, HANA was only able to handle as much data as fit into memory. SAP purchased in-memory technology back in 2004 that could handle far more data than the available memory. The concept of hot and not-so-hot data is at least 16 years old.

      So my question is: Did the HANA team ever talk to those guys SAP bought back in 2004?

      A colleague of mine stated that SAP (ABAP) learned everything from the purchased technology (Java/C++), and I doubted it. Which of us is right?




      Bare Said (Blog Post Author)

      Thanks for your comment, Manfred! As you suggested, SAP acquired Transact in Memory (TIM) in 2005. The P*TIME technology from TIM has played an important role in the journey towards what became SAP HANA. Many colleagues continue as part of the SAP HANA team, including at the SAP Labs Korea location, which remains a cornerstone of SAP HANA development.


      Manfred Klein

      OK, this explains a lot. Actually, I was speaking of the MDM(E) guys from Israel and Palo Alto.