Krishna Tangudu, HANA Distinguished Engineer and Poet
Again, following today’s more ad-hoc strategy of announcing HDEs when they are ready, we watched Krishna Tangudu create a number of interesting pieces of technical content on SCN, like SAP HANA: Using “Dynamic Join” in Calculation View (Graphical), SAP HANA: Handling Dynamic Select Column List and Multiple values in input parameter, and Using Multiple Values in Input parameter for filtering in Graphical Calculation View. What is particularly interesting about Krishna’s content is that he treats technical limitations of the HANA platform as a challenge to find a creative solution. So please welcome Krishna to the HDE Program!
Tell us a little about your background in the industry.
I have about 3.5 years of work experience working with SAP BW/HANA/BO-WebI on full life cycle and application/production support projects for clients in the Utility, FMCG and Hi-Tech industries. I also hold certifications in SAP HANA 1.0, BO-WebI 4.0, SAP BODI 3.x and SAP BI 7.0.
My career started at Tech Mahindra, which gave me the opportunity to work as an SAP BI support consultant for an FMCG customer. The work involved monitoring daily loads of master and transaction data and solving tickets about missing data, data mismatches and incorrect data records. Working on this project helped me understand the difficulties of loading huge volumes of data and the other challenges of an EDW landscape, such as reporting performance and storage issues.
My next assignment was to implement an SAP BI solution for a client in the Utility industry. The focus here was to reduce operational costs and improve decision making around cash collection and recovery strategies. This helped the customer gain a better understanding of the cost to serve different types of customers and identify areas for optimization (e.g. the most effective billing and contract channels). I have also implemented an SAP BI solution, covering reporting, analysis, information delivery and data warehousing capabilities, for an Oil & Gas client in Nigeria.
In 2012, I was part of a team implementing a couple of POCs in the Telecom and Retail areas, where we were essentially trying to develop our own RDS content (similar to SAP HANA Live) based on IPs we had previously developed. The Retail POC was an analytics use case for which we decided to use PAL to implement the solution. Here are links to the blogs where I shared my experiences using PAL and R on top of SAP HANA:
- SAP HANA: My experiences on using Predictive Analysis Toolset with HANA
- SAP HANA: My Experiences with SAP Business Objects Predictive Analysis tool
My career took a new turn when I moved to Infosys as an SAP BW/HANA consultant. I have implemented 3 dashboards (close to 22 reports) on SAP HANA for a Hi-Tech major in the US whose legacy platform was on Teradata. I became quite acquainted with SQL scripting while working on this project, and that knowledge helps me a lot when designing optimized models in SAP HANA.
How did you get into the SAP HANA space? How did you transition from BW to HANA?
After working for a while in SAP BW, I decided to get certified in it and chose the SAP TechEd event to attempt the SAP BW certification exam in 2011. That year’s TechEd focused heavily on SAP’s new innovation, SAP HANA. I received free server access to SAP HANA on the AWS cloud, which encouraged me to engage in learning and trying new things to understand the different possibilities and limitations of SAP HANA. As I started exploring, I felt it would benefit developers if I blogged about my experiences with SAP HANA on SCN. My active participation on SCN earned me recognition and the SAP HANA Topic Leader badge for 2012, which I would rate as one of my best achievements. Eventually I found myself part of an SAP HANA COE team, developing POCs in the Telecom and Retail areas on SAP HANA. Currently, in my new role as an SAP HANA consultant at Infosys, I have worked on 2 native SAP HANA implementations and am now working on the next phases of implementing SAP HANA for a Hi-Tech major.
What advice would you give to people looking to transition from classic SAP to HANA?
Coming from a BW and BO background, I had to excel at SQL to be a good HANA modeler. Coincidentally, I was presented with an apt opportunity to work on an SAP HANA implementation where more than 30% of the models were developed using SQL scripts due to the high complexity of the logic. With the boom of recent SAP HANA innovations like SAP River, predictive analysis and text analytics, one must strongly consider learning SQL to excel in these advanced topics.
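To give a flavor of what that SQL-script modeling looks like, here is a minimal SQLScript sketch of the body of a scripted view; the schema, table and column names are hypothetical, not from any real project:

```sql
-- Hypothetical scripted view body: aggregate sales line items, then
-- join to customer master data. Logic like this, once the joins and
-- aggregations get complex, is often easier to express in SQLScript
-- than in a graphical calculation view.
BEGIN
  var_sales = SELECT "CUSTOMER_ID",
                     SUM("NET_AMOUNT") AS "NET_AMOUNT"
                FROM "MYSCHEMA"."SALES_ITEMS"
               GROUP BY "CUSTOMER_ID";

  var_out = SELECT s."CUSTOMER_ID",
                   c."REGION",
                   s."NET_AMOUNT"
              FROM :var_sales AS s
             INNER JOIN "MYSCHEMA"."CUSTOMERS" AS c
                ON s."CUSTOMER_ID" = c."CUSTOMER_ID";
END;
```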
SCN has been a platform where many people post queries about tough problems faced on HANA implementations. I advise following the SAP HANA spaces on SCN for quick, up-to-date information, in order to keep pace with this rapidly developing technology.
SAP HANA Academy tutorials, the free guides on SAP HANA and especially the free 30-day access (which can be extended for a minimal price) are the best combination for any developer to get started with SAP HANA.
Can you tell us a little about the projects you are working on right now?
Currently, I am implementing SAP HANA for a Hi-Tech major whose legacy platform is on Teradata. The client faces various EDW challenges, like scalability, the velocity of the data and the performance of reports. To overcome these, we proposed a sidecar approach with a fallback mechanism on Teradata to provide uninterrupted service for the users of the dashboards. We developed a custom replication framework to load data into SAP HANA; you can see more details in this blog: SAP HANA: Replicating Data into SAP HANA using HDBSQL. The framework handles the different delta refresh types the customer currently uses, and on a data load failure it sends an email to the support team with the error code and error message.
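The linked blog describes the actual framework; as a hedged sketch of what one timestamp-based delta refresh step could look like in HANA SQL, something along these lines (all object names are illustrative, not the project’s real ones):

```sql
-- Hypothetical delta step: pick up only rows changed since the last
-- successful load, and merge them into the HANA target table by
-- primary key. A control table tracks load status per run.
UPSERT "MYSCHEMA"."TGT_ORDERS"
SELECT *
  FROM "STAGING"."SRC_ORDERS"
 WHERE "CHANGED_AT" > (SELECT MAX("LOADED_AT")
                         FROM "MYSCHEMA"."LOAD_LOG"
                        WHERE "STATUS" = 'OK')
WITH PRIMARY KEY;
```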
The project became even more interesting because the client prefers HTML5 over SAP BO as a reporting tool. With HTML5, we couldn’t use variables for filtering, so we used input parameters to filter the data at the lowest level possible, before performing joins, to get optimal performance.
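For context, input parameters on a calculation view are passed with the PLACEHOLDER syntax when the view is queried over SQL, which is how a thin front end can push a filter down into the model before the joins run. A minimal sketch, with hypothetical view, column and parameter names:

```sql
-- Query a calculation view with an input parameter; the engine can
-- apply the filter at the lowest projection node, ahead of the joins.
SELECT "REGION",
       SUM("NET_AMOUNT") AS "NET_AMOUNT"
  FROM "_SYS_BIC"."mypackage/CV_SALES"
       (PLACEHOLDER."$$P_REGION$$" => 'EMEA')
 GROUP BY "REGION";
```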
In the next phase of this project, we received an ad-hoc reporting requirement for which we are considering different reporting tools, like SAP BO Analysis for Office, Lumira and Tableau. Discussions to decide on the reporting tool are ongoing; I will blog more in these areas in the near future about my understanding and experiences while addressing these requirements.
Tell us about one of your HANA war wounds!
With versions changing at “HANA” speed, it is challenging to propose a “standard” modeling approach. We usually design multiple solutions for a single reporting requirement, taking into account different modeling approaches (including SQL-script-based views) to determine the most optimized model. One instance of this problem: in previous HANA versions, a distinct count on an Analytic View performed badly, so we framed the distinct count query on an Attribute View instead. But in later versions the Analytic View started giving better performance and became the better solution for distinct counts.
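In plain SQL terms, the query in question is just a distinct count over the view; which view type served it fastest is what changed between revisions. A sketch with a hypothetical view name and columns:

```sql
-- Distinct count of customers per region over a modeled view.
-- In early revisions this pattern was faster against an Attribute
-- View; later revisions served it better from an Analytic View.
SELECT "REGION",
       COUNT(DISTINCT "CUSTOMER_ID") AS "CUSTOMER_COUNT"
  FROM "_SYS_BIC"."mypackage/AN_SALES"
 GROUP BY "REGION";
```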
Another war wound came from handling commits and rollbacks inside a procedure while loading data into HANA. COMMIT and ROLLBACK are not supported inside a procedure, which was a showstopper for us while implementing the replication framework. We came up with a workaround, which I described in this blog: SAP HANA: Workaround for using Commit & Rollback in exception handling block of a Stored Procedure. Some of the other hurdles on the same project involved handling dynamic filtering with single or multiple values. As we were using HTML5 and couldn’t use variables, we had to filter the data using input parameters. I explained the workaround in this blog: Using Multiple Values in Input parameter for filtering in Graphical Calculation View
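The linked blog has the full pattern; the core idea of workarounds from that era was to route COMMIT/ROLLBACK through dynamic SQL inside the exception handler, since the statements are rejected when written directly in the procedure body. A hedged sketch with hypothetical table names (aimed at older SPS revisions; later revisions added proper transaction handling in procedures):

```sql
-- Sketch only: COMMIT/ROLLBACK written directly in a procedure body
-- were not allowed, but issuing them via dynamic SQL (EXEC) was a
-- common workaround. See the linked blog for the real pattern.
CREATE PROCEDURE "MYSCHEMA"."LOAD_WITH_COMMIT"()
  LANGUAGE SQLSCRIPT AS
BEGIN
  DECLARE EXIT HANDLER FOR SQLEXCEPTION
  BEGIN
    EXEC 'ROLLBACK';                       -- undo the partial load
    INSERT INTO "MYSCHEMA"."LOAD_LOG"
         VALUES (CURRENT_TIMESTAMP, 'ERROR', ::SQL_ERROR_MESSAGE);
    EXEC 'COMMIT';                         -- persist the log entry
  END;

  INSERT INTO "MYSCHEMA"."TGT_ORDERS"
       SELECT * FROM "STAGING"."SRC_ORDERS";
  EXEC 'COMMIT';                           -- make the load durable
END;
```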
Another client had heavy requirements that could easily be addressed by SAP BI. As of now, SAP HANA is not as robust as BI in modeling custom hierarchy requirements, so we proposed a solution with master data modeled in BW and transaction data modeled in SAP HANA, which led to multiple debates on how efficient it is to separate master data and transaction data.
I assume that, with the numerous revisions happening in HANA, we will find answers to the above questions, along with more exciting new features.
What do you see in the future of HANA?
I see the possibility of HANA replacing BW, though this may not be in the near future. Recently, I have seen an increase in the number of open HANA projects. It is very interesting to see how customers from non-SAP landscapes are interested in adopting SAP HANA to optimize their business processes and improve the performance of their reports.
SAP HANA has the capability to revolutionize the entire Business Intelligence landscape. SLT makes real-time acquisition of big data (high volumes of data) possible, and the effective marriage of SAP HANA with Hadoop helps address the challenges of handling big data.
SAP HANA XS and its OData capabilities, combined with the new SAP River, make for an exciting developer experience when designing native applications on SAP HANA. Analytics is one more area where I see SAP HANA playing a pivotal role.
If there were one change you could make to HANA, what would it be?
Improvements in terms of “stability” and “scalability” would help increase the number of customers adopting SAP HANA to run their business. Another aspect is adding capabilities like complex hierarchy modeling, similar to SAP BW.
Having a code inspector for SQL, similar to the one for ABAP, would be helpful. Performance tuning by enabling traces or analyzing visualization plans (PlanViz) is still not straightforward, because they cannot give complete information about the production system. It would help a lot to have a tool that monitors the entire system and points to the code snippet that is actually affecting performance, enabling the developer to invest time optimizing only those pieces of code rather than manually hunting for them.
Regarding the scope of input parameters and variables, it is rather confusing to have both options for applying filters. It would be simpler to have just one of them with complete capabilities, making the developer’s life easier, rather than limiting variables to specific reporting tools for filtering (e.g. they cannot be consumed by HTML5 dashboards) and input parameters to taking user input for calculations (though they can also be used to filter the data).
Tell us a bit about Krishna outside of HANA and work
Apart from work, I write poems and essays. I like to travel with my family, especially to heritage sites. I am fascinated by the history of societal evolution, the different ages and their kings and kingdoms. I read a lot of stories and watch movies related to epics and mythology. Like many other young people in India, I mostly pass my leisure time playing cricket or badminton, listening to Indian music and partying hard on the weekends.
I would rate my curiosity as the best weapon up my sleeve. I aspire to become a social entrepreneur, and I am currently working on apps that can help solve major social issues in India using SAP HANA. I love solving questions and puzzles that challenge me.
I have always been lucky to find people with positive energy around me; being an only child, I never felt lonely, thanks to the company of my relatives and friends. I recently got engaged to the love of my life and will be getting married on August 15th (India’s Independence Day is when I will start sharing my independence with my love).