SAP HANA Table Functions

Recently I spoke to multiple developers working on SAP HANA Native systems for developing reports. What really surprised me was that most of them are still using calculation views of the SQL Script type instead of Table Functions. What is more, some of them were not even aware that this type of view is deprecated and replaced by Table Functions. I also haven’t found any step-by-step tutorial describing when to create Table Functions, how to do it, what the benefits are and how to consume them in calculation views.

This inspired me to write this article about Table Functions and share my experience on the topic.

What are Table Functions?

Since SPS 11, calculation views of the SQL Script type are deprecated. As an alternative, SAP introduced a new development artifact called the Table Function. HANA provides a migration tool which enables automatic conversion of script-based calculation views into Table Functions.

Table Functions are used whenever graphical views are not sufficient, e.g. for loops, executing custom functions or complex queries.

What are the Pros and Cons of using Table Functions?

Pros:

  • SQL Script gives much more functionality and flexibility (more functions are available, complex logic can be implemented more easily, and SQL can be combined with the Application Function Library – AFL – giving even more functions for complex analysis)

Cons:

  • Maintenance is much more difficult (the order and data types of output columns must be preserved; data preview for part of the code is not as straightforward as previewing nodes in graphical views; when changing the output, both the Table Function and the view on top of it need to be adjusted; etc.)
  • Multiple-value input parameters are not supported (there is no easy way of passing multiple values into a single input parameter of a Table Function, however there is a workaround for that)
  • By definition graphical views provide better performance thanks to HANA’s optimizer
  • No GUI is available – SQL Script knowledge is necessary

Generally, Table Functions should be used when the logic cannot be covered by a graphical view or is too complex to model graphically.

How to create Table Function?

Scenario: The business wants to display a report showing the number of working days between “ORDERDATE” and “SHIPPEDDATE“. To calculate it we need to use the WORKDAYS_BETWEEN SQL function, which is not available in graphical views. Additionally, the report should display orders only for a specific period of time, which will be provided by the user.


  1. To start developing database objects, go to the Development perspective:

  2. Go to the Repositories tab:

  3. Right-click on the destination package and select New -> Other

  4. In the wizard type Table Function, select it and click Next

  5. Provide the Table Function name and click Finish

  6. The Table Function window will open.

  7. Adjust the code to the requirements:

I. [Optional] Provide input parameters including data types. This is needed when you want to parametrize the Table Function.

II. [Mandatory] Set TABLE as the output type.

III. [Optional] Provide the default schema for the query. This is needed when you transport the function between instances with different schemas, e.g. in the development system you are using the “DEV” schema, but on production all tables are placed in the “PROD” schema. If you provide the default schema in the definition of the Table Function, then during transport this schema will automatically be replaced with the schema of the target system (based on schema mapping). When relying on the default schema, use table names without schemas in the query, e.g. simply use “ORDERS” instead of “DEV“.”ORDERS“.

IV. [Mandatory] Add RETURN phrase before the final select statement.

V. [Mandatory] Add select statement.

VI. [Optional] Apply input parameters in the WHERE clause. Add leading colons (“:”) when calling each parameter.

VII. [Mandatory] Add semicolon (“;”) at the end of the statement.

VIII. [Mandatory] List all the output columns. Preserve the column order, and keep the column names (case-sensitive) and their data types consistent with the select statement.


  8. After writing the Table Function definition, activate it and make sure that there are no errors after activation.

Be aware that Data Definition (CREATE, ALTER, etc.) and Data Manipulation (INSERT, UPDATE, etc.) statements are not allowed in Table Functions.
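Putting the numbered points above together, a complete Table Function for the scenario could look like the sketch below. The repository path, the “ORDERS” table, its column names and types, and the factory calendar id '01' with its source schema are all assumptions for illustration:

```sql
FUNCTION "DEV"."workshop.reports::TF_SHIPPING_WORKDAYS" (
    -- I. [Optional] input parameters including data types
    ip_date_from DATE,
    ip_date_to   DATE )
  -- II.   [Mandatory] TABLE as the output type
  -- VIII. [Mandatory] output columns: order, names (case-sensitive)
  --       and data types must match the select statement below
  RETURNS TABLE (
    "ORDERNUMBER"  INTEGER,
    "ORDERDATE"    DATE,
    "SHIPPEDDATE"  DATE,
    "WORKING_DAYS" INTEGER )
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
  -- III. [Optional] default schema, so tables can be referenced
  --      without a schema prefix ("ORDERS", not "DEV"."ORDERS")
  DEFAULT SCHEMA "DEV"
AS
BEGIN
  -- IV./V. [Mandatory] RETURN phrase followed by the select statement
  RETURN SELECT
      "ORDERNUMBER",
      "ORDERDATE",
      "SHIPPEDDATE",
      -- calendar id '01' and its source schema are assumptions
      WORKDAYS_BETWEEN('01', "ORDERDATE", "SHIPPEDDATE", 'SYSTEM')
        AS "WORKING_DAYS"
    FROM "ORDERS"
    -- VI. [Optional] input parameters in WHERE, with leading colons
    WHERE "ORDERDATE" BETWEEN :ip_date_from AND :ip_date_to;
    -- VII. [Mandatory] semicolon at the end of the statement
END;
```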

How to query Table Functions?

Calling Table Functions in the SQL console is very easy. They are executed in the FROM clause. When executing the function, provide the whole name of the function, and in brackets provide the values for the input parameters. If there are no input parameters, leave the brackets empty (they are always mandatory!).
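A minimal sketch, assuming hypothetical functions named “TF_SHIPPING_WORKDAYS” (with two date input parameters) and “TF_ALL_ORDERS” (with none):

```sql
-- Provide the whole (package-qualified) name;
-- input parameter values go in the brackets
SELECT *
FROM "DEV"."workshop.reports::TF_SHIPPING_WORKDAYS"('2024-01-01', '2024-03-31');

-- A function without input parameters still needs the empty brackets
SELECT *
FROM "DEV"."workshop.reports::TF_ALL_ORDERS"();
```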

How to consume Table Function in Calculation View?

Using a Table Function as a source in a Calculation View is also very simple. Create a new calculation view, and when adding an object for the projection, type the name of the Table Function:

If the Table Function uses input parameters, you also need to recreate them in the view. Go to the Parameters/Variables tab, click Input Parameter Manage Mapping and select Data sources.

Click the Auto Map By Name button to automatically copy the input parameters from the Table Function to the view.

Now, when previewing data on the view, a pop-up with input parameters will appear. Once values are entered, they will be passed directly to the Table Function and the results will be displayed.

What are the limitations of Table Functions?

The main limitation of Table Functions is that you can pass only a single value for each input parameter, which can be very annoying, because in most cases users want the possibility of passing multiple values in the selection criteria of their report.

There is a workaround for this limitation, which helps fulfill the requirement of passing multiple values directly to the query. I will describe it in the next post on my blog.



  1. Damian Zurawski

    Great article Konrad! I would add two small details regarding parameters in TFs – the usage of DEFAULT values when the user (or another CV which is triggering this TF) does not apply any values. So it can look like this:

    FUNCTION "_SYS_"."workshop.test::MY_TEST_TF" (userId NVARCHAR(10) DEFAULT 'USER123', isTest NVARCHAR(1) DEFAULT '0')


    Another thing is the usage of PLACEHOLDERs in Table Functions. So if we want to pass input parameters as placeholders and then use them in the TF, it can look like this:

    FUNCTION "_SYS_"."workshop.test::MY_TEST_TF" (userId NVARCHAR(10) DEFAULT 'USER123', isTest NVARCHAR(1) DEFAULT '0')

    […body of function / SQL SELECT statement]


    FROM "_SYS_"."workshop.test2/CV_TEST" (PLACEHOLDER."$$InputParam1$$" => :userId, "$$InputParam2$$" => :isTest)


    Anyway, it’s really great that you’re sharing your knowledge! Keep doing that, community needs people like you 🙂

  2. Lars Breddemann

    Thanks for the nice write-up!

    Would be good if you had a look at the current SAP HANA releases (HANA 2 SP03) where you can use table functions in graphical calculation views as a kind of “pass-through” element. That is, you can “feed” a table parameter in and return a table – that’s very neat.

    This is of course also possible in HANA 1 when you build a data flow from table function to table function (outside of graphical calc views). One can build very powerful semantic expressions with that.
    The problem of keeping the signature correct can be handled by using table types instead of inline table definitions.

    Concerning the statement “By definition graphical views provide better performance thanks to HANA’s optimizer“: that’s just plain wrong.
    Graphical Calc views nowadays are processed through the same optimization stack (when possible) as common SQL and so are table functions. Especially for more complex scenarios like multi-store tables, virtual table access or long join chains the SQL optimizers produce better execution plans. Have a read on ESX and HEX in the SAP notes if that’s of interest to you.

    Looking forward to reading more about your modelling experiences.
