
Snowflake for SAP BusinessObjects 4.2 SP08

It’s Here!

Starting with SAP BI 4.2 SP8, connecting to Snowflake Cloud Data Platform is now officially supported!  Read more here.

Connectivity is available using Windows (ODBC / JDBC) and Linux (JDBC) via the Information Design Tool (IDT) for Universes of type UNX only.


Make sure you are on SAP BI 4.2 SP8 (Server and Client)!

If you have done a fresh installation of SAP BI 4.2 SP8 Server and Client, you shouldn’t have any problems and you can proceed to the next step.

If, like me, you have patched from a previous version, you may need to perform some extra steps.

Either something went wrong during the patching process or there are undocumented steps: in my case, Snowflake wasn't available in IDT and the expected snowflake.sbo file wasn't there either.

Thankfully it's an easy fix.  Basically, the newly added Snowflake drivers need to be installed manually.

Important: The following steps are required on your BOBJ Server(s) and the Workstation(s) where you use IDT.

1. Go to Programs and Features:

This can be done using the Control Panel:

Or the Run window using appwiz.cpl:

Or via Add or Remove Programs (aka Apps & Features), but make sure to click Programs and Features, as the Apps & Features window only allows you to Uninstall, not Modify.

2. Select the first package of SAP BusinessObjects BI platform 4.2 or SAP BusinessObjects BI platform 4.2 Client Tools you have installed.  In this case it's SP7.  Then click Uninstall/Change:

Note: If you select something other than the original install (e.g. the SP8 update), you do not get the option to Uninstall/Change.

3. Click Modify, Next and Next:

4. Enable the Snowflake drivers and click Next until it's done:

SAP BusinessObjects BI platform:

SAP BusinessObjects BI platform Client Tools:

Configuring ODBC

This section demonstrates the steps in the SAP Note: 2900085 – BI IDT Universe for Snowflake ODBC.

1. Download and Install the 32-bit (e.g.: snowflake32_odbc-2.20.2.msi) and 64-bit (e.g.: snowflake64_odbc-2.20.2.msi) ODBC drivers from the official Snowflake Repository.

As you can see I have used version 2.20.2 for this test.

2. Create a 32-bit System DSN on the IDT Workstation(s) and an identical 64-bit System DSN on the SAP BI Server(s).

Click Add:

Select SnowflakeDSIIDriver and click Finish:

Fill in the Snowflake Configuration dialog and click OK:

Snowflake Manual: ODBC Configuration and Connection Parameters
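The fields in the DSN dialog map to standard Snowflake ODBC connection parameters.  As a minimal sketch (parameter names follow the Snowflake ODBC documentation; the account, user, database, warehouse, and role values are placeholders for illustration), the same settings expressed as a DSN-less connection string look like this:

```python
# Sketch: a DSN-less ODBC connection string using the same parameters
# the Snowflake DSN dialog collects. Parameter names (Driver, Server,
# UID, Database, Schema, Warehouse, Role) follow the Snowflake ODBC
# documentation; every value below is a placeholder.
params = {
    "Driver": "SnowflakeDSIIDriver",
    "Server": "myaccount.snowflakecomputing.com",  # your account URL
    "UID": "my_user",
    "Database": "MY_DB",
    "Schema": "PUBLIC",
    "Warehouse": "MY_WH",
    "Role": "MY_ROLE",
}
# ODBC connection strings are semicolon-separated key=value pairs.
conn_str = ";".join(f"{key}={value}" for key, value in params.items())
print(conn_str)
```

Whether you use the DSN dialog or a connection string, the same parameters end up being passed to the driver.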


Configuring using JDBC (Windows)

This section demonstrates the steps in the SAP Note: 2900046 – BI IDT Universe for Snowflake JDBC Connection.

Note: As of today, this SAP Note has errors and is not usable as is.

1. Download the JDBC driver (e.g.: snowflake-jdbc-3.9.2.jar) from the official Snowflake Repository.

As you can see I have used version 3.9.2 for this test.

2. Create a folder named snowflake in: <Installed Directory>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers

E.g.: C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake

3. Copy the driver (.jar) into this new snowflake folder.

4. Locate and make a copy of the file snowflake.sbo in the folder: <Installed Directory>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc

5. Open the file snowflake.sbo in a text editor.

6. Insert the following line above <Parameter Name="JDBC Class">net.snowflake.client.jdbc.SnowflakeDriver</Parameter> at line 36.

The SAP Note gives this path, which is wrong because it is missing the snowflake folder created in step 2:

<Path><Installed Directory>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake-jdbc-3.9.2.jar</Path>

The correct line includes the snowflake folder, e.g.:

<Path>C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake\snowflake-jdbc-3.9.2.jar</Path>
7. Save and Close the file.

8. Restart the Server Intelligence Agent (SIA).
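For reference, this is roughly what the relevant section of snowflake.sbo should look like after the edit.  This is a sketch: element names are taken from the standard .sbo layout, surrounding elements are abbreviated, and the install path and driver version are examples to adjust for your environment.

```xml
<JDBCDriver>
    <ClassPath>
        <!-- Path added in step 6; adjust to your install directory and driver version -->
        <Path>C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake\snowflake-jdbc-3.9.2.jar</Path>
    </ClassPath>
    <Parameter Name="JDBC Class">net.snowflake.client.jdbc.SnowflakeDriver</Parameter>
    <!-- remaining parameters unchanged -->
</JDBCDriver>
```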

Creating a Connection in IDT

This one is easy.  Same as always:

1. Open Information Design Tool (IDT)

2. Open a Session

3. Click: Create a Relational Connection

4. Give it a name…

5. Select: Snowflake > Snowflake > JDBC or ODBC

6. JDBC: Fill in the Connection details and Test Connection.

7. ODBC: Fill in the Connection details and Test Connection.

Creating a Universe

You are now ready to create a Universe…

Creating a Web Intelligence

The final step to test this is creating a Webi report.





Keep in Touch!

Hope this blog was useful.  Please do share your thoughts and comments.

Feel free to “like” and post it on social media!

Always happy to connect:

Take care… A+

  • Thanks Patrick, that was of great help. Had one request if you could please help.

    Before SP8, we were using Generic ODBC connector to connect to Snowflake. We upgraded to SP8 as soon as it was released so that we can utilize the goodness of the direct Connector.

    However, when using the Snowflake connector, we see that parallel query generation is no longer working when we have a report with multiple Data Providers. When we switch the connection back to one using the Generic ODBC connection, the same queries can be seen running in parallel in the Snowflake Query History.

    The Maximum Parallel Queries is the default 4. The Webi Processing Server too has the setting intact.

    Please help with what we can do to make sure the queries are generated/executed in parallel.

    Thanks once again

    • Hello Nilim,

      I saw your email from Scott 🙂 I replied already, he will send it over.  But for the benefit of others:

      First, I mentioned your concerns about parallel queries to Gregory BOTTICCHIO (Director, Analytics Product Management at SAP) and he confirmed that the drivers and the processes running queries should be unrelated, i.e. Snowflake or other technologies shouldn't make a difference.

      I have personally tested today running a Webi with 5 Queries.  Each reading 100k rows.  See below:

      All 5 queries started at the same time.  FYI, the last one was quick as it was already cached, of course.

      Hope that helps.  If you have further concerns you should raise a support ticket with SAP.

      Take care.

      • Thanks so much Patrick 🙂 Yes Scott did reply to me.

        But I wanted to find out why it is not working in our case.

        The same Universe/report, when I change to another connection that uses the Generic ODBC driver, connecting to the same database and same DSN, starts running the queries in parallel. The only variable in my test case is the Snowflake connector vs the Generic ODBC driver; everything else is the same.

        That is the part that is intriguing me: what is it in our environment that is stopping the queries from running in parallel?

  • Thanks for the detailed article Patrick..! I could set up the ODBC connection successfully using the instructions.

    However for JDBC, I need to configure proxy. If you can include those details of how to set up/pass proxy details in connections it would be really helpful.


    For ODBC, I see the option directly in the ODBC DSN configuration window.



    • Hi,

      The SAP Note says this:

      Set the proxy parameters in the JDBC driver properties while creating the connection, following the steps below:

      1. After clicking Test Connection (it will obviously fail), click Next
      2. In the JDBC driver properties, put the proxy settings as follows: useProxy=true,proxyHost=<ip_address>,proxyPort=<port>,proxyUser=test,proxyPassword=test


      I didn’t have to configure proxy for my setup here…

      Let me know and I’ll add it in the blog.


        I tried that approach; as you mentioned, I added the proxy in the JDBC Properties step (next to the user name and password step) and it worked..!!


        But in the next step, when I add a table to the universe and try to view data, I get the below error. Any idea what the issue is?


        • Okay, I figured out the issue. It looks like Snowflake (unlike other databases) expects us to specify the warehouse and role either in a connection parameter or in Begin SQL.

          I added 'USE ROLE <role_name>' as BEGIN_SQL in the Data Foundation parameters (the warehouse was specified in the connection setup itself).


          Hope it helps someone who's newly setting up Snowflake on BO..!