Snowflake for SAP BusinessObjects 4.2 SP08
Starting with SAP BI 4.2 SP8, connecting to Snowflake Cloud Data Platform is now officially supported! Read more here.
Connectivity is available using Windows (ODBC / JDBC) and Linux (JDBC) via the Information Design Tool (IDT) for Universes of type UNX only.
Make sure you are on SAP BI 4.2 SP8 (Server and Client)!
If you have done a fresh installation of SAP BI 4.2 SP8 Server and Client, you shouldn’t have any problems and you can proceed to the next step.
If, like me, you have patched from a previous version, you may need to do extra steps.
Either something went wrong during the patching process or there are undocumented steps: in my case, Snowflake wasn’t available in IDT and the expected snowflake.sbo file wasn’t there either.
Thankfully it’s an easy fix. Basically, the newly added Snowflake drivers need to be manually installed.
Important: The following steps are required on your BOBJ Server(s) and the Workstation(s) where you use IDT.
1. Go to Programs and Features:
This can be done using the Control Panel:
Or the Run window using appwiz.cpl:
Or via Add or Remove Programs (aka Apps & Features), but make sure to click on Programs and Features there, as the Apps & Features window only allows you to Uninstall, not Modify.
2. Select the first package of SAP BusinessObjects BI platform 4.2 or SAP BusinessObjects BI platform 4.2 Client Tools you have installed. In this case it’s SP7. Then click Uninstall/Change:
Note: If you select something other than the original install (e.g. the SP8 update), you do not get the option to Uninstall/Change.
3. Click Modify, Next and Next:
4. Enable #feature.Snowflake.name# and click Next until it’s done:
SAP BusinessObjects BI platform:
SAP BusinessObjects BI platform Client Tools:
Configuring using ODBC (Windows)
This section demonstrates the steps in the SAP Note: 2900085 – BI IDT Universe for Snowflake ODBC.
1. Download and Install the 32-bit (e.g.: snowflake32_odbc-2.20.2.msi) and 64-bit (e.g.: snowflake64_odbc-2.20.2.msi) ODBC drivers from the official Snowflake Repository.
As you can see I have used version 2.20.2 for this test.
2. Create a 32-bit System DSN on the IDT Workstation(s) and an identical 64-bit System DSN on the SAP BI server(s).
Select SnowflakeDSIIDriver and click Finish:
Fill the Snowflake Configuration Dialog and click OK:
Snowflake Manual: ODBC Configuration and Connection Parameters
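As a rough sketch, the key/value pairs behind such a DSN look like the fragment below. The parameter names follow Snowflake's ODBC documentation; all values are placeholders, and on Windows these settings live in the registry rather than an odbc.ini file, which is just easier to read here:

```ini
[SnowflakeDSN]
Driver    = SnowflakeDSIIDriver
Server    = xy12345.snowflakecomputing.com
Database  = DEMO_DB
Schema    = PUBLIC
Warehouse = DEMO_WH
Role      = REPORTING_ROLE
UID       = bobj_service_user
```

Remember that the 32-bit and 64-bit DSNs must carry identical names and values, or connections that work in IDT will fail on the server.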
Configuring using JDBC (Windows)
This section demonstrates the steps in the SAP Note: 2900046 – BI IDT Universe for Snowflake JDBC Connection.
Note: As of today, this SAP Note has errors and is not usable as is.
1. Download the JDBC driver (e.g.: snowflake-jdbc-3.9.2.jar) from the official Snowflake Repository.
As you can see I have used version 3.9.2 for this test.
2. Create a folder named snowflake in: <Installed Directory>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers
E.g.: C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake
3. Copy the driver (.jar) into this new snowflake folder.
4. Locate the file snowflake.sbo in <Installed Directory>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc and make a backup copy of it.
5. Open the file snowflake.sbo in a text editor.
6. Insert the following line above <Parameter Name="JDBC Class">net.snowflake.client.jdbc.SnowflakeDriver</Parameter> (line 36):
<Path><Installed Directory>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake\snowflake-jdbc-3.9.2.jar</Path>
E.g.: <Path>C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\jdbc\drivers\snowflake\snowflake-jdbc-3.9.2.jar</Path>
7. Save and Close the file.
8. Restart the Server Intelligence Agent (SIA).
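Steps 4–7 are mechanical enough to script, which helps when you have several servers to patch. Below is a minimal Python sketch of the same edit, assuming the .sbo layout shown in step 6; the marker string and jar path come from that snippet, and you should still keep the backup from step 4:

```python
def add_driver_path(sbo_text: str, jar_path: str) -> str:
    """Insert a <Path> element just above the 'JDBC Class' parameter.

    Mirrors steps 4-7 of the post: the caller is expected to have
    backed up snowflake.sbo first. Raises if the marker is not found.
    """
    marker = '<Parameter Name="JDBC Class">'
    lines = sbo_text.splitlines(keepends=True)
    for i, line in enumerate(lines):
        if marker in line:
            # reuse the marker line's indentation for the new element
            indent = line[: len(line) - len(line.lstrip())]
            lines.insert(i, f"{indent}<Path>{jar_path}</Path>\n")
            return "".join(lines)
    raise ValueError("JDBC Class parameter not found in snowflake.sbo")
```

Run it against a copy of the file, diff the result against the original, and only then replace snowflake.sbo and restart the SIA as in step 8.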
Creating a Connection in IDT
This one is easy. Same as always:
1. Open Information Design Tool (IDT)
2. Open a Session
3. Click: Create a Relational Connection
4. Give it a name…
5. Select: Snowflake > Snowflake > JDBC or ODBC
6. JDBC: Fill the Connection details. And Test Connection.
7. ODBC: Fill the Connection details. And Test Connection.
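For the JDBC variant, the connection details essentially form a jdbc:snowflake:// URL. Here is a hedged sketch of how such a URL is assembled; the URL shape follows Snowflake's JDBC documentation, and the account name and parameter values are placeholders, not anything from the post:

```python
from urllib.parse import urlencode

def snowflake_jdbc_url(account: str, **params: str) -> str:
    """Build a Snowflake JDBC URL; keyword arguments
    (warehouse, db, role, ...) become query parameters."""
    base = f"jdbc:snowflake://{account}.snowflakecomputing.com/"
    return base + ("?" + urlencode(params) if params else "")

# Placeholder values for illustration only:
url = snowflake_jdbc_url("xy12345", warehouse="DEMO_WH", db="DEMO_DB", role="ANALYST")
```

Passing warehouse and role explicitly here saves you from queries that fail later for lack of an active warehouse or role.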
Creating a Universe
You are now ready to create a Universe…
Creating a Web Intelligence Report
Final step to test this is creating a Webi report.
Keep in Touch!
Hope this blog was useful. Please do share your thoughts and comments.
Feel free to “like” and post it on social media!
Always happy to connect: https://www.linkedin.com/in/pperrier/
Take care… A+
Thanks Patrick, that was of great help. Had one request if you could please help.
Before SP8, we were using Generic ODBC connector to connect to Snowflake. We upgraded to SP8 as soon as it was released so that we can utilize the goodness of the direct Connector.
However, when using the Snowflake connector, we see that parallel query generation no longer works when a report has multiple Data Providers. When we switch the connection to one using the Generic ODBC driver, the same queries can be seen running in parallel in the Snowflake Query History.
The Maximum Parallel Queries is the default 4. The Webi Processing Server too has the setting intact.
Please help with what we can do to make sure the queries are generated/executed in parallel.
Thanks once again
I saw your email from Scott 🙂 I replied already, he will send it over. But for the benefit of others:
First, I mentioned your concerns about parallel queries to Gregory BOTTICCHIO (Director, Analytics Product Management at SAP) and he confirmed that the drivers and the processes running queries should be unrelated, i.e. Snowflake or other technologies shouldn't make a difference.
I have personally tested today running a Webi with 5 Queries. Each reading 100k rows. See below:
All 5 queries started at the same time. FYI, the last one was quick as it was already cached of course.
Hope that helps. If you have further concerns you should raise a support ticket with SAP.
Thanks so much Patrick 🙂 Yes Scott did reply to me.
But I wanted to find out why it is not working in our case.
With the same Universe/report, when I change to another connection that uses the Generic ODBC driver, connecting to the same database and same DSN, the queries start running in parallel. The only variable in my test case is the Snowflake connector vs the Generic ODBC driver; everything else is the same.
That is the part that is intriguing me, what is it in our environment that is not making the queries to run in parallel!
Thanks for the detailed article Patrick..! I could set up the ODBC connection successfully using the instructions.
However for JDBC, I need to configure proxy. If you can include those details of how to set up/pass proxy details in connections it would be really helpful.
For ODBC, I see the option directly in the ODBC DSN configuration window.
The SAP Notes (https://launchpad.support.sap.com/#/notes/2910771) says this:
Setting proxy parameters in JDBC driver properties while creating the connection by following below steps:
I didn't have to configure proxy for my setup here...
Let me know and I'll add it in the blog.
Hi. Did you have time to test? How did it go?
I tried that approach as you mentioned: added the proxy in the JDBC Properties step (next to the user name and password step) and it worked..!!
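For anyone following along, a hedged sketch of what those JDBC driver properties can look like. The parameter names (useProxy, proxyHost, proxyPort, proxyUser, proxyPassword) come from Snowflake's JDBC driver documentation; all values below are placeholders:

```properties
# Snowflake JDBC proxy parameters (placeholder values)
useProxy=true
proxyHost=proxy.example.com
proxyPort=8080
# only if the proxy requires authentication:
# proxyUser=svc_account
# proxyPassword=********
```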
But in the next step, when I add a table to the universe and try to view data, I get the below error. Any idea what the issue is?
Okay, I figured out the issue. Looks like Snowflake (unlike other databases) expects us to specify the warehouse and role either in a connection parameter or in Begin SQL.
I added the 'USE ROLE <role_name>' as BEGIN_SQL in Data foundation parameters (warehouse was specified in the connection set up itself).
Hope it helps someone who's newly setting up snowflake on BO..!
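A minimal sketch of what that Begin SQL can look like; the role name is a placeholder, and per the comment above the warehouse was already set in the connection itself:

```sql
-- Begin SQL entered in the Data Foundation parameters
USE ROLE REPORTING_ROLE;
```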
Thanks for your article.
Re Snowflake ODBC versus JDBC - is there any benefit using one over the other in terms of performance or reliability?
Honestly I don't know.
As I'm on Windows and I guess also by habit I only use ODBC...
I tested both and I didn't get any issues. I can't say I noticed any performance differences. But since then, I'm only using ODBC.
If you or anybody else reading this have any thoughts or results, please do share!
Hi Patrick, have you tried creating a multisource Universe with one connection to Snowflake? As I am trying it out, I am able to add the first connection to any source, but when I try to select the Snowflake connection as the secondary connection it is greyed out.
Thanks in Advance
I can replicate the same thing.
After checking the PAM and especially this page, it turns out that multi-source universes (MSU) with Snowflake are not supported.
Same behavior with SAP BI 4.3 SP00.
I am trying to set up an ODBC connection to Snowflake. Interestingly, the integrity checks for the connection, DFX and BLX are working fine. However, I get an 'Error on Fetch' when I run a simple query or 'show values' on the BLX, while it works totally fine with data in Webi. I have checked on the Snowflake side that the query runs successfully and returns results.
Secondly, I want to point an existing universe to a new Snowflake connection in the DFX, and it cannot find the existing view when I refresh the structure (Table is missing), even though I can insert that same view by name from the Snowflake schema.
Hope you or anyone have an idea what is the issue....
Just an update on my end: so far the Snowflake JDBC driver and the 2.22.1 ODBC driver both returned errors and showed weird behavior (sending SAP HANA or empty SQL statements to Snowflake). Those issues go away with the 2.20.5 ODBC driver, except for sending SAP HANA SQL (2.20.5 is the oldest ODBC driver I can find on the SF site). I would suggest using an ODBC driver between 2.20.2 and 2.20.5 and not using JDBC if possible.
Unfortunately, I’m still struggling to repoint schema tables in an existing universe to those with the same name in SF. BO doesn’t even send any SQL query to SF when I refresh the table structure; it simply tells me in IDT that the table is missing in the database. I don’t think SAP BO has considered the need to repoint an existing universe to SF, and I couldn’t find a way to make it work.
FYI my environment is BO 4.2 SP8 patch0 on windows server.
Finally, I found out that changing 'Owner' for all tables is how you let BO know which schema to look at when refreshing the structure; otherwise BO tries to find the table at the Database level, which does not work at all. E.g. BO looks for DIM_DATE in the Database, but DIM_DATE is actually stored under Database.Schema, e.g. ABC.SCH.DIM_DATE.
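To illustrate the behavior described above with the same example names, assuming no default schema is active in the session:

```sql
-- Unqualified lookup at the database level, which is what BO attempts
-- before the table Owner is set; this fails without a schema in scope.
SELECT * FROM DIM_DATE;

-- With Owner set to the schema, the generated SQL is fully qualified:
SELECT * FROM ABC.SCH.DIM_DATE;
```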
I'm using SAP 4.2 SP10 on Linux and want to set up JDBC/ODBC connectivity to Snowflake. I have to load data from Oracle to Snowflake. Could you please confirm that the steps you have mentioned will work for my scenario? I read in some blogs that the JDBC connector can only be used with Snowflake as a source. Can you please confirm? This is a little urgent as we are doing a POC for the same.
4.2 SP10 doesn't exist yet.
I'm asking for SAP BODS 4.2 SP10, do we have option for JDBC and ODBC connection for Snowflake.
Not sure whether it still applies to SAP BO Data Services, but JDBC to Snowflake on BO currently only works on Windows Server, not Linux, as per SAP Note 2959086.
Thanks for the update, I was unable to read the SAP Note, would be helpful if you could paste in the comment. As I need to be very sure that JDBC will work or not for data services 4.2 SP10 on Linux server for snowflake.
Is there any other way to connect to snowflake to push the data using SAP 4.2 SP10 Linux. Kindly let me know.
Sorry but there is no guarantee from my end, also my company doesn't use BODS.
The PAM for BI4.2 SP8 and BI4.3 semantic layer indicates Snowflake ODBC version 2.20.2 is the only supported ODBC driver version and the oldest version available on Snowflake website is 2.20.5
Any thoughts if we can use 2.20.5 with SP8 and/or BI4.3? If not then does anyone know when does SAP plan to support one of the available snowflake driver versions?
I have used both 2.20.2 and 2.20.5 for testing and got the same performance, but I would not suggest using any version higher than 2.20.5 because you will start getting issues in BO.
We are trying to establish a connection between SAP BO 4.2 SP6 and Snowflake through ODBC. We are able to create the connection at the repository level and a local connection in IDT at the server level, and the test connection is also successful. But when we create the same connection on my local laptop it throws an error.
Can you suggest us what could be the issue.
Snow Flake connection error in IDT
I would recommend testing your Snowflake connectivity using one of the SQL developer tools like RAPID and/or Toad and seeing if you are able to make a successful connection from your laptop before even using the BOE client tools.
Looks like snowflake has its own client tool SnowSQL that you can probably install on your laptop and test the connectivity to snowflake database.
Thanks for your reply.
We are able to connect Snowflake from laptop through other means except through IDT.
When you created your relational connection in IDT, did you use the generic ODBC drivers or the Snowflake ODBC drivers? If it's the latter, what version are you using? Did you try using JDBC to see if that works?
Can you share the syntax that you specified in the 'Server' section of the ODBC DSN config?
You might want to try something like this without the https if you haven't yet
I believe there is no official ODBC support for Snowflake on SP6, so you have to either upgrade to SP8 or using generic JDBC on SP6. You can find how to set it up here: https://blogs.sap.com/?p=745812
Hi Patrick - we were very excited to see BOBJ officially support Snowflake DSNs. We are moving a great deal of our BI content into Snowflake and have been very happy with the performance so far.
I am writing to ask you about a peculiar pain point that we have, that we have not been able to resolve.
When our Universe designers are at work in IDT, working with Snowflake IDT connections - let's say, for example, they want to add a new Snowflake data object (table or view) to the Universe.
When they open the Snowflake connection they are presented with a list of available schemas and objects that is often duplicated several times.
on the left side is the expected menu of available content presented to the snowflake database user in SnowSQL -
On the right is the redundant set of schemas presented by the same Snowflake Universe Connection Database user in IDT - notice the same list is duplicated several times.
The effect is not harmless. If the Universe developer tries to open one of these schemas and select an object, IDT clocks indefinitely without ever producing the list of available tables/views.
I am surmising that the problem might be that the IDT Connection database account/user has been given access to the same content from multiple assigned roles, but the fact that the Snowflake GUI produces an accurate menu, while IDT produces this redundant, and unusable list makes me wonder if the resolution is actually going to be in IDT - not Snowflake.
Have you heard of this problem? it is really problematic for us.
Any advice you might provide would be greatly appreciated.
Just thought I'd follow up here, this problem was fixed by SAP in Bobj 4.3 Support Pack 1 Patch 9. I don't believe that there were any plans to create a 4.2 fix - apparently the code bases involved are very different.
I am not seeing any information referencing setting up connections for snowflake in UDT. Is this not an option?
Snowflake is not supported for UNV universes. New data sources will not be certified for UNV Universe Design Tool.
https://support.sap.com/content/dam/launchpad/en_us/pam/pam-essentials/SBOP_BI_43.pdf (SLIDE 26)
Here is a new way of authenticating to Snowflake with SAP BusinessObjects: https://blogs.sap.com/2022/04/21/what-is-all-about-authorization-server-oauth-sso-introduced-in-sap-businessobjects-bi-4.3-sp02/