Scheduled data publishing from SAP Lumira desktop to SAP Lumira Cloud
Ever wanted to schedule publishing of your data from SAP Lumira desktop to SAP Lumira Cloud? In the past you could schedule a refresh of your entire .lums file using SAP Lumira Agent, but a .lums file can only be opened with the desktop client. With the latest SAP Lumira 1.17 release you can also schedule publishing of your datasets from SAP Lumira desktop to SAP Lumira Cloud. This refreshes the associated Lumira stories as well. Note: infographics are static and will not refresh automatically.
SAP Lumira Agent is a background process that needs to be activated. You activate it by selecting File -> Preferences and then selecting Enable SAP Lumira Agent on the General tab. After you restart SAP Lumira, SAP Lumira Agent runs in the Windows system tray. Once activated, it remains available even after you quit SAP Lumira, so you don't have to restart the agent when you restart SAP Lumira.
The story has to be published from SAP Lumira desktop in the first place. To create a schedule, open SAP Lumira and find your .lums file in the document listing. If SAP Lumira Agent is enabled, you will see a Schedule column. Click the schedule for your .lums file; this opens a dialog that allows you to create a schedule.
Press the Create Schedule button to set up a daily, weekly, or monthly schedule. When you create the schedule, make sure to check the Publish datasets to SAP Lumira Cloud checkbox; otherwise the data will not be updated. SAP Lumira Agent also has to be running at the time of the refresh.
At the scheduled time, SAP Lumira refreshes the data from all the data sources used in the .lums file. If data transformations were applied, it replays them before publishing the data to SAP Lumira Cloud.
In the SAP Lumira Cloud launchpad you will see that the Last Modified date has been updated. The story itself is not modified, but when you open it, it retrieves the refreshed data.
In the screenshot below you can see that the story now also contains the additional data from 2014.
Note that the 200 MB dataset size limit for publishing from SAP Lumira desktop to SAP Lumira Cloud still applies.
For paying SAP Lumira Cloud customers we also offer integration with SAP HANA Cloud Integration, which facilitates the integration of business processes and data across on-premise and cloud applications such as SAP Lumira Cloud. Its process integration capabilities allow you to integrate business processes spanning different companies, organizations, or departments within an organization.
Its data integration capabilities allow you to efficiently and securely run ETL (extract, transform, load) tasks to move data between on-premise systems and the cloud.
saplumira.com also has video recordings covering scheduled data publishing from SAP Lumira desktop to SAP Lumira Cloud, as well as SAP HANA Cloud Integration (and other areas of Lumira).
I have been using the Agent to auto-refresh my datasets to the cloud, but it doesn't refresh every time because of an error I don't understand. Has anyone come across this error? I have to check every day whether the refresh was successful and, if an error occurred, refresh manually.
Dear Users and Colleagues,
First, I apologize if this is not the right thread for such an inquiry, but I have a question that I have not been able to find an answer to in SAP Notes or on SAP Jam.
During an onsite service at a customer, the customer was using SAP HANA with Lumira on top of it.
It turned out that HANA (the physical server and the indexserver) was using 100% of the CPU and almost 100% of the available memory in a wave pattern (high and low peaks).
After some investigation, we found that three Lumira background jobs were opening numerous connections (threads) to the system, which was causing the issue.
Once the three jobs were disabled, CPU and memory usage became very stable.
Do any of you know what the jobs below do? Are they critical? Can I leave them disabled? I was not able to find any relevant information...
If this is not the right place for this question, kindly let me know and I'll delete the post.
Job names:
VDSchedulerJob_1.xsjob
VDSchedulerJob_2.xsjob
VDSchedulerJob_3.xsjob
Sample:
When the jobs were disabled: 🙂
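For context, .xsjob files define SAP HANA XS scheduled jobs. I don't have the actual definitions behind these three Lumira jobs, so the package path, script, function, and schedule below are purely hypothetical, but a typical .xsjob definition looks roughly like this (the xscron fields are year, month, day, day of week, hour, minute, second):

{
    "description": "Hypothetical example of an XS scheduled job definition",
    "action": "some.package.jobs:housekeeping.xsjs::runCleanup",
    "schedules": [
        {
            "description": "Hypothetical schedule: run once a day at 01:00",
            "xscron": "* * * * 1 0 0",
            "parameter": {
                "retentionDays": "30"
            }
        }
    ]
}

As far as I know, individual XS jobs can be enabled or disabled from the job dashboard in the SAP HANA XS Administration Tool, and XS job scheduling as a whole is controlled by the scheduler/enabled parameter in xsengine.ini.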
Thanks in advance,
Daniel Szortyka