
SAP BW- Near Line Storage (NLS) Issues & Solutions


Hi Folks,

I am Amulya Bhavani, an SAP BW/BPC consultant at MOURI Tech.

This blog post captures my learnings from an NLS project in which we were responsible for delivering an NLS implementation on a BW on HANA system. There is plenty of research on NLS available on the internet; however, while doing NLS archiving (moving data from the HANA DB to the SAP IQ DB), I could not find all the issues and solutions in one place, so I want to share a few issues, and their solutions, that we faced during the NLS implementation in one of our projects.

System Version Specifications:

SAP BW 7.4 SP15, SAP IQ 16.1

 

Archiving Issues:

The following are a few issues we encountered during the implementation:

  1. Authorization Issues
  2. Connection Issues
  3. Issues while Archiving the Data

1. Authorization Issues:

1.1. To create a DAP (Data Archiving Process) on an InfoProvider, the user needs the corresponding authorization; without it, the DAP cannot be created.

1.2. To archive the data, the user also needs authorization on the SAP IQ side; otherwise the job fails with an authorization error.

In such a case, grant the read_client_file permission to the user in SAP IQ, then log on again and retry.

2. Connection Issues:

2.1. If the connection to the SAP IQ database is lost but we still try to perform archiving, the job fails. Check the connection first: if it is working, proceed with archiving; if not, it needs to be fixed.

You can check the connection in the following ways:

i.   Go to the InfoProvider -> Right click -> Data Archiving Process -> Display -> General Settings -> Near-Line Connection

ii.  Go to the InfoProvider -> Manage -> Archiving tab -> Near-Line Connection

iii. Go to SE38 -> ADBC_TEST_CONNECTION -> Execute -> Enter the DB connection name -> Execute

If the connection has failed, the test returns an error message.
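Besides the GUI paths above, the connection test (option iii) can also be scripted with ADBC in a small report. This is only a sketch: the connection name 'NLS_IQ' is a placeholder; use the actual DB connection name maintained in transaction DBCO (the same name the DAP refers to).

```abap
* Sketch only: 'NLS_IQ' is a placeholder connection name - use the
* DB connection maintained in DBCO for your near-line storage.
DATA lo_con TYPE REF TO cl_sql_connection.

TRY.
    " Open the secondary (near-line) database connection
    lo_con = cl_sql_connection=>get_connection( 'NLS_IQ' ).
    lo_con->close( ).
    WRITE: / 'Connection to SAP IQ is working.'.
  CATCH cx_sql_exception INTO DATA(lx_sql).
    " Same failure the ADBC_TEST_CONNECTION report would surface
    WRITE: / 'Connection failed:', lx_sql->get_text( ).
ENDTRY.
```

This is essentially what the standard report ADBC_TEST_CONNECTION does, so a scripted check like this can be embedded in a pre-archiving validation step.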

3. Archiving Issues:

3.1. Problem:  Not all the requests have been rolled up to subsequent data marts.

Solution: The data first needs to be loaded to the upper-level targets (e.g. the data marts in the layer above). Then set the failed archiving request to invalid and continue archiving once the loads to the upstream data marts have completed.

3.2. Problem: When we try to archive non-compressed data without selecting the “Allow non-compression” option in the DAP, the job throws an error, which can be checked in the job log.

Solution: Go to the InfoProvider -> Data Archiving Process -> Selection Profile tab -> select the checkbox to allow non-compressed data.

3.3. Problem: Insufficient memory in the SAP IQ DB. When memory runs low, the archiving job takes two to three times longer than a normal archiving job, or it may fail. In such cases, low memory should be considered as a possible cause.

Solution: Increase the memory allocated to the IQ DB.

3.4. Problem: If other loads run on the same InfoProvider while an archiving job is in progress, the archiving job fails.

Solution: Run only a single archiving request on an InfoProvider at a time, and make sure no other loads are triggered simultaneously (e.g. a scenario where regular data loads are running while archiving is triggered on the same object).

3.5. Problem: When an archiving job fails and we try to start a new archiving request on the same InfoProvider before setting the failed request to invalid (i.e. before unlocking the old request), we get an error.

Solution: Go to the InfoProvider -> Manage -> Archiving -> select the failed request -> choose the "Set to Invalid" radio button -> Run in background.

 

3.6. Problem: A transport fails when a DAP deletion request and a new DAP creation request (e.g. when changing the primary DAP criteria) are collected in the same transport. As a result, the RSDANLSEG table entry is not deleted in the target system, causing the transport to fail.

Solution: Move the transport with the deletion request first, and then move the transport with the new DAP creation request.

3.7. Problem: Data is lost if the archiving request is not reloaded before a DAP change.

Solution: Before transporting DAP changes that affect the primary characteristics of the DAP, reload the data; otherwise it will be lost. Reloading is not required when merely adding additional DAP candidates.

3.8. Problem: When the archived InfoProvider's entries are missing from the RSDANLSEG table and we try to reload the data, the reload fails.

Solution: In such a case, transport the DAP objects again and verify that the RSDANLSEG table contains the archived InfoProvider.

3.9. Problem: The query can read the active (non-archived) data but not the archived data.

Solution: First, check whether the required changes have been made at the query level and, if required, at the MultiProvider level. Next, verify the near-line connection; if it is fine, check the RSDANLSEG table for the archived providers, and finally check at the MultiProvider level. If the issue persists (archived data still cannot be read), run the program RSR_MULTIPROV_CHECK (delete or rebuild the metadata runtime) to resolve it.
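The RSDANLSEG lookup mentioned above can also be done with a quick ABAP snippet instead of SE16. This is only a sketch: 'ZSALES_C01' is a placeholder InfoProvider, and the key field name INFOPROV is an assumption about the table structure, so verify the actual field name in SE11 before using it.

```abap
* Sketch only: 'ZSALES_C01' is a placeholder InfoProvider, and the
* field name INFOPROV is an assumption - check RSDANLSEG in SE11 first.
SELECT COUNT(*)
  FROM rsdanlseg
  INTO @DATA(lv_count)
  WHERE infoprov = 'ZSALES_C01'.

IF lv_count = 0.
  " No NLS segment entry: the DAP objects likely need retransporting
  WRITE: / 'No NLS segment entry found - retransport the DAP objects.'.
ELSE.
  WRITE: / 'NLS segment entries found:', lv_count.
ENDIF.
```

A check like this can be run in the target system right after the DAP transport, before attempting a reload (see issue 3.8).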

3.10. Problem: In a write-optimized DSO, when archiving with the selection condition "Request Creation Date" and the creation date does not exist in the request, the job fails with an error.

Solution: Provide an existing request for archiving.

4. Stragglers Issue:

Problem: Stragglers issue (when back-postings are made to data that falls within an already archived period, we face a lock issue).

Solution: As the data is locked in the archiving request, go to the provider -> Manage -> Archiving tab -> look up the request number from the error, and reload the required request from NLS to fix the delta request.

 

Conclusion:

Despite the few issues encountered while performing archiving with NLS, using NLS is recommended: it maintains good system performance and reduces the cost of maintaining large amounts of data.

 

Thank you.

Amulya Bhavani,

https://www.mouritech.com/
