
Executive Summary:

A regular requirement in the European Union (EU) is to move inventory to warehouses across borders. In many cases, this movement is purely logistical, for eventual sale or for storage, and adds no manufacturing value to the product. These warehouses may have their own independent VAT registration numbers. Given the INTRASTAT regime in the EU, such movements must not only be recorded but also reported separately for the warehouse in the INTRASTAT and VAT reports submitted to the legal authorities.

This whitepaper discusses the proven SAP standard Plants Abroad solution for such cross-border inventory movements in the EU and the associated VAT accounting and reporting requirements.

Note: A sound understanding of EU INTRASTAT requirements, taxation, and SAP MM-FI integration is a prerequisite for this reading.

 

Business requirement:

The following scenario explains the key requirements.

Scenario:

  • Company code in Germany and warehouse in Sweden, with cross-border inventory movement between them.
  • The Sweden warehouse is not a company code and adds no manufacturing value to the product, but it has separate VAT and INTRASTAT registration numbers and must do its own VAT and INTRASTAT reporting.
  • Goods are transferred within the EU, i.e. from the Germany company code to the Sweden warehouse.
  • As per EU VAT reporting requirements, VAT postings should happen for the goods movement, showing credit and debit entries for VAT in the same document so that the overall VAT impact is nullified for the scenario under consideration.
  • The critical-to-quality requirements (CTQs) for this solution are listed below:
CTQ  Detail
1    Ability to assign VAT registrations from different EU countries to one company code
2    The correct value-added tax (VAT) registration number prints on sales and purchasing documents
3    The right VAT is calculated
4    Stock transfers are handled
5    VAT and INTRASTAT reporting is conducted correctly

 

SAP Standard Solution:

SAP provides the Plants Abroad functionality for the EU to address these requirements.

 

Reference SAP Notes:

SAP Note          Description
OSS Note 63103    Explains the tax procedure logic when using Plants Abroad
OSS Note 1085758  Customizing for stock transports
OSS Note 850566   Activating Plants Abroad activates it for all company codes within one client; this note describes how to deactivate it for a particular company code

Impact Considerations: The following impacts need to be considered when deploying the SAP Plants Abroad solution for an organization that is already running SAP.

 

Impact level: Company code

Cross-client FI impact. Activating this function enables some new fields, which you maintain in the following activities:

1. In the Define Countries activity in the Global Settings IMG, you have to enter:
  • Country currency for the tax return
  • Exchange rate type (usually “M”)
  • Indicators for the tax base and cash discount base

 

For all company codes that do not require a tax code for Plants Abroad, you can set the parameter “Plant abroad not required” to ‘X’ (transaction OBY6, under “Additional entries”).

 

2. In the Define Tax Codes for Sales and Purchases activity, enter the reporting country for the tax codes by choosing Properties (for example, a tax code used for the Swedish registration of the German company code would have reporting country SE).

3. You make the remaining settings in the Define VAT Registration Numbers for Plants Abroad activity.

Impact level: Reporting

Activating Plants Abroad also has consequences for the VAT return (program RFUMSV00). Here you need to enter/activate the following additional parameters:

  • Reporting country / tax return country
  • Country currency instead of local currency

 

Business Benefits:

  • Compliance with EU VAT and INTRASTAT reporting
  • Clear end-to-end audit trail for each inventory movement and the related accounting
  • Standard SAP solution

Detailed Mechanism:

Configuration setup:

 

Step 1 – Activate Plants Abroad (SPRO): Financial Accounting –> Fin Acc Global Settings –> Tax on Sales/Purchases –> Basic Settings –> Plants Abroad –> Activate Plants Abroad. Tick the checkbox “Plants abroad activated”. Activation is needed for the company code.

Step 2 – Enter VAT Registration Number for Plants Abroad (SPRO): Financial Accounting –> Fin Acc Global Settings –> Tax on Sales/Purchases –> Basic Settings –> Plants Abroad –> Enter VAT Registration Number for Plants Abroad. Enter, per company code, a country code that differs from the country where the company is established, together with the VAT registration number. This is the country in which the plant abroad (Sweden in this case) is located.

Step 3 – Maintain tax codes (FTXP): Complete the “reporting country” field in the properties of the new tax code, so that the tax code can be used for the new VAT registration number / new reporting country.

Step 4 – Maintain and assign pricing procedure (SPRO): Sales and Distribution –> Basic Functions –> Taxes –> Plants Abroad –> Maintain and Assign Pricing Procedure. Pricing procedure RVWIA1.

Step 5 – Maintain billing type and billing type proposal (VOFA): billing type WIA.

Step 6 – Maintain copying control (VTFL): standard setup.

Step 7 – Maintain billing relevance for item categories (SPRO): standard setup.

Step 8 – Assign G/L account to account key (VKOA): account key UML needs to be maintained.

Step 9 – Maintain declaration numbers (VI62).

Output Preview:

 

Sample SAP INTRASTAT output related to Plant Abroad scenario

VAT accounting entry for Plant abroad transactions:
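For illustration only (the tax accounts and amounts below are assumed, not taken from an actual system), the Plants Abroad billing document for the stock transfer posts offsetting tax lines in one document, for example:

Debit    Acquisition/input VAT, Sweden registration           19.00
Credit   Output VAT on intra-EU supply, Germany registration  19.00

The two lines net to zero, so the movement creates no net VAT payable while still appearing in the VAT reports of both registrations.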

Conclusion:

SAP’s Plants Abroad solution meets the VAT and INTRASTAT statutory reporting requirements for cross-border stock transfers and toll-manufacturing scenarios.

References

http://help.sap.com

 

Executive Summary

INTRASTAT is the system for collecting information and producing statistics on the dispatches and arrivals of goods between countries of the European Union (EU). It began operation on 1 January 1993. The statistics are essential for the development of European policies on the internal market and for market analysis. INTRASTAT thresholds for arrivals and dispatches are published for each member country, and pre-defined formats are available for each EU country for capturing and declaring INTRASTAT data.

SAP provides a standard solution for monthly INTRASTAT reporting of goods receipts (arrivals) as well as dispatches for EU member countries. This whitepaper details the SAP solution for INTRASTAT reporting, with Germany as an example.

 

Business requirement:

EU authorities require INTRASTAT to be reported in pre-defined formats. The INTRASTAT reporting format used for Germany is shown below.

Each box in the format has to be filled with the relevant information based on the goods arrived or dispatched. The information contained in each box is detailed below:

EU INTRASTAT form                          Details
Box 1 – First subdivision                  VAT number as stated in the advance turnover tax return / party responsible for providing information (10-11 digits)
Box 1 – Second subdivision                 Additional number (3 digits)
Box 1 – Third subdivision                  Seat (Land) of the local tax office (2 digits)
Box 2 – First subdivision                  Month (2 digits)
Box 2 – Second subdivision                 Year (last 2 digits of the year)
Box 3 – (no designation)                   Leave this box empty.
Box 4 – Declaring third party              Complete this box only if the party responsible for providing information has instructed a third party (e.g. a forwarding agent) to prepare the statistical declaration.
Box 5 – Address of the Federal Statistical Office
Box 6 – Description of goods               The usual trade name of the goods, which must be precise enough to allow a clear identification based on the Commodity Classification for Foreign Trade Statistics.
Box 7 – Item number                        The serial number. If more than one form is required for the declaration, the numbering must be continued without break on the subsequent forms.
Box 8a – Member State of destination       The Member State where the goods are moved to be used or consumed, treated or processed. If the Member State of destination is not known, the EU Member State known as the last country to which the goods are to be dispatched is considered the Member State of destination. The codes to be used are the two-digit letter codes.
Box 8b – Region of origin (Land)           The Land within the Federal Republic of Germany where the goods were produced, mounted, assembled or treated. Enter the Land code as specified in Annex 1 for goods of German origin. If the actual region of origin is not known, enter the code of the Land from which the goods were dispatched. For goods of foreign origin, enter code number 99.
Box 10 – Nature of transaction             Indicates specific clauses of the business contract.
Box 11 – (no designation)                  Leave this box empty.
Box 12 – (no designation)                  Leave this box empty.
Box 13 – Commodity code                    Eight-digit code of the version of the Commodity Classification for Foreign Trade Statistics in force at the time.
Box 14 – (no designation)                  Leave this box empty.
Box 15 – (no designation)                  Leave this box empty.
Box 16 – Net mass in full kilograms        Enter the net mass, in full kilograms (kg), of the commodity described in box 6 of the item concerned. Net mass is the mass of the goods net of all packaging. Round the figures to full kilograms; if a figure is rounded down to 0 kg, enter “0” in box 16.
Box 17 – Quantity in supplementary units   Numerical value of the supplementary unit indicated in the Commodity Classification for Foreign Trade Statistics. Leave the box empty if a supplementary unit is not specified.
Box 18 – Invoiced amount in full euros     The invoiced amount is the value billed for the declared commodity, i.e. the VAT assessment basis.
Box 19 – Statistical value in full euros   The statistical value declared upon arrival of the unprocessed goods plus all costs incurred in the statistical territory for processing and transporting the goods up to the German border, including the cost of packaging.
Box 20 – Place/date/signature              The declaration must be signed by hand by the party responsible for providing information or the declaring third party, who must also state his surname and first name. If the signatory is a legal person, he has to state his surname, first name and position in the firm in addition to placing his signature. If the declaration is signed by an agent, a supplementary note must clearly indicate the agency relation.

SAP Solution:

SAP enables INTRASTAT reporting by capturing the required information at transactional level based on the following setups:

  1. Finance setup: The INTRASTAT ID number and the INTRASTAT additional number (Box 1) are set up while configuring the company code in the additional data section (T-code: OBY6):

SPRO->Financial Accounting–>Financial accounting global settings–>Company code–>Enter global parameters–>choose the desired company code–>choose additional data

 

  2. Materials Management setup:
Setup Item Details
Customer/Vendor tax identification number (TIN) The customer/Vendor tax ID number is read from the Customer/Vendor master record.
Commodity code The commodity code is specified in the material master or in the purchase order item.
Invoice value The invoice value is the value of the quantity to be reported.
Delivery requirements The delivery requirements are read from the purchase order header. They are proposed from the vendor master record.
Business transaction type The business transaction type can be entered in the purchase order item on the import/export data screen.
Tare weight in kilograms The tare weight in kilograms is specified in the material master or in the purchase order item.
Supplementary unit The supplementary unit is a unit defined by the statistical offices for the commodity codes. This unit can be maintained with the commodity code in table T604.
Country of origin The country of origin is the country from which the vendor (or the vendor of the delivery) comes.
Mode of transport (when goods cross border) You can enter the mode of transport on the export/import screen of the purchase order header, or it can be proposed there from the vendor master record.
Country of origin of goods The country of origin of the goods can be entered on the export/import screen of the purchase order item, or it can be proposed there from the info record.
Document number Purchase order Number
Maintain purchase order (Materials Management) data Maintain the goods receipt data in the header (choose Header –> Import) and in the items (choose Item –> More functions –> Import).

  3. Sales and Distribution setup:
Setup Item Details
 1. Maintain your company’s ID number. You can maintain this number in Customizing for Foreign Trade/Customs. Choose Sales and Distribution –> Foreign Trade –> Periodic Declarations –> Maintain Official Numbers for Declarations to Authorities. Select your company code and choose Goto –> Detail. Select Additional details. Then enter your company’s number in the appropriate field(s).
2. Maintain the appropriate region code. In Customizing for Foreign Trade/Customs, choose Periodic Declarations –> Maintain Region. Then in the State of manufac. field, enter the region or state of manufacture.
3. Maintain the commodity code and special unit of measure. In Customizing for Foreign Trade/Customs, choose Basic Data for Foreign Trade –> Define Commodity Codes/Import Code Numbers by Country. Select a line and choose Goto –> Details. In the Spec. unit of measure field, enter the unit.
4. Maintain the default business transaction type table. In Customizing for Foreign Trade/Customs, choose Basic Data for Foreign Trade –> Define Business Transaction Types and Default Value –> Define Default Business Transaction Type.
5. Maintain the import/export procedure. In Customizing for Foreign Trade/Customs, choose Basic Data for Foreign Trade –> Define Procedures and Default Value –> Define procedure default.
6. Maintain the completeness check. In Customizing for Foreign Trade/Customs, choose Periodic Declarations –> Log of Incomplete Items – Aggregation Criteria – Individual Maintenance. Make sure only the relevant declarations are selected for each field name and country.
7. Maintain the currency conversion. If the document currency differs from the local currency, the system converts the value into the local currency if the relevant tables contain the correct settings. In Customizing, choose General Settings –> Currencies –> Enter exchange rates. Be sure to maintain the currencies for conversion in both directions (for example, USD to DEM and DEM to USD). You can also check the entries under Set decimal places for currencies and Check exchange rate types. Some countries’ customs authorities use their own exchange rate for official declarations. You can take this rate into account by using the parameter “exchange rate type I” when selecting data for INTRASTAT declarations. You must also maintain the exchange rate tables accordingly.
8. Maintain the route definition (relevant only for dispatches). You can maintain the mode of transport in the route’s Details screen. The system automatically copies this mode of transport into the foreign trade header data if a route is assigned to the delivery or billing document. To display the detail screen for route definition, choose the following in Customizing for Sales and Distribution: Basic Functions –> Routes –> Define Routes –> Define Routes and Stages. Select the route and choose Goto –> Details.
9. Maintain the transportation connection point definition (relevant only for dispatches). You can select a transportation connection point as the border crossing point as part of the leg, which, in turn, is part of the route. The system automatically transfers the office of exit entered for the transportation point to the foreign trade header data only if the point is marked as the border crossing point and the address for the point is maintained (especially the Country field). In Customizing for Sales and Distribution, choose Basic Functions –> Routes –> Define Routes –> Define Transportation Connection Points.
10.Master Data-SD On the General Data screen of the customer master record, maintain the customer’s EC VAT registration number.
11. Data in documents. Foreign trade data must be maintained for all documents relevant for declarations. If, for example, you want to select credit memos for a declaration, you must also maintain foreign trade data in the credit memos. The system selects only those documents containing foreign trade header or item data.

Maintain pricing data. Make sure that the invoice value or purchase value and the statistical value are maintained. So that the system can calculate the statistical value, do the following:

  • Maintain condition type GRWR in the pricing procedure
  • Create the corresponding condition records for condition type GRWR

Maintain delivery and billing (Sales and Distribution) data. Maintain the dispatch data (under Foreign Trade data) in the header and in the items.

Solution validation:

  1. Purchase side solution validation:

a) Create a Purchase Order:

 

  • An additional step is to furnish the foreign trade information in the foreign trade section of the purchase order.

b) Execute MEIS: to select INTRASTAT-relevant documents for arrivals, execute transaction code MEIS.

The following log is available, showing the document(s) selected for reporting

c) Paper version of the INTRASTAT report (VE02): Data for printing on the pre-printed stationery (EU authorities provide pre-printed stationery for INTRASTAT reporting, as shown in the business requirements section of this whitepaper) is made available as a formatted soft copy by executing transaction VE02, as shown below:

 

The following log is available:

A formatted soft copy version for INTRASTAT reporting is available as shown below:

 

2. Sales side processing: a) VE01: A sales order is created normally and billing is done. The billing document created is then processed in transaction VE01, as follows:

The dispatch-side INTRASTAT selection log is available as follows:

b) VE02 (paper version of receipts/dispatches, Germany): The selection screen for dispatch-side INTRASTAT is as follows:

The following soft copy version is created for printing on the pre-printed stationery:

 

The output in the formatted soft copies for receipts/dispatches can be compared with the pre-published INTRASTAT format box by box, as follows (box details as described in the business requirements section above):

EU INTRASTAT form box                                  Source
Box 1 – First subdivision (VAT number)                 OBY6
Box 1 – Second subdivision (additional number)         OBY6
Box 1 – Third subdivision (seat of the local tax office)
Box 2 – First subdivision (month)                      Posted document
Box 2 – Second subdivision (year)                      Posted document
Box 3 – (no designation)                               NA
Box 4 – Declaring third party                          NA
Box 5 – Address of the Federal Statistical Office      Pre-printed form
Box 6 – Description of goods                           Based on the commodity/import code number in the material master
Box 7 – Item number                                    Assigned automatically by SAP
Box 8a – Member State of destination                   Derived from the ship-to party address in the sales order for dispatches / from the vendor address for receipts
Box 8b – Region of origin (Land)                       Derived from the sales order / purchase order line item details, Foreign Trade/Customs tab
Box 10 – Nature of transaction                         Derived from the sales order line item details, Foreign Trade/Customs tab
Box 11 – (no designation)                              NA
Box 12 – (no designation)                              NA
Box 13 – Commodity code                                Material master, commodity/import code number
Box 14 – (no designation)                              NA
Box 15 – (no designation)                              NA
Box 16 – Net mass in full kilograms                    Sales document / purchase order
Box 17 – Quantity in supplementary units               Sales document / purchase order
Box 18 – Invoiced amount in full euros                 Sales document / purchase order
Box 19 – Statistical value in full euros               Sales document / purchase order
Box 20 – Place/date/signature                          Available on the pre-printed form

 

Conclusion:

Each country in Europe has its own pre-printed stationery for INTRASTAT reporting. The SAP solution detailed above can be extended to the relevant countries, with some minor country-specific adjustments that may be required to make it compliant for INTRASTAT reporting. Master data must be maintained accurately to achieve correct reporting.

  

References

http://help.sap.com

Single Sign-On (SSO) Configuration for the HANA Database Using Kerberos

This blog explains the steps for setting up single sign-on (SSO) for the SAP HANA database with Kerberos.

SSO allows a user to log on only once and provides access to multiple systems and services without the user being asked for credentials again.

Kerberos is one of many ways for realizing SSO (other examples are SAML or X.509 certificates).

Configuration File Description
<sidadm home>/etc/krb5_hdb.conf Configuration of the Kerberos realm to be used with the SAP HANA server installed under <sidadm>
<sidadm home>/etc/krb5_hdb.keytab List of service keys required to authenticate the services on the Kerberos server
<sidadm home>/etc/krb5_host.keytab One entry only to authenticate the host on the Kerberos server for the purpose of delegation

Step-by-Step Procedure

We have to create a service user representing the SAP HANA database in Active Directory, mapped by a Service Principal Name (SPN); then we have to create a keytab file for this SPN on the DB server. On the DB server, we also need a krb5.conf file. Finally, we have to create an externally-mapped SAP HANA database user.

Prerequisite:

Make sure that the Kerberos client and server libraries are already installed on the HANA database server. To verify that the software requirements are met, run the command below:

rpm -qa | grep krb5*

The version numbers don’t have to match exactly, but should be above 1.6.3-132 to include important security patches.

Configuration steps:

1-> Hostname Resolution

On Linux there are several tools for hostname resolution. Some of them use DNS directly (dig, host, nslookup), while others (such as hostname) rely on the system’s resolver configuration.

To verify, run the commands below on the DB server:

> hostname --fqdn    (gives the FQDN of the DB server)

> hostname --ip-address    (gives the respective IP address)

2-> Set up a configuration file for the MIT Kerberos libraries (krb5.conf) on the database server, as in the example below.

Standard Kerberos configuration:

#> cat /etc/krb5.conf
[libdefaults]
default_realm = MYDOMAIN.COM
[realms]
MYDOMAIN.COM = {
kdc = mykdc1.mydomain.com
kdc = mykdc2.mydomain.com
}
[domain_realm]
.mydomain.com = MYDOMAIN.COM
mydomain.com = MYDOMAIN.COM

The [domain_realm] covers only the mapping for the DB server domain (it has nothing to do with the client domain(s)!). This will be used in mutual authentication when the SAP HANA database client tries to authenticate the SAP HANA database server.

The domain part in the [domain_realm] mapping must consist of the domain name in its full length. If the FQDN of the DB server is hdbserver.subdomain.domain.com, then the [domain_realm] entries have to be:

.subdomain.domain.com = DOMAIN.COM
subdomain.domain.com = DOMAIN.COM

 

3-> Create an Active Directory user for the Kerberos authentication, or use an existing one.

The user should be created with the “password never expires” option.

(This may change based on your organization’s structure.)

In my case, I raised a request with the Windows service team to register the SPNs:

  • SPN for HANA Studio:

setspn -S hdb/<server_name_fqdn> <user>

  • SPN for HANA XS (web access):

setspn -S HTTP/<server_name_fqdn> <user>

 

4-> Verify the Kerberos configuration using the <SID>adm user:

When using the kinit and klist utilities, we rely on the proper configuration of the Windows AD test user aduser1. Using kinit, we try to authenticate the test user against the AD domain and create a TGT:

/usr/bin/kinit <SPN_user>@MYDOMAIN.COM

If it succeeds, it will ask for the password of the SPN_user.

  • The next command should be used immediately after the kinit command (it will show the ticket from the previous login).

Using klist we can see this ticket:

>klist

Ticket cache: FILE:/tmp/krb5cc_1000

Default principal: SPN_user@MYDOMAIN.COM

Valid starting Expires Service principal
02/18/13 15:25:58 02/19/13 01:26:02 krbtgt/MYDOMAIN.COM@MYDOMAIN.COM
renew until 02/19/13 15:25:58

 

5-> Create the keytab file. We will use ktutil on the Linux server (the HANA server).

ktutil can be used to edit the keytab file.

First we need to discover the key version number (kvno) to use. To find the kvno, run the following commands:

  • run kinit command to get the TGT for the user (SPN_user):

For example: /usr/bin/kinit <SPN_user>@MYDOMAIN.COM

Next run kvno command to get the kvno number:

kvno hdb/<server_name_fqdn>@MYDOMAIN.COM

In the output we can see the kvno= value.

If the kvno value is 3, for example, then use 3 as the -k value when creating the keytab file below.

 

6-> The commands for creating the keytab file (this requires root because of the permissions on the keytab directory /etc):

  • Keytab for HANA STUDIO:

In shell run:

/usr/bin/ktutil

In ktutil run:

addent -password -p hdb/<server_name_fqdn>@MYDOMAIN.COM -k 3 -e rc4-hmac

wkt /etc/krb5.keytab

q

  • Keytab for HANA XS (web access):

In shell run:

/usr/bin/ktutil

addent -password -p HTTP/<server_name_fqdn>@MYDOMAIN.COM -k 3 -e rc4-hmac

wkt /etc/krb5.keytab

q

 

7-> Secure the keytab file. This step is mandatory; without it, SSO will not work (requires root).

  • Change ownership of the file to <sid>adm:sapsys, for example:

chown <sid>adm:sapsys /etc/krb5.keytab

8-> Verify the keytab file using the <SID>adm user:

  • Run klist command:

klist -k /etc/krb5.keytab -etK

The output should return both SPNs

  • Run the kinit command to get the TGT for the user from step 3 (SPN_user), for example:

/usr/bin/kinit <SPN_user>@MYDOMAIN.COM

  • Next run the kvno command:

/usr/lib/mit/bin/kvno -k /etc/krb5.keytab hdb/<hostname_fqdn>@MYDOMAIN.COM

/usr/lib/mit/bin/kvno -k /etc/krb5.keytab HTTP/<hostname_fqdn>@MYDOMAIN.COM

 

9-> Create/change the user in SAP HANA.

Under authentication, check the Kerberos checkbox and, in the external ID, fill in the value <user_name_AD>@MYDOMAIN.COM.
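This can also be scripted in SQL. A minimal sketch, assuming a user named TESTUSER and the same external ID placeholder as above:

CREATE USER TESTUSER WITH IDENTITY '<user_name_AD>@MYDOMAIN.COM' FOR KERBEROS;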

10-> Create a connection in HANA Studio with SSO.

Create the system in HANA Studio. In the user and password step, choose “Authenticated by current OS user”.

 

11-> Try to log in using the connection created in the previous step.

 

For reference:

Kerberos / Active Directory

For background information on Kerberos or Active Directory troubleshooting, the following web sites may be helpful:

Whitepaper AD/Kerberos Troubleshooting:        https://wiki.wdf.sap.corp/wiki/download/attachments/1180221073/Troubleshooting_Kerberos_Errors.DOC?version=1&modificationDate=1355764274996

Switch on Windows System Event Log for Kerberos: http://support.microsoft.com/kb/262177/en-us

Things to check when Kerberos authentication fails using IIS/IE…: http://blogs.msdn.com/b/friis/archive/2009/12/31/things-to-check-when-kerberos-authentication-fails-using-iis-ie.aspx

 

SAP HANA Database

 

Regards,

Sumit

The professional world now runs on data. Businesses need massive amounts of data to understand their customers, analyze their efficiency, and remain competitive with the other businesses around them using data analytics on a regular basis. Unfortunately, unless your data acquisition methods are entirely automated or purchased, you’ll be relying on repetitive, manual data entry to store and organize these data points.

You’re almost certainly losing time to this process, so what’s the best way to mitigate that loss?

Key Factors for Time Loss

First, let’s understand why and how data entry can be a time waste. After all, data is still important and can make an organization more valuable.

Time loss usually happens due to:

  • Process issues. Sometimes, the problem is in the process. You might have an unclear system of priorities for your employees to follow, or an informally documented process that leads to confusion and redundancy upon entry.
  • System issues. Other times, the issue is with a system. Some systems require weeks to months of employee preparation—especially when you’re transitioning from one system to another. If the system is hard to learn or counterintuitive, it can make the process much murkier and vulnerable to errors.
  • Data issues. Your data entry may also suffer if the quality of your data is too low. There are many potential issues here, including inconsistent formatting and low-quality sources, and any of them can affect your employees’ efficiency in entering that data in a central system.
  • People issues. Finally, you may have a problem with the individual employees doing the work. They may be unproductive, unfocused, or ill-equipped for the job.

How to Stop It

So how can you prevent time loss in your data entry?

  • Automate what you can. First, try to automate anything and everything you can. It’s certainly possible to write a script that performs the work of a data entry employee (at least, in most fields), and by and large, it’s the most cost-efficient path. Automation is faster, cheaper, and most importantly, less prone to errors, so long as it’s programmed correctly.
  • Make a consistent process and document it. For the data entry requirements that you can’t automate, make sure you create and formally document the correct process for entering the data. This will make it easier and faster to train new employees, and all employees will have equal access to centralized guidelines for how data entry is supposed to work. This will lead to greater efficiency and consistency across the board.
  • Train and monitor your employees. Don’t just assume your employees know how to follow the guidelines; invest the time upfront to supervise them and make sure they follow those guidelines properly. Beyond that, you’ll want to invest in monitoring software, tracking their communication habits, data entry habits, and overall time expenditure. You’ll be able to tell which employees are most efficient and which ones are struggling, and you’ll be able to take preventative or reinforcing action accordingly.
  • Observe your system’s performance, and consider upgrading. Finally, pay attention to how quickly your new employees take to your system, and if there are any pain points associated with it. If you find there’s a flaw or UI hiccup in your system, you may need to consider upgrading or switching systems. Though it might be a short-term pain, it will save you time in the long run.

If you’re in the market for a new software platform, consider giving some of SAP’s powerful solutions a try. For example, our sales software offers multiple integrations and is intuitive enough to be learned in a day.

I try to blog. Some people will say most of it is not technical. And yes, they would be right. Today I’m issuing a challenge. It is specific to all those people who are out there writing ABAP. It could also be for non-ABAP languages. Get the theme? Technical blog.

My challenge: Write a blog about what you are working on!

Why? Because when I ask people why they don’t write blogs about ABAP, the answer is that it’s all been done. We don’t want to just repeat things. So if you are writing about your project you won’t be duplicating anything. AND we will get some new ABAP (technical type) blogs. Heck, you can write about an old project, we wouldn’t know.

You can share as little or as much code as you want!

This is cool, because there is something in it for you!

Over one million users stop in to the SAP Community. If you aren’t doing something “right”, then they will probably comment. Worried about that? Don’t be. I don’t do a lot of things “right”. That’s OK. I learn and get better. Note to those who comment: constructive feedback is always welcome. Rude feedback is not, an example being telling me that I’m an idiot. Yes, that’s part of the rules. Sadly, people like me rarely read all of them. (I did not either – but I’m going to go look at them today.)

Anyway, challenge time!

So I thought – I’d just pull out a project that I worked on before. I am on 4.6C. Yes, it’s still working great! That means I have limited access to classes/objects. That’s why I’ve been bookmarking so many things.

So I’m trying to decide what fun things to share from the project. I think you may want to know the description. We wanted to save our “PDF” documents to a directory, and be able to retrieve and mail them. This is a general area where we could put whatever documents we wanted. I think we have more than 10 types of documents there. Some we create at the time we are using this, some are uploaded from a directory, and some are created via output types and/or messages. It just depended on what we wanted. And we could add or remove document types at a later date.

I decided to share the mailing piece of it because it could use some help with some of the newer ABAP objects that are available.

This is a function module that accepts the e-mail addresses, the documents, and the body. The document parameter is actually a key for where we pull the information. It is also an RFM (remote-enabled function module) – it’s called from a webpage.
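As a rough sketch, the interface of such an RFC-enabled function module might look like the following (all the Z* names are assumptions for illustration, not the actual objects):

FUNCTION z_docs_email_send.
*"  IMPORTING
*"     VALUE(IMP) TYPE  ZDOC_MAIL_IMP              " doc type, mail text (assumed)
*"  EXPORTING
*"     VALUE(MESSAGE) TYPE  CHAR80                 " result message
*"  TABLES
*"      GT_EMAIL STRUCTURE  ZEMAIL_ADD             " receiver e-mail addresses (assumed)
*"      GT_DOC STRUCTURE  ZDOC_KEY                 " keys of the documents to attach (assumed)
*"      GT_EMAIL_BODY STRUCTURE  SOLISTI1 OPTIONAL " body text lines

The main flow of the function module looks like this: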

* Validate the input for each requested document
 LOOP AT gt_doc INTO gs_doc.
    IF gv_message IS INITIAL.
      PERFORM check_input.
    ELSE.
      message = gv_message.
      EXIT.
    ENDIF.
  ENDLOOP.
  IF gv_message IS INITIAL.
    PERFORM fill_receiver_table.
    gv_start = 1.
    gv_cnt = 1.
    LOOP AT gt_doc INTO gs_doc.
      PERFORM get_attachment.
    ENDLOOP.
    IF gt_contents_bin IS INITIAL.
      gv_message = 'No attachments selected.'.
      message = gv_message.
    ELSE.
      PERFORM email_create.
      PERFORM send_email.
    ENDIF.
    message = gv_message.
  ENDIF.

So this tells you next to nothing.  Here I loop through the receivers:

  LOOP AT gt_email INTO gs_email.
    CLEAR gs_receivers.
    gs_receivers-receiver = gs_email-email_add.
    gs_receivers-com_type = lc_send_internet.
    gs_receivers-rec_type = lc_internet.
    APPEND gs_receivers TO gt_receivers.
  ENDLOOP.

Easy! Now I’m pulling the “PDF” from our system:

 OPEN DATASET docs-file_name FOR INPUT IN BINARY MODE.

  IF sy-subrc <> 0.
    CONCATENATE 'Could not open file' docs-file_name
        INTO message SEPARATED BY space.
    PERFORM update_log USING imp
                             message
                             docs-file_name.
    EXIT.
  ENDIF.

  DO.
    READ DATASET docs-file_name INTO pdf LENGTH lv_size.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    pdf_size = pdf_size + lv_size.
    APPEND pdf.
  ENDDO.

Got it!  So now I get it ready!

* Move the 134 character table to a 255 format needed for the PDF attachment
    CALL FUNCTION 'SX_TABLE_LINE_WIDTH_CHANGE'
         EXPORTING
              line_width_dst              = 255
         TABLES
              content_in                  = lt_pdf[]
              content_out                 = lt_contents_bin[]
         EXCEPTIONS
              err_line_width_src_too_long = 1
              err_line_width_dst_too_long = 2
              err_conv_failed             = 3
              OTHERS                      = 4.

* Add the contents to the table that will have all the attachments
    CLEAR ls_contents_bin.
    LOOP AT lt_contents_bin INTO ls_contents_bin.
      APPEND ls_contents_bin TO gt_contents_bin.
      CLEAR gs_contents_bin.
    ENDLOOP.

    CLEAR: lv_lines_bin, gs_packing_list.
* Set up the name & Description
    WRITE gs_doc-document TO lv_document NO-ZERO.
    WRITE gs_doc-document TO lv_document15 NO-ZERO.
    WRITE gs_doc-item TO lv_item NO-ZERO.
    WRITE gs_doc-item TO lv_item6 NO-ZERO.
    WRITE gv_cnt TO lv_cnt NO-ZERO.
    CONDENSE: lv_document NO-GAPS,
              lv_document15 NO-GAPS,
              lv_item NO-GAPS,
              lv_item6 NO-GAPS,
              lv_cnt NO-GAPS.
    CONCATENATE lv_document lv_item lv_cnt
                INTO gs_packing_list-obj_name.
    CONCATENATE lv_document15 lv_item6 lv_cnt
                INTO gs_packing_list-obj_descr
                SEPARATED BY space.

* Set up the packing list describing the current attachment

    DESCRIBE TABLE lt_contents_bin LINES lv_lines_bin.
    READ TABLE lt_contents_bin INTO ls_contents_bin INDEX lv_lines_bin.
    gs_packing_list-transf_bin = 'X'.
    gs_packing_list-head_start = gv_start.
    gs_packing_list-head_num = gv_start.
    gs_packing_list-body_start = gv_start.
    gs_packing_list-doc_type = 'PDF'.
    gs_packing_list-doc_size = lv_pdf_size.
    gs_packing_list-body_num = lv_lines_bin.
    APPEND gs_packing_list TO gt_packing_list.

* Set up start position for next attachment
    gv_start = gv_start + lv_lines_bin.
    gv_cnt = gv_cnt + 1.
  ENDIF.

Here’s the fun part – creating the attachment.

 CLEAR: gs_contents_txt, gs_packing_list.
  READ TABLE gt_doc INTO gs_doc INDEX 1.
  lv_numc = gs_doc-document.
  lv_vbeln = lv_numc.
* Create Message Body Title and Description
  IF gs_imp-mail_mess IS INITIAL AND
     gt_email_body IS INITIAL.
    gs_contents_txt-line = 'See attachment.'.
    APPEND gs_contents_txt TO gt_contents_txt.
  ELSEIF NOT gs_imp-mail_mess IS INITIAL.
    ls_text-mail_mess = gs_imp-mail_mess.
    APPEND ls_text TO lt_text.
    CALL FUNCTION 'CONVERT_STREAM_TO_ITF_TEXT'
         EXPORTING
              language    = sy-langu
         TABLES
              text_stream = lt_text[]
              itf_text    = lt_mail_content[].

    LOOP AT lt_mail_content INTO ls_mail_content.
      gs_contents_txt-line = ls_mail_content-tdline.
      APPEND gs_contents_txt TO gt_contents_txt.
    ENDLOOP.
  ELSE.
    LOOP AT gt_email_body INTO gs_email_body.
      gs_contents_txt-line = gs_email_body-email.
      APPEND gs_contents_txt TO gt_contents_txt.
    ENDLOOP.
  ENDIF.

  IF gs_imp-doc_type IS INITIAL.
    READ TABLE gt_doc INTO gs_doc INDEX 1.
    IF sy-subrc = 0.
      gs_imp-doc_type = gs_doc-doc_type.
    ENDIF.
  ENDIF.

  IF gs_imp-doc_type = 'TD'.

    SELECT SINGLE vbeln INTO vbak-vbeln FROM vbak
    WHERE vbeln = lv_vbeln.
    IF sy-subrc = 0.
      CALL FUNCTION 'SD_SALES_DOCUMENT_READ'
           EXPORTING
                document_number      = lv_vbeln
                i_block              = ' '
                i_no_authority_check = 'X'
           IMPORTING
                ekuwev               = ls_ekuwev.
      IF NOT ls_ekuwev-name1 IS INITIAL.
        CONCATENATE 'Deliver to ' ls_ekuwev-name1
                     INTO lv_obj_descript
                     SEPARATED BY space.
      ENDIF.
    ELSE.
      SELECT SINGLE banfn INTO eban-banfn FROM eban
      WHERE banfn = lv_vbeln.
      IF sy-subrc = 0.
        CONCATENATE 'Deliver to' 'Furst-McNess Company'
                     INTO lv_obj_descript
                     SEPARATED BY space.

      ENDIF.

    ENDIF.

  ENDIF.
  IF NOT gs_imp-doc_type IS INITIAL AND
         lv_obj_descript IS INITIAL.
    SELECT SINGLE description INTO lv_description
           FROM zdoc_types
             WHERE doc_type = gs_imp-doc_type.
    IF sy-subrc = 0.
      CONCATENATE 'McNess' lv_description 'Document'
            INTO lv_obj_descript
            SEPARATED BY space.
    ENDIF.
  ENDIF.
  IF lv_obj_descript IS INITIAL.
    gs_document_data-obj_descr = 'McNess Documents'.
  ELSE.
    gs_document_data-obj_descr = lv_obj_descript.
  ENDIF.
  gs_document_data-expiry_dat = sy-datum + 10.

  gs_document_data-sensitivty = 'F'.
  gs_document_data-doc_size = gv_total_size.

* Body of E-mail
  CLEAR gs_packing_list.
  gs_packing_list-head_start = 1.
  gs_packing_list-head_num = 0.
  gs_packing_list-body_start = 1.
  gs_packing_list-body_num = gv_start.
  gs_packing_list-doc_type = 'RAW'.
  INSERT gs_packing_list INTO gt_packing_list INDEX 1.

 

Finally Send it!

TYPES: BEGIN OF lty_obj,
         tp TYPE soos-objtp,
         yr TYPE soos-objyr,
         no TYPE soos-objno,
         END OF lty_obj.

  DATA: lv_objid TYPE sofolenti1-object_id,
        ls_obj TYPE lty_obj,
        lv_objtp TYPE soos-objtp.

  CALL FUNCTION 'SO_NEW_DOCUMENT_ATT_SEND_API1'
       EXPORTING
            document_data              = gs_document_data
            put_in_outbox              = ' '
            commit_work                = 'X'
       IMPORTING
            new_object_id              = lv_objid
       TABLES
            packing_list               = gt_packing_list[]
            contents_txt               = gt_contents_txt[]
            contents_bin               = gt_contents_bin[]
            receivers                  = gt_receivers[]
       EXCEPTIONS
            too_many_receivers         = 1
            document_not_sent          = 2
            document_type_not_exist    = 3
            operation_no_authorization = 4
            parameter_error            = 5
            x_error                    = 6
            enqueue_error              = 7
            OTHERS                     = 8.
  IF sy-subrc <> 0.
    CASE sy-subrc.
      WHEN 1.
        MESSAGE e007  INTO gv_message.
      WHEN 2.
        MESSAGE e008 INTO gv_message.
      WHEN 3.
        MESSAGE e009 INTO gv_message.
      WHEN 4.
        MESSAGE e010 INTO gv_message.
      WHEN OTHERS.
        MESSAGE e011 INTO gv_message.
    ENDCASE.
  ELSE.
    ls_obj = lv_objid.
    COMMIT WORK AND WAIT.
*** Try to verify the record is ready to send - wait only 100 times
    DO 100 TIMES.
      SELECT SINGLE  objtp INTO lv_objtp
        FROM soos
          WHERE objtp = ls_obj-tp AND
                objyr = ls_obj-yr AND
                objno = ls_obj-no.
      IF sy-subrc = 0.
        EXIT.
      ENDIF.
    ENDDO.

    SUBMIT rsconn01 WITH mode = 'INT' AND RETURN.
*    gv_message = 'Mail sent'.
  ENDIF.

 

This is the end of just a piece of that development. We have screens where the user can enter requests for different documents. If they double-click, the PDF is displayed so they can look at it prior to sending it. AND a co-worker put together a webpage that calls some of this code. That way it can look pretty when they call it. And yes – this was 4.6C. She used JavaScript and Visual Basic. Cool. This was prior to even Web Dynpro.

GUI Result:

 

Webpage result:

 

So my challenge is to share your project.  As much or as little code as you want. I promise we will be nice!

A side challenge – use objects to send the e-mail. I didn’t have them available. So if someone could put them in the comments – that would be so nice.
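To kick that off, here is a minimal sketch of the object-based approach using the BCS classes available in later releases (the subject, the address, and the PDF table are placeholders; this is an illustration, not the original development):

DATA: lo_bcs       TYPE REF TO cl_bcs,
      lo_document  TYPE REF TO cl_document_bcs,
      lo_recipient TYPE REF TO if_recipient_bcs,
      lt_body      TYPE bcsy_text,
      ls_body      TYPE soli,
      lt_pdf_hex   TYPE solix_tab,  " PDF content, assumed already filled
      lx_bcs       TYPE REF TO cx_bcs.

TRY.
    lo_bcs = cl_bcs=>create_persistent( ).

*   Build the message body
    ls_body-line = 'See attachment.'.
    APPEND ls_body TO lt_body.
    lo_document = cl_document_bcs=>create_document(
                    i_type    = 'RAW'
                    i_text    = lt_body
                    i_subject = 'McNess Documents' ).

*   Attach the PDF (lt_pdf_hex must already hold the binary content)
    lo_document->add_attachment(
      i_attachment_type    = 'PDF'
      i_attachment_subject = 'Document'
      i_att_content_hex    = lt_pdf_hex ).

    lo_bcs->set_document( lo_document ).

*   Add an internet receiver (address is a placeholder)
    lo_recipient = cl_cam_address_bcs=>create_internet_address(
                     'receiver@example.com' ).
    lo_bcs->add_recipient( lo_recipient ).

    lo_bcs->send( ).
    COMMIT WORK.
  CATCH cx_bcs INTO lx_bcs.
*   Error handling, e.g. write lx_bcs->get_text( ) to a log
ENDTRY.

One nice design consequence: the single TRY/CATCH around cx_bcs replaces the long exception list of SO_NEW_DOCUMENT_ATT_SEND_API1 shown above.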

Also anything else you would like to know about this? Hate the idea? Feel free to comment. I love responding to them.

Big data has tremendous potential in revolutionizing business intelligence. Have you ever heard of “information overflow?” This term has gained popularity over the last decade because of the digitized business ecosystem, which has data flowing in from multiple channels, making it difficult, but essential, to evaluate them and draw productive conclusions. This insight can leverage your business significantly amid the competitive clutter of the business world. Today, as much as 65% of global brands are willing to integrate big data technologies to stay competitive.

Consider Intel, a company that integrated big data technologies back in 2012. The company has successfully incorporated these to evaluate historical data for marketing purposes, garnering massive savings in the subsequent years. From an optimized supply chain to analytics, these technologies can cut costs in various areas of your business.

Let’s have a look at the five prime areas where big data can slash operational costs for enterprises.

Customer relations

Companies worldwide have come up with various post-sales strategies to gauge their customer satisfaction levels. They give surveys, seek feedback from customers both online and offline, check out reviews, and invest tons of money in assessing how satisfied their customer is. Integrating big data tools can simplify the process, reducing costs overall.

Sophisticated tools have been developed to track the buying journeys of customers. This helps business firms correctly shape their campaigns, reducing failed marketing drives and costs.

While companies strive to enhance their fulfilment operations, it is important to keep costs low. For instance, fraudulent orders lead to company losses. Customers of e-Commerce businesses often order goods and opt for COD (cash on delivery) payments, only to cancel the order at the last minute. Sometimes, customers never receive the products they bought. Analysing the purchasing and ordering habits of customers can help businesses make predictions about the likelihood of a sale completing, and they can react accordingly, which results in significant savings.

Improving the supply chain

Big data technologies have a widespread application for optimizing supply chains. They can help companies manage their inventories and provide better delivery services to their customers. Take Amazon, for instance. Today’s largest e-Commerce company has already integrated big data techniques to optimize their supply chain. This enables them to offer an unprecedented level of service to their customers.

Big data also strengthens the optimization of warehouses and fulfilment centers. Amazon incorporates predictive analytics and big data for “anticipatory shipping” of its products. It has more than 200 fulfilment centers worldwide, of which 90 are located within the United States. Products are procured on the basis of purchase history and sophisticated demand-driven material requirements planning. This reduces shipping costs as well as time. Factors like seasons, economic conditions, and weather are taken into consideration when studying the acquired data.

Performance

Big data technologies leverage business operations by enhancing the way businesses interact with their customers. In recent years, businesses have integrated automated processes to enhance their performance. As companies collect more and more data on their customers, their habits, and the intricacies of their supply chain, the volume of gathered information reaches an unprecedented level, far beyond the capabilities of traditional Enterprise Resource Planning (ERP) systems. As a result, a new market for big-data-connected ERP systems has emerged. For instance, SAP HANA has become the de facto industry standard for in-memory data processing, allowing large volumes of data to be analyzed using various big data tools and generating inferences that automate business systems. This reduces operational costs greatly.

Business intelligence generated from analytics enhances the reliability, maintenance, and productivity of a business. When machines are digitally monitored and the data generated by those systems is analyzed using big data tools, downtime is reduced and the productive capacity of the business is bolstered. In the oil and gas industry, a data-driven approach cuts down on unplanned downtime by as much as 36%. A predictive approach, evidently, can reduce delays, and costs, for businesses.

Reducing costs in marketing strategies

For businesses, understanding customer profiles is immensely important. Even a couple of decades back, a manual process was used to study customer behaviour, and then companies would come up with their own marketing strategies. With the globalization of business and the information explosion, it has become almost impossible to continue in this manner.

Today, you need to come up with multi-channel marketing campaigns. Successful marketers integrate big data technologies to evaluate customer behaviour and make strategic business decisions.   

Cutting costs on logistics

One of the most important benefits of big data for e-Commerce businesses is that it can save product return costs. On average, the cost of product returns is 1.5 times that of the actual shipping. By integrating big data analytics, companies can assess the possibilities of products being returned. These tools can identify the products that are most likely to be returned, and can allow companies to take the necessary measures to reduce losses and costs.

The most common items that are returned, or exchanged, are garments, shoes, and fashion accessories, to name a few. The common factors leading to product returns include faulty products, incorrect sizes, unmet standards, etc. Companies can find out which cities have the most product returns, or which customer frequently exchanges goods, by integrating big data technologies. They can also adopt a proactive approach, calling customers to seek feedback on a product. This can help businesses cut down on expenses in transportation and logistics.

Conclusion

Big data can significantly reduce enterprise costs, optimizing expenses and steering the company towards productive shores. SAP HANA, and more importantly ERP systems based on and designed for in-memory computing platforms like SAP S/4HANA, allow organizations to transform business processes and use big data to help cut costs and increase profits. This will leverage the productivity of your business, substantially eliminating unwanted costs. Contact reputed companies for their expertise and integrate advanced big data tools into your platform for cost savings.

While going through the video library, we noticed our Hoeffding Tree machine learning series was a little out of date, so we decided it was time for a makeover. If you’re using streaming analytics version 1.0 SP 11 or newer and want to learn how to train and score data in SAP HANA studio, check out part 1 of this new video series. There are more videos to come, and each will have an associated blog post just like this one, so stay on the lookout.

Here’s an overview of part 1 and a sneak peek of the videos to come:

Summary

Part 1 is the beginning of the training phase, which involves creating a training model that uses the Hoeffding Tree training machine learning function. As data streams in, this function continuously works to discover predictive relationships in the model. To make sure things go smoothly, you need to use specific Hoeffding Tree training input and output schemas.

Input Schema

[IDS] + S

You can have as many feature columns as you need, and each of these columns can be an integer, a double, or a string, in any combination. The last column is a string: the classifier. The data we’re working with in this video is sample insurance data, so the classifier will be “Yes” for a fraudulent claim, or “No” for a legitimate claim.

Output Schema

[D]

The output schema is simpler: just one column, always a double, displaying a value between 0 and 1 that tells you the accuracy of the model.
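As an illustration (the column names here are assumed, not from the video): an input schema matching [IDS] + S could be ClaimId (integer), ClaimAmount (double), Region (string), Fraud (string), with Fraud as the classifier column, and the corresponding output schema would be the single accuracy column of type double.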

You’re now ready to create a training model.

Creating and Building a Training Model

To create your model, first make sure you’re connected to an SAP HANA data service. Don’t have one? Learn how to add one from our documentation.

Once connected, drill down to the Models folder in the Data Services view and choose Add Model:

Then, open the model properties by selecting the model. In the General tab, fill in the fields:

Choose HoeffdingTreeTraining as the machine learning function. The above input and output schema match the source data you’ll use in the next video (more on that in the next blog). Because this is a simple example, we set the sync point to 5 rows. This means that for every 5 rows, the model will be backed up to the HANA database. Of course, you would likely use a much higher interval in a production setting. If you want to fill in the specifics of the algorithm, switch to the Parameters tab. In the video, we just used the defaults.

To learn more about model properties and parameters, here’s a handy reference.

Next, right-click the new model and rename it:

Note that you also have the option to save the model (if autosave isn’t enabled), delete it, or reset it. Deleting a model removes every trace of it, whereas resetting a model only removes its model content and evaluation data. This means that the model’s function will learn from scratch. The model itself remains in the data service, and the model metadata remains in the HANA database tables.

You can find the model metadata, along with the other tables that get created when you create a model, under the user schema in the SAP HANA Administration Console:

These tables store valuable overview information about the models you create. They include a snapshot of the model properties, model performance stats, and so on. For more details, check out our table reference.

What’s Next?

Stay tuned for part 2 of this video series (and its associated blog post), where you’ll complete the training phase by creating a streaming project that can use and effectively train your model. You’ll learn how to:

  • create and build a project that uses your training model,
  • compile and run the project,
  • and upload the data and view the output.

In later videos, you’ll create a scoring model and use it in a project to make calculations about future insurance claims, predicting whether they’ll be fraudulent or not.

For more on machine learning models, check out the Model Management section of the SAP HANA Streaming Analytics: Developer Guide. If you’re interested in creating machine learning models in Web IDE, we’ve got a blog post for that too!

 

I’ve been speaking to a lot of SEO specialists and they’re worried about artificial intelligence. To be fair, it’s hard to blame them. They’ve been in a constant battle with Google for years. Some people think AI means the search engines will finally win.

Please don’t listen to anyone who thinks the glass is half empty. AI can actually help SEO marketers. Will things change in the coming years? Of course, but it’s not necessarily a bad thing and we’ll look at why you should be excited.

1. Letting AI Take Care Of Our Ads

In a perfect world, every business would be built using paid ads. Hand over a dollar and you’ll get two dollars back. It’s the easiest way to build a 7-figure company, but it involves a lot of hard work.

Once artificial intelligence starts doing it for you, it will become easy. It will know how much you can afford to pay for every ad. It will even know the right keywords to target. A reputable SEO agency should help you set it up.

2. AI Will Start To Create Content


There are lots of amazing websites on the internet covering many topics. It's tough to choose between them, but let's use Best In AU as an example. Do you know how many hours it takes to write the content they publish?

Imagine if artificial intelligence could do it for you. It would be so advanced it would even be able to search for breaking news. Publishers could sit back and enjoy their earnings without doing much work.

3. More Money To Spend On Paid Ads


Google says we’re not allowed to buy links, which is fine because it’s their company. It doesn’t mean SEOs need to listen to them. If you want a natural link on large publications you’ll need to pay for it.

The amount you’ll have to pay will depend on the website in question, but they’re not cheap. Luckily you’ve just saved a lot of money dismissing some of your writers. You can use the excess money to buy fantastic links.

4. You Can Include The Right Keywords

It’s fair to say everyone with a website has done some basic keyword research. Those who specialize in SEO will know it’s the most crucial part of the job. Get in right and you’ll generate free traffic for years to come.

It does take a lot of time if you’re going to do it properly. Normally, you’d know the right keywords to use before writing an article. Artificial intelligence will tell you exactly what to include while you’re writing one.

5. Updating Old Articles Automatically

Smart marketers realize it’s crucial to update old articles. They add keywords, increase the article length, and do anything else they think will help. It won’t be long before AI does it for you automatically.

You know trends change on Google all the time. Keywords that were once hard to rank for become easier to target. Artificial intelligence will work this out in the background and use it to update your old articles.

You Always Have To Look On The Bright Side

Maybe you did think artificial intelligence would wipe you out, but that's obviously not true. You have to start looking on the bright side. Use new technology to your advantage instead of wondering how it's going to hurt your business.

In my earlier blog, we discussed Integrating Big Data Workflow with SAP BODS. In this blog, we will explore how to use cloud storage services directly in a BODS workflow.

Cloud storages are services provided by major cloud platforms that can store and handle large numbers of files of huge sizes. AWS S3, Azure, and Google provide cloud storages that are used for storing ad-hoc files like logs, flat files, and data dumps. SAP BODS 4.2 SP7 introduced support for these cloud storages.


In this blog, we will consume data from AWS S3. The steps for the other cloud services are similar.

Configuring Cloud Storage Services

The cloud storage service should be configured so that SAP BODS can connect to it. The configuration steps can be followed from the guide published by the cloud vendor.

To connect to AWS S3, we need an IAM user with access to S3. Once that user is set up, an access key and secret key must be generated for it; BODS uses these credentials to consume the data from S3.

The access key and secret key can be generated from the Users section in IAM. Copy both after generation, as the secret key is shown only once.
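
If you prefer to script this step, here is a minimal sketch using the AWS SDK for Python (boto3); the user name is hypothetical, and the console steps above achieve the same result.

```python
# Minimal sketch: generate an access/secret key pair for an existing IAM
# user with boto3 (assumes boto3 is installed and admin credentials are
# configured). The user name "bods-s3-user" is hypothetical.
import boto3

iam = boto3.client("iam")
response = iam.create_access_key(UserName="bods-s3-user")
print(response["AccessKey"]["AccessKeyId"])
print(response["AccessKey"]["SecretAccessKey"])  # shown only once; store it securely
```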


Place the required files in the S3 bucket so that SAP BODS can consume them.
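
Staging a file in the bucket can also be scripted with boto3; in this minimal sketch, the bucket, file, and key names are all hypothetical.

```python
# Minimal sketch: upload a source file to the bucket and confirm it landed
# (assumes boto3 is installed and credentials are configured).
import boto3

s3 = boto3.client("s3")
s3.upload_file("claims_export.csv", "bods-source-bucket", "inbound/claims_export.csv")

# List the uploaded objects to verify they are where BODS expects them.
listing = s3.list_objects_v2(Bucket="bods-source-bucket", Prefix="inbound/")
for obj in listing.get("Contents", []):
    print(obj["Key"])
```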


Configuring BODS with the Cloud Services

We need to create a File Location in SAP BODS that points to AWS S3. Log in to the Designer and navigate to Formats in the Local Object Library.


In the File Locations context menu, select New to create a new File Location.


Create the File Location by selecting Amazon S3 Cloud Storage as the protocol. Fill in the access key and secret key under the security details, select the region, provide the name of the bucket from which the data has to be fetched, and configure the other necessary parameters.


Different configurations can be set for your Dev, Quality, and Production environments. Azure and Google Cloud can be configured in a similar manner.

Next, create a new Flat File or Excel file format, depending on the data source, and enter the format of the file.


Drag and drop the file into the Data Flow; you can then use that object to perform transformations and other operations.

Azure and Google Cloud services can be configured using the method described above, and BODS can then be used to move files between the different cloud storages, or to combine files from several of them and process them together.

It has been a year since we launched SAP Cloud Platform at MWC in 2017. Even if we are a little modest and don't call 2017 an unstoppable year for us, it has definitely been an exciting one: more than 40 services are now available on the platform, 7,500 customers and 780 partners are building apps on it, and more than 50 SAP enterprise apps and 1,000 third-party apps are available today.

 

Consumption Pricing Delivers Flexibility

2018 is shaping up to be yet another big year for SAP Cloud Platform, thanks to many big updates. One of them is a new consumption-based pricing model.

Traditional subscription pricing, with fixed prices for access to SAP Cloud Platform services, is still available. That approach safeguards IT investment through known costs for the life of use.

Consumption-based pricing offers a new option for customers' innovation projects. They get access to all eligible SAP Cloud Platform services so they can quickly start and scale projects and meet changing business needs. The amount of usage can change from one year to the next as required: customers can start a project today, build a prototype, and, when they're ready, scale up in a subsequent year, without paying for resources they won't need for months. This puts more power in the hands of the customer.

 

The pervasive SAP Cloud Platform at MWC

That’s not all about SAP Cloud Platform at MWC, you can attend many theater sessions, find out how the platform powers SAP Leonardo and what are the new developments on Apple-SAP partnership and the new iOS SDK.

It will be hard to miss SAP Cloud Platform, as it underpins a lot of the showcases at the booth! The connected airport shows how to intelligently connect things, people, and business processes. Here, SAP Cloud Platform integrates different types of data from external resources, such as flight departure schedules. We can also integrate data from the Internet of Things, for example connected parking, so that passengers driving to the airport know right away about available spaces. And we can deliver information in the moment with an AR app: for a quick turnaround, it can show where a container needs to be shipped, whether it is in the right temperature range, which staff are working on the cargo and the aircraft, which safety and security checks have already been performed, and many more such processes.

The connected car showcase is about a seamless in-car experience. SAP Vehicles Network, powered by SAP Cloud Platform, combines technology and data so that drivers can enjoy a seamless experience, from parking reservations to all kinds of cashless transactions from the driver's seat.

SAP Digital Boardroom, running on SAP Cloud Platform, equips executives with contextualised, real-time information for decision making in the digital era.

Finally, you can enjoy the platform through the "shopping bot" game built on SAP Cloud Platform, in which a player drives eBB9 through a shopping maze and visits 5 kiosks. Proximity sensors are triggered at each kiosk and mobile messages are displayed (e.g. on-the-go offers, orders, and customer notifications).

 

Our continuous upgrades will help make the platform an even more strategic asset for enterprises looking to innovate quickly and change their business models. We are excited to see how our customers will use this new flexibility and these use cases to power digital transformation this year. Join us at the SAP booth in Hall 3 at MWC to learn more.