
With the advent of GDPR in Europe and the UK from 25th May 2018, I have been receiving many questions about how compliance can be achieved. From my perspective there is no simple, straightforward answer: it requires an organizational push combined with a set of applications that support it. We also still have to wait and watch, as several articles are yet to be defined in detail.

Still, one essential and well-defined requirement within GDPR is that personal data should be properly tagged, and an enterprise should know everywhere it is storing users' personal information.

This brings us to the importance of storing business metadata as part of the study/analysis phase, to determine where our data needs to be anonymized or pseudonymized. Among the limited number of tools available in the market to document metadata across the enterprise application landscape, SAP Information Steward (IS) stands out for its capability, flexibility and ability to connect to multiple SAP and non-SAP systems.

In this blog we will see how SAP IS can be connected to an SAP Data Services repository and an SAP Business Warehouse system, how technical metadata from these two systems can be imported into SAP IS, how GDPR-relevant tagging can be done, and how this technical metadata from multiple source systems can be consolidated under a single set of business metadata.

  1. Connect SAP Data Services to Information Steward. From the SAP Central Management Console, go to Information Steward > Metadata Management > Integrator Sources > Manage > New > Integrator Source. Here you see a master list of all source systems that can be connected to SAP IS to extract metadata.
  2. Once the above connection is established, the SAP DS repository appears as shown below.
  3. To import the technical metadata structures, right-click the connection name and select Schedule if you want to extract the metadata regularly, or click Run Now if only a one-time import is required.
  4. Once the metadata import job is completed, log in to Information Steward and click the Metadata Management tab. The connection we created in the SAP CMC appears here. All of the above steps need to be repeated for each source system (BW in this case).
  5. SAP IS does not have an out-of-the-box capability to tag GDPR-relevant attributes, so this has to be custom built in the system. It can be built using the custom attribute functionality available under the Manage drop-down.
  6. I have created a GDPR attribute that signifies whether the object in question is GDPR relevant or not. Similarly, one can create any number of custom attributes, such as GDPR Dependency Details or Data Owner, but here I have taken just one attribute to keep it simple. Once the custom attribute is created, we can also assign access levels so that, for example, the attribute is visible only for BW source systems, only for Data Services source systems, or for all systems.
  7. Once the custom attributes are defined, they appear in the object description view and can be marked during the system analysis/study phase. In the depiction below, the GDPR attribute has been marked at two levels: the table present in this imported repository is GDPR relevant, and the attributes within this table are GDPR relevant as well. This tagging needs to be done across all objects of a source system and across multiple systems.
  8. Once the above technical metadata is consolidated, we move on to creating business metadata. In this case we will see how all the GDPR-relevant objects across multiple source systems can be consolidated under common, business-relevant headings. Go to Metapedia, click New Category and create a GDPR category. This master category can be further divided into sub-categories such as Bank Details, Employee Details, etc.
  9. Go to All Terms and create a new term, say Bank Account Number. Fill in the details as exhaustively as you like, assign it to a category (save it before assigning it to the category) and submit it for admin approval.
  10. Once the admin approves it, the term appears under the sub-category we created earlier. After approval, one can search for and assign GDPR-relevant attributes from multiple source systems to it.
  11. A business user with read-only access will see the view below, in which the enterprise data steward team has tagged, categorized and mapped enterprise-wide GDPR-relevant attributes under a single business term.

I hope this blog helps you simplify GDPR-relevant documentation in your enterprise. In case of any queries or suggestions, please do not hesitate to reach out to me.

Keep reading and keep learning.

Best Regards.

Shazin

 

On May 23rd I had the great privilege to represent SAP at the finals of the Northwestern University VentureCat competition. I was joined by Paurav Patel, SVP & GM Midwest Region Services and Joe Zavala, Midwest Region Marketing who graciously agreed to represent SAP with me. Thanks to the generous support of SAP North America, SAP served as one of the financial sponsors of this year’s event. Besides being thanked from the stage, our company was recognized for its support in the printed materials and digital communications for the competition.

VentureCat is an event and showcase celebrating Northwestern’s most promising student-founded startups. The event culminates with a pitch competition in which over $100,000 in prize money is distributed to student ventures. VentureCat is a collaborative event at Northwestern, including efforts through The Farley Center for Entrepreneurship and Innovation, The Kellogg School of Management, the Donald Pritzker Entrepreneurship Law Center and The Garage.

VentureCat is an evolution of the Northwestern University Venture Challenge (NUVC), which was originally introduced in 2007. From inception, NUVC has distinguished itself from other pitch competitions around the globe by organizing competitors in industry-specific tracks, which leverages the rich expertise of distinct schools from across the university in both graduate and undergraduate programs, and by awarding top teams non-dilutive capital.

VentureCat is an experience for Northwestern’s most promising student-founded startup teams to network and pitch their ventures to an esteemed panel of judges and the broader community. Semifinal teams participate in the Four Week Semifinalist Prep Program, which includes pitch coaching, advice from industry experts, and professional graphic design support. The semifinalist teams compete in industry tracks. The 1st place winner of each track takes the main stage to compete for the grand prize at the Finals in front of an audience of hundreds of students, alums, and investors.

The six final presentations included themes directly aligned with many of the UN Global Goals for sustainable development, including Goal 3 – Good Health and Well-Being, Goal 5 – Gender Equality, and Goal 6 – Clean Water and Sanitation.

2018 Winners:

  • NUMiX Materials: NUMiX Materials manufactures and supplies Northwestern University-patented materials to remove heavy metals from aqueous streams at ten times the efficiency of competing materials on a per-volume basis.
  • BrewBike: BrewBike fuels college students with cold brew coffee.
  • Rheos: Rheos has developed a noninvasive, wearable biosensor capable of diagnosing ventricular shunt malfunction.

Also competing in VentureCat 2018 were Cariset Backpacks, a manufacturer of high-end utility backpacks for young female professionals; FaciliKey, an AirBnB-like app for landlord-tenant relations; and BraveCamp, a program to help girls develop a passion for coding.

Do you have an older profile for an account that is inactive, to which you no longer have access? Maybe you changed companies and had to start a new account with the new company, or maybe you don’t have access anymore to the email that the old account was tied to.

If you were a regular contributor (or heck, even an occasional contributor) on SAP Community, or its predecessors SCN or SDN, you may have accumulated a nice body of blogs and answers, as well as a reputation of points, level, and badges in our legacy reputation system. Here, for example, is the current profile and reputation of SAP Mentor Paul Hardy:

And here is the legacy reputation accumulated by Paul in a previous profile:

While Paul has been active with his current profile and has content and reputation in it, he would certainly want SAP Community members to know that the current profile alone does not represent the full scope of his content and reputation in the community, and he would want them to be able to find his older blog posts and other content. So he would want the older profile to remain visible to other SAP Community members.

If this sounds like your situation, you may be worried right now that your older, inactive profile will “disappear” soon because you can’t log into that account anymore to set the privacy setting for that profile to “Allow” your profile to be displayed publicly.

I’m here to tell you that all is not lost!

You can still change the privacy setting for that older, inactive profile to public, now or even after it gets hidden by default when our new privacy settings go into effect in the next hours and days. You just need to follow a few simple steps:

  1. If the old account is not active (meaning you cannot reset the password), fill out this contact form or send an email to sapnetwork@sap.com to ask for assistance. Let them know that you want to link your old, inactive profile to your current, active one. Give them the links to both profiles and/or the SAP ID account numbers, and let them know you are not able to log into the old one because the account is inactive. They will likely respond with a few questions that you will need to answer so that they can verify that you are the true owner of the inactive profile.
  2. Once the profiles have been linked, you might find that both your inactive profile and your currently active profile are no longer public. In cases where a profile with a privacy setting of “Allow” is linked to a profile with a privacy setting of “Do Not Allow,” the “Do Not Allow” setting takes precedence, to ensure that private profiles do not get unintentionally switched to public. (See the Privacy Setting FAQ.) In the next step you can change this, so that the linked profiles are both public.
  3. Log into your currently active profile and go into your Account Settings (select Account Settings from the avatar dropdown in the header). In the Privacy Settings section, enable the profile to be publicly displayed by clicking the Edit Privacy Settings link, then selecting the green “Allow” button. This will set the Privacy Setting of the currently active profile AND the linked inactive profile to “Allow.” Now both profiles are visible publicly.

You can read more about Profile Linking in my previous blog post: https://blogs.sap.com/2017/12/14/hey-look-its-both-of-the-tammys-together/

Hi, in this post I would like to share how ABAP Dictionary external views can be used in ABAP CDS view development to overcome some limitations.

Scenario 1

Imagine that we have an S/4HANA system whose HANA database is also used by other applications. In this case, HANA usually contains (and we use) more than just the main schema of the S/4HANA system. We use ABAP CDS views in S/4HANA for reporting purposes.

How can we use data from the REPZME schema in our ABAP CDS views? By default, you can only use tables that are visible in the ABAP Dictionary of the T85 application server and located in the SAPT85 schema.

Scenario 2

Imagine that we have an S/4HANA system and use ABAP CDS views for reporting purposes. However, some developers in your company are skilled in developing HANA information models, or calculation views with complex logic already exist.

Do you need to rebuild them using ABAP CDS views, or can you reuse them?

Solution

In this post I would like to show a solution for Scenario 1 and Scenario 2.

As of ABAP 7.4 SP02, it is possible to create Dictionary external views for this purpose.

We would like to use table MARA from the REPZME schema in ABAP CDS views.

1. Go to the HANA Studio Modeler perspective and, in the Content area, create a package and a calculation view for table MARA. I selected just four fields for demo purposes. In the calculation view we can use tables from schemas other than the main SAPT85 schema.

2. Go to the HANA Studio ADT perspective, create an ABAP package and create a new “Dictionary View”. Give it a name and a description, select the External View option, and find our calculation view using the Browse button.

3. Change the DDIC field types if needed and activate the view.

4. Go to the ABAP Dictionary, check that the view exists, and look at the data.

 

5. Go back to the ADT perspective and create an ABAP CDS view data definition based on the external view ZT04_PROXY.

Result

We can now use data in ABAP CDS views from tables that are not visible in the ABAP Dictionary by default, or reuse complex logic written in calculation views inside ABAP CDS views.

 

Thank you for your attention!

Transport Management Service for SAP Cloud Platform is currently available as beta version (see The new cloud-based Transport Management Service and Transport Integration Content across Tenants using the Transport Management Service released in Beta).

In the present blog, I reproduce and summarize all steps required to set up this service to support a concrete, simple transport scenario. In particular, I would like to use this service to transport integration content (for SAP Cloud Platform Integration) from a source to a target tenant.

A big Thank You to the following people for their support: Ralf Belger, Boris Zarske, VishnuPrasath Dhayanithi, Dimitar  Aleksandrov.


Transport Landscape

In this blog, I walk you step-by-step through the tasks required to set up the following transport scenario: you would like to use Transport Management Service to transport integration content from a source tenant (referred to as dev tenant) to a target tenant (referred to as test tenant).

It is important to point out that Cloud Integration tenants (in our example, the dev and the test tenant) run in the SAP Cloud Platform Neo environment, whereas the Transport Management Service runs in the Cloud Foundry (CF) environment. For this blog, I assume that you have access to a Cloud Foundry account. For more information on the different SAP Cloud Platform environments, see the online documentation of SAP Cloud Platform and search for Environments.

The following figure shows the transport landscape:

The following list provides a summary of the tasks to set up this landscape:

  1. Get entitled to use Transport Management Service. You can use your own global Cloud Foundry account (for the following steps, I assume that you have a global account of SAP Cloud Platform, Cloud Foundry environment). Check out the documentation of Transport Management Service to find out about possible process changes once the Transport Management Service becomes generally available and the beta phase ends.
  2. In the Cloud Foundry environment: Create a subaccount of your global account and subscribe to Transport Management Service.
  3. Enable API access to Transport Management Service.
  4. Create the required destinations.
  5. Use Transport Management Service to create source and target nodes and a transport route to connect both of them.

In the following part of the blog, you find the detailed description of these tasks.

Create Subaccount and Subscribe to Transport Management Service (Cloud Foundry)

As prerequisite, make sure that you have access to a global account of SAP Cloud Platform, Cloud Foundry environment.

  1. Within this global account, create a new subaccount. Make sure that you enable beta features (select the Enable Beta Features option).
  2. Select the subaccount and choose Subscriptions.
  3. Select the tile Transport Management Service.
  4. On the next screen, choose Subscribe.
  5. The Not Subscribed field changes to Subscribed.

As the next step, assign the required roles to your users. To do that, perform the following steps:

  1. Go back to the overview page of your subaccount (for example, by clicking the subaccount link in the breadcrumb link on top).
  2. Choose Security > Role Collection.
  3. Click New Role Collection, and in the next screen specify a name (for example, Test Transport). A link to the new role collection is added (with the name of the role collection as link text).
  4. Click the link with the role collection name.
  5. On the overview page for the role collection, click Add Role.
  6. As Application Identifier, select the application identifier provided by SAP (which is alm-ts!t339; this identifier corresponds to the SAP organization where the Transport Management Service is running).
  7. For both the Role Template and the Role attribute, select the entry Administrator.
  8. Choose Save.
  9. Go back to the overview page of your subaccount.
  10. Click Security > Trust Configuration.
  11. Choose SAP ID Service.
  12. Enter the user (email address) that needs to have administrator rights (your own user, if you are the one setting up Transport Management Service) and choose Show Assignments.
  13. Click the Add Assignment button and select the role collection created with the previous steps.
  14. Click Add Assignment. The screen should now look like in the following figure.
  15. Go back to the overview page of your subaccount and choose Subscriptions.
  16. On the Transport Management Service  tile, click Go To Application.
  17. A login screen opens. Log in with your user and password.
  18. The Transport Management Service is opened.

You can keep the application open, as you need to perform additional steps later on in Transport Management Service (to create transport nodes and routes).

Enabling API Access to Transport Management Service

With the next steps, you need to enable API access to the service.

  1. Go back to the overview page of your subaccount, choose Overview, and click Enable Cloud Foundry.
  2. Provide an Organization Name and choose Create.
  3. Go to the global account and choose Entitlements.
  4. From the dropdown list, select Transport Management Service and, in the field to the right of the dropdown list, enter the name of your subaccount.
  5. Click Edit and under Transport Management Service, next to your subaccount name, increase the number from 0 to 1.
  6. Click Save.
  7. Go back to your subaccount (by clicking the subaccount name in the current dialog).
  8. Select Quota Plans.
  9. Choose New Plan, provide a name for the plan, and in the Services field enter 1.
  10. Click Save.

Now you need to create a new Cloud Foundry space for the subaccount and assign the quota plan.

  1. In the navigation area, choose Space.
  2. Choose New Space, provide a name and click Save.
  3. Go back to Quota Plans.
  4. Under Plan Assignments, select the previously created quota plan from the dropdown list for Quota Plan (in our example: test_transport).
  5. Click the Space name (TestSpace for example).
  6. Choose Service Marketplace and then click the Transport tile.
  7. Choose Instances.
  8. Create a new instance (click New Instance).
  9. Choose Next, and then again Next on the following screen.
  10. Enter an instance name and choose Finish.
  11. Click the instance name and choose Service Keys.
  12. Choose Create Service Key.
  13. Enter a name for the key.
  14. The connection details are displayed.

You need this information to create the required destination from your Cloud Integration Neo account (dev) to Transport Management Service.

You have successfully set up Transport Management Service. You now can start configuring the transport landscape.

Creating the Required Destinations

Start creating the required destinations. For our example transport landscape (with the dev and the test tenant), you need to create the following destinations:

  • In the Neo dev subaccount (transport source): Create two destinations (with specific names TransportManagementService and TransportManagementServiceOAuth) that point to the Cloud Foundry subaccount (where the Transport Management Service is active). To do that, you need the service key created in the previous step.
  • In the Neo dev subaccount (transport source): On the Solutions Lifecycle Management service, create one destination with name CloudIntegration that points to the Cloud Integration tenant management node of the same dev subaccount.
  • In the Neo test subaccount (transport target): On the Solutions Lifecycle Management service, create one destination with name CloudIntegration that points to the tenant management node of the same subaccount.
  • In the Cloud Foundry subaccount (Transport Management Service): Create one destination (with a name of your choice, for example CPI_TEST) that points to the target (test) subaccount (Neo).

To make it easier to keep an overview of the next steps, the following figure shows all required destinations in the transport landscape.

Let’s start with the two destinations in the Neo dev subaccount (the source from which you would like to transport integration content).

  1. Go to your dev subaccount in the Neo environment.
  2. Choose Services and then the tile Solutions Lifecycle Management.
  3. Under Take Action click Configure Destinations.
  4. Choose New Destination and specify the following attributes:
    Attribute  Value 
    Name TransportManagementService
    Type HTTP
    URL Enter the parameter provided as uri in the service key (it starts with https://transport-service-app).
    Authentication NoAuthentication
    Proxy Type Internet
  5. Select New Property.
  5. Select New Property.
  6. In the dropdown field, enter sourceSystemId, and in the field to the right enter DEV_NODE. This is an example ID, but I use the same ID later on when defining the transport node for the source account (therefore, note down the chosen value of sourceSystemId).
    The destination should look like the one shown in this figure.
  7. Save the destination.
  8. Create a second destination with the following settings. (A small sketch showing how these service-key credentials can be used follows this list.)
    Attribute  Value 
    Name TransportManagementServiceOAuth
    Type HTTP
    URL Enter the parameter provided as url in the service key.
    Proxy Type Internet
    Authentication BasicAuthentication
    User Enter the clientid value from the service key.
    Password Enter the clientsecret value from the service key.
  9. Save the destination.
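The TransportManagementServiceOAuth destination above carries the OAuth client credentials (clientid and clientsecret) from the service key. As a quick way to verify a freshly created service key, and to illustrate what those credentials are for, here is a hedged Python sketch that requests a token via the OAuth 2.0 client-credentials flow. The /oauth/token path and the use of the requests library are my assumptions; the blog only states that the url, clientid, and clientsecret values come from the service key.

```python
# Hedged sketch: check a Transport Management Service service key by requesting
# an OAuth token with the client-credentials grant.
# Assumption: the "url" field of the service key points to an OAuth 2.0
# authorization server exposing the standard /oauth/token endpoint.
import json
import requests

# The service key created in the previous steps, saved locally (hypothetical file name).
with open("service_key.json") as f:
    key = json.load(f)

token_url = key["url"].rstrip("/") + "/oauth/token"   # assumed endpoint path
response = requests.post(
    token_url,
    data={"grant_type": "client_credentials"},
    auth=(key["clientid"], key["clientsecret"]),      # same values used in the BasicAuthentication destination
)
response.raise_for_status()
print("Token obtained, expires in", response.json().get("expires_in"), "seconds")
```

If the call succeeds, the clientid and clientsecret you entered in the TransportManagementServiceOAuth destination are valid.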

Now define the destination CloudIntegration on the source (dev) subaccount in Neo that points to the tenant management node of the dev subaccount.

  1. Choose New Destination and enter the following attributes.
    Attribute  Value 
    Name CloudIntegration
    Type HTTP
    URL Enter the URL of the source tenant (assigned to the dev subaccount in Neo). It has the following form: https://<dev tenant>-tmn.hci.eu1.hana.ondemand.com/itspaces.
    Authentication BasicAuthentication
    User User for the dev subaccount
    Password Password for the dev subaccount
  2. Save the destination.

Repeat these steps on the target (test) subaccount in Neo and create a destination with the name CloudIntegration there as well. As URL use the one that points to the tenant management node of the target (test) subaccount.

Now define the destination on the Cloud Foundry subaccount where the Transport Management Service resides.

  1. Go to your Cloud Foundry subaccount.
  2. Choose Destinations.
  3. Click New Destination.
  4. Provide the following settings:
    Attribute Value 
    Name Enter a string of your choice, for example, CPI_TEST
    Type HTTP
    URL

    Enter the URL of the Solutions Lifecycle Management service in the target (test) account. The URL is composed in the following way (a small helper that composes it is sketched after this list):
    https://slservice.<host>/slservice/slp/basic/<Neo account ID>/slp

    • host is the host name of your account, for example:
      eu1.hana.ondemand.com
    • Neo account ID is the technical name of the Neo account: enter the account name (not the Display Name) of the test account, for example, abcd123ef. Note that this is the target account of your transport landscape. You find this name on the overview page of the subaccount under Subaccount Information (the value beneath the Name attribute).

    Example URL:
    https://slservice.eu1.hana.ondemand.com/slservice/slp/basic/abcd123ef/slp

    Authentication BasicAuthentication
    User User that is member of the target (test) account
    Password Password for user that is member of the target (test) account
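Since these URL patterns are easy to mistype, here is a small, hypothetical Python helper that composes the slservice URL described above as well as the tenant-management-node URL used for the CloudIntegration destinations. Only the URL patterns come from this blog; the function names are illustrative.

```python
# Hypothetical helpers that compose the two URL patterns used in this setup.
# Only the patterns are taken from the blog; names and structure are illustrative.

def slservice_url(host: str, neo_account_id: str) -> str:
    """Solutions Lifecycle Management endpoint of the target (test) Neo subaccount."""
    return f"https://slservice.{host}/slservice/slp/basic/{neo_account_id}/slp"

def cpi_webui_url(tenant: str) -> str:
    """Web UI of a Cloud Integration tenant management node."""
    return f"https://{tenant}-tmn.hci.eu1.hana.ondemand.com/itspaces"

# Values from the example in this blog: host eu1.hana.ondemand.com, technical account name abcd123ef
print(slservice_url("eu1.hana.ondemand.com", "abcd123ef"))
# -> https://slservice.eu1.hana.ondemand.com/slservice/slp/basic/abcd123ef/slp
```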

Creating Transport Nodes in Transport Management Service

A transport node represents either a source or target endpoint in a transport scenario – in our example, you have two nodes, one for the dev and one for the test account.

  1. Open the Transport Management Service.
  2. Go to Transport Nodes.
  3. Choose + to create a new node.
  4. Let’s start with the node for the source account (dev). Enter a name and a description for the node. For example, enter the name DEV_NODE, enter a description, and click OK.
  5. Repeat the steps to create a node for the target account (test). Enter the name (for example) TEST_NODE, provide a description, and select the Allow Import to Node checkbox.
  6. As Content Type, select Multi-Target Application.
  7. Select the destination CPI_TEST created before and click OK.

The created transport nodes should show up on the Transport Management Service page as shown in the following figure.

Creating Transport Routes

  1. Open Transport Management Service and select Transport Routes.
  2. Choose + to create a new route.
  3. Enter a name for the route, select the source node (DEV_NODE) and a target node (TEST_NODE), and click OK.

Now, you have finally set up the transport landscape for your transport scenario. As a last step, you need to set the transport mode on your source (dev) tenant.

Transporting Integration Content

Based on this setup, you can now transport integration packages from the dev to the test tenant with only a few clicks. I keep this section short. For more information, refer to the documentation of Cloud Integration (search for Content Transport).

To summarize, these are the steps:

  1. Open the Web UI for your source (dev) tenant (the URL is composed in the following way: https://<dev tenant>-tmn.hci.eu1.hana.ondemand.com/itspaces), choose Settings and select the Transport tile. Click Edit and, as Transport Mode, select Transport Management Service (Beta). Save the settings.
  2. In the Design section, select the integration package that you like to transport and choose Transport (you need to add a transport comment).
  3. Open Transport Management Service, select the transport node of the target tenant, select the import queue for the initiated transport, and confirm the import of the content to the target tenant.
  4. Go to the target tenant and check whether the package arrived there.

 

Thank you to our SAP Screen Personas customers for rapidly adopting the Slipstream Engine. It seems we are fulfilling a need to run classic SAP GUI transactions on mobile devices. SAP Screen Personas 3.0 SP07 builds on that foundation and adds more mobile capabilities, more enterprise readiness features, and over 150 enhancements.

“Given that virtually everything we are pushing is via the Fiori platform with the more modern mobile enabled type of functionality I am only looking at delivering via the Slipstream Engine as opposed to the WebGUI because of its better rendering on mobile devices”
– Jamie Hastings, Senior Systems Analyst, Transalta Corporation

We are also seeing a rapid uptake in the SAP S/4HANA installed base, who are using SAP Screen Personas to complete the Fiori vision of having simple and beautiful screens that run on any device.

While there are many new features in SAP Screen Personas 3.0 SP07, we will discuss three big innovations in this blog:

  • Full editing in Slipstream Engine
  • Mobile preview for different device sizes
  • New Viewport feature to simplify complex screens

 

SAP Screen Personas Flavor editor now works in Slipstream Engine

In the last service pack, we introduced the Slipstream Engine to render SAP Screen Personas on mobile devices. Now, we have added the SAP Screen Personas editor to the Slipstream Engine so you can create flavors and render them in the same environment. So, you have a complete, end-to-end, what you see is what you get (WYSIWYG) platform for simplifying your classic SAP GUI Screens. The SAP Screen Personas editor works on the desktop only (yes, the Slipstream Engine runs on desktops as well as tablets and phones) as you need the full screen real estate to build flavors. The flavor editor includes scripting, theming, templates, and everything you need for creating mobile (or desktop) flavors. With the editor running in the Slipstream Engine, there is no need to fine-tune your flavors due to rendering differences between Slipstream Engine and the Web GUI.

 

Mobile preview makes it easier to build flavors for different sized devices

To make it faster to build flavors for a variety of tablets and phones, we added a mobile preview feature to the flavor editor. Now, you can preview what the flavors will look like on iPads, iPhones, and Android devices, all while staying in the editor environment.

 

New Viewport Scripting API simplifies complex screens

As more customers use SAP Screen Personas for more complex tasks, they occasionally create flavors that perform many functions behind the scenes to present a simple screen to users. This could include showing and hiding different parts of the screen and even moving controls around dynamically to give users a manageable amount of information to process at a time. To handle these business scenarios more easily, we created a new feature called Viewports.

“The new Viewport API in SAP Screen Personas allows us to simplify a very complex screen and provide the right amount of information to our users, without overwhelming them. It replaces a large script, reducing the maintenance effort and greatly improving performance.”
– Ingrid Schwarz – VP IT Applications, Penguin Random House LLC

Viewports break a large, complex screen into smaller pieces, only showing your users the required fields for each stage in a business process. You can even use Viewports to create a guided, step-by-step approach for your users, where you can add or hide different functions.

To summarize, viewports

  • reduce complexity for your users
  • improve performance and loading time
  • allow for a more efficient flavor structure
  • lead to easier development and maintenance

In SAP Screen Personas 3.0 SP07, viewports are available as an API in the scripting engine. We are working on a GUI to simplify working with viewports.

 

See SP07 at SAPPHIRE NOW and ASUG Annual Conference

We will be showing SAP Screen Personas SP07 at SAPPHIRE NOW and the ASUG Annual Conference.

  • SAP Screen Personas as Part of the SAP Fiori User Experience: With the latest version of SAP Screen Personas software, SAP has made it even easier to create SAP Fiori-inspired flavors that run on your desktop, tablet, or phone. Themes, templates, and best-practice guides allow many customers to create a seamless user experience and go live with SAP Fiori-inspired screens within weeks, often with no development resources. Wednesday, June 6, 2018 at 11am.
  • Roundtable: Going Mobile With SAP ERP or SAP S/4HANA Using a Mix of SAP Screen Personas and SAP Fiori Apps: Join this roundtable discussion on how to render classic SAP ERP transactions on your tablet or phone. The presenters will discuss how to choose between SAP Screen Personas software flavors and custom SAP Fiori apps to balance functionality, time to value, development cost, and maintenance costs. This talk will also cover the latest SAP technologies for mobilizing flavors. Wednesday, June 6, 2018 at 2:30pm.

 

Support runs through 2023

Here is our current support strategy for SAP Screen Personas 3.0.

  1. SAP Screen Personas 3.0 will be supported until at least Dec. 31, 2023. This rolling five-year support is the same as for SAP S/4HANA: a minimum of five years from the release of this service pack, rounded to the end of the calendar year.
  2. We support the current as well as the two previous Service Packs. What does this mean to you?
    • SP07 will receive continuous innovation
    • SP06 will get important fixes and selected down ports
    • SP05 will receive emergency fixes
    • SP04 and earlier will no longer be supported

 

System requirements

SAP Screen Personas 3.0 SP7 works on a variety of systems from SAP S/4HANA to some much older versions. Specifically:

Basis | Minimum Service Pack | Additional Notes Required? | Supported Kernels
S/4HANA | all | No | 749+
750 | all | No | 745, 749
740 | SP03 | Yes | 745, 749
731 | SP07 | Yes | 722
702 | SP09 | Yes | 722
701 | SP10 | Yes | 722
700 | SP25 | Yes | 722

 

For SAP Screen Personas 3.0 SP07, the supported kernel releases and their corresponding minimum and recommended (as of May 24, 2018) patch levels are listed below:

Kernel release 753 –> Minimum patch level 27   –> Recommended patch level: latest available

Kernel release 749 –> Minimum patch level 400 –> Recommended patch level 400 or higher

Kernel release 745 –> Minimum patch level 600 –> Recommended patch level 600 or higher

Kernel release 742 is out of maintenance. We no longer test on this kernel release.

Kernel release 722 –> Minimum patch level 400 –> Recommended patch level 500 or higher

See the SAP Screen Personas 3.0 SP07 Master Note 2633027 for more details and updated recommendations.

 

Continuous improvement

The SAP Screen Personas team is planning to continue to enhance mobile capabilities and address the growing needs of enterprise customers that are striving to become intelligent enterprises.

Please send us your requests (via email or as a comment to this blog) for anything you would like to see in SP08 and beyond.

 

Next Steps

Upgrade to SAP Screen Personas 3.0 SP07 (requires a valid SAP NetWeaver license).

Take the free Introduction to SAP Screen Personas openSAP class to get a foundation for starting your project.

Extend your knowledge with the Using SAP Screen Personas for Advanced Scenarios openSAP class to go deeper into how to use the product.

Learn more from our knowledge base with articles on a variety of topics.

Watch the Productivity Power Play episode on SP07.

 

For the SAP Screen Personas product team, Peter Spielvogel.

 

Hi everyone,

This is the first post of a series about the report formats you receive by default when you install the Magnetic Media compliance report. The objective of this series is to give you more details about these report formats so that you can make the configurations needed to run them correctly in your system.

To start with the report-formats series, let’s go through the basics you need to know about the 1005 and 1006 report formats.

 

The 1005 Report Format

What is the 1005 report format?

You use the 1005 report format to report, per third party, deductible tax values and the generated VAT (value-added tax) values of sales that were canceled, terminated or completed. All these values correspond to the prior year.

Find below the information reported in the seventh version of the XML file that you submit to DIAN (Dirección de Impuestos y Aduanas Nacionales) when you run the 1005 report format in the Magnetic Media compliance report.

ATTRIBUTE | TITLE | TYPE | LENGTH | CRITERIA
tdoc | Document type | int | 2 | According to what is specified in the DIAN resolution. This data must always be reported.
nid | Identification number | string | 20 | Fill this data without hyphens, periods, commas or blank spaces. This data must always be reported.
dv | Verification digit | int | 1 | For document type 31 (NIT). Report this data if you have this information.
apl1 | First surname of the reported third party | string | 60 | This data is mandatory for natural persons.
apl2 | Second surname of the reported third party | string | 60 | This data is mandatory for natural persons.
nom1 | First name of the reported third party | string | 60 | This data is mandatory for natural persons.
nom2 | Other names of the reported third party | string | 60 | This data is mandatory for natural persons.
raz | Business name (razón social) of the reported third party | string | 450 | This data is mandatory for legal persons.
vimp | Deductible tax | double | 20 | This data must be a positive whole number. Do not include periods or commas. This data must always be reported.
ivade | VAT returns on canceled, terminated or completed sales | double | 20 | This data must be a positive whole number. Do not include periods or commas. This data must always be reported.

You can find more information about the technical details of the report formats in the DIAN portal: https://www.dian.gov.co/.
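To make the attribute list above more concrete, here is a hedged Python sketch that assembles a single 1005 record as an XML element carrying these attributes. The element names (mmedia, record), the file name, and the sample values are placeholders of my own and not the official DIAN schema, so always check the technical specification published by DIAN before relying on the structure.

```python
# Hedged sketch: build one 1005-format record with the attributes listed above.
# The element names ("mmedia", "record") and file name are placeholders, NOT the
# official DIAN schema; attribute names and value rules follow the table in this post.
import xml.etree.ElementTree as ET

root = ET.Element("mmedia")                      # hypothetical wrapper element
ET.SubElement(root, "record", {
    "tdoc": "31",                                # document type per DIAN resolution
    "nid": "900123456",                          # identification number, no separators
    "dv": "7",                                   # verification digit (only for NIT)
    "raz": "Example Company S.A.S.",             # business name (legal persons)
    "vimp": "1500000",                           # deductible tax: positive whole number
    "ivade": "0",                                # VAT returns on canceled, terminated or completed sales
})
# For the 1006 format, the value attributes would be imp, iva and icon instead.

ET.ElementTree(root).write("formato_1005.xml", encoding="utf-8", xml_declaration=True)
```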

How to configure the 1005 report format

Before running the 1005 report format, you need to configure data sources for the extraction. To do so, all you have to do is specify all the VAT identification codes in the Maintain VAT Codes customizing activity.

To access this customizing activity, go to SAP Customizing Implementation Guide > Financial Accounting > General Ledger Accounting > Periodic Processing > Report > Statutory Reporting: Colombia > Magnetic Media (DIAN) > Configure Data Sources for Report Formats > Maintain VAT Codes.

In the Maintain VAT Codes customizing activity, you find the following view:

In this view, you specify all the VAT identification codes that are relevant for the 1005 report format. You also inform:

  • Procedure
  • Tax Code
  • Account Key
  • Validation: beginning and end of validation

You don’t need to fill the Amount Classification column, since values are automatically classified according to their tax type (input values go to the VIMP table, whereas output values go to the IVADE table).

Ready! After setting this configuration, the Magnetic Media compliance report automatically extracts and organizes relevant data in the output XML file when you run the 1005 report format.

See below the steps to run the 1005 report format:

[Screenshots: Advanced Compliance Reporting (ACR) – Magnetic Media compliance report; execution of the 1005 report format; analytics in the 1005 report format; output XML file of the 1005 report format]

 

The 1006 Report Format

What is the 1006 report format?

You use the 1006 report format to report the following values:

  • Generated taxes on sales
  • VAT returns on canceled, terminated or completed sales
  • Consumption taxes

Find below the information reported in the seventh version of the XML file that you submit to DIAN (Dirección de Impuestos y Aduanas Nacionales) when you run the 1006 report format in the Magnetic Media compliance report.

ATTRIBUTE | TITLE | TYPE | LENGTH | CRITERIA
tdoc | Document type | int | 2 | According to what is specified in the DIAN resolution. This data must always be reported.
nid | Identification number | string | 20 | Fill this data without hyphens, periods, commas or blank spaces. This data must always be reported.
dv | Verification digit | int | 1 | For document type 31 (NIT). Report this data if you have this information.
apl1 | First surname of the reported third party | string | 60 | This data is mandatory for natural persons.
apl2 | Second surname of the reported third party | string | 60 | This data is mandatory for natural persons.
nom1 | First name of the reported third party | string | 60 | This data is mandatory for natural persons.
nom2 | Other names of the reported third party | string | 60 | This data is mandatory for natural persons.
raz | Business name (razón social) of the reported third party | string | 450 | This data is mandatory for legal persons.
imp | Generated tax | double | 20 | This data must be a positive whole number. Do not include periods or commas. This data must always be reported.
iva | VAT returns on canceled, terminated or completed sales | double | 20 | This data must be a positive whole number. Do not include periods or commas. This data must always be reported.
icon | Consumption tax | double | 20 | This data must be a positive whole number. Do not include periods or commas. This data must always be reported.

You can find more information about the technical details of the report formats in the DIAN portal: https://www.dian.gov.co/.
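As a companion to the table above, here is a small, illustrative Python check that enforces a few of the listed criteria (type, maximum length, and the positive-whole-number rule) before a 1006 record is written out. Only the field names, lengths, and rules come from the table; the dictionary-based record layout and the function itself are my own sketch.

```python
# Illustrative sketch: validate a 1006 record against a few criteria from the table above.
# Only field names, lengths and rules come from the post; the record layout is assumed.
RULES_1006 = {
    "tdoc": {"type": int, "max_len": 2},
    "nid":  {"type": str, "max_len": 20},
    "dv":   {"type": int, "max_len": 1},
    "raz":  {"type": str, "max_len": 450},
    "imp":  {"type": int, "max_len": 20},   # positive whole number, no periods or commas
    "iva":  {"type": int, "max_len": 20},
    "icon": {"type": int, "max_len": 20},
}

def validate_1006(record: dict) -> list:
    """Return a list of violations; an empty list means the record passes these checks."""
    errors = []
    for field, rule in RULES_1006.items():
        if field not in record:
            continue
        value = record[field]
        if not isinstance(value, rule["type"]):
            errors.append(f"{field}: expected {rule['type'].__name__}")
        if len(str(value)) > rule["max_len"]:
            errors.append(f"{field}: exceeds length {rule['max_len']}")
        if rule["type"] is int and isinstance(value, int) and value < 0:
            errors.append(f"{field}: must be a positive whole number")
    return errors

print(validate_1006({"tdoc": 31, "nid": "900123456", "imp": 1500000, "iva": 0, "icon": 0}))  # -> []
```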

 

How to configure the 1006 report format

You also need to configure data sources for extraction in the 1006 report format. To do so, you have to enter VAT codes in the Maintain VAT Codes customizing activity too.

The path to access this customizing activity is the same as in the 1005 report format (see the images above). Go to SAP Customizing Implementation Guide > Financial Accounting > General Ledger Accounting > Periodic Processing > Report > Statutory Reporting: Colombia > Magnetic Media (DIAN) > Configure Data Sources for Report Formats > Maintain VAT Codes.

This time, after entering the VAT codes, you also have to specify in the Amount Classification column whether they are consumption tax values.

Values that are classified as consumption tax in the Amount Classification column go to the ICON table. Other values are automatically classified according to their tax type (input values go to the IMP table, whereas output values go to the IVA table).
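To restate that classification rule as a tiny, purely illustrative sketch (the function and the classification labels are hypothetical placeholders, not actual system values):

```python
# Illustrative only: how 1006 values are routed to the IMP / IVA / ICON tables
# according to the rule described in this post. The labels are hypothetical.
def target_table_1006(amount_classification: str, tax_type: str) -> str:
    if amount_classification == "CONSUMPTION_TAX":      # marked in the Amount Classification column
        return "ICON"
    return "IMP" if tax_type == "INPUT" else "IVA"      # otherwise split by tax type

print(target_table_1006("", "INPUT"))                   # -> IMP
print(target_table_1006("", "OUTPUT"))                  # -> IVA
print(target_table_1006("CONSUMPTION_TAX", "OUTPUT"))   # -> ICON
```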

The 1006 report format configuration is done! Now the Magnetic Media compliance report automatically extracts and organizes relevant data in the output XML file when you run this report format.

See below the steps to run the 1006 report format:

[Screenshots: Advanced Compliance Reporting (ACR) – Magnetic Media compliance report; execution of the 1006 report format; analytics in the 1006 report format; output XML file of the 1006 report format]

 

We hope this post is useful for you.

Are you missing any information about the Magnetic Media compliance report? Let us know in the comments. Your question could be the subject of our next post.

And don’t forget to follow the SAP S/4HANA Finance tag to stay tuned for the latest Magnetic Media news.

 

Until the next post,

Rosana

 

Want to read this post in Spanish?

 

If you enjoyed this post, you may be also interested in this one:

Reporte de Medios Magnéticos: Lo que usted socio tiene que saber

Hola,

Este es el primer post de una serie sobre los formatos que usted recibe por defecto al instalar el Magnetic Media compliance report. El objetivo de esta serie es darle más detalles sobre los formatos para que usted pueda hacer las configuraciones necesarias para ejecutarlos correctamente en su sistema.

Para comenzar con la serie sobre los formatos, hablamos sobre lo básico que usted necesita saber sobre los formatos 1005 y 1006.

 

Formato 1005

¿Qué es el formato 1005?

Usted utiliza el formato 1005 para reportar el valor del impuesto sobre las ventas descontable y el valor del IVA generado de las ventas devueltas, anuladas, rescindidas o resueltas, correspondientes al año gravable anterior, por tercero.

En la tabla abajo encuentra la información reportada en la versión 7 del archivo XML que usted declara a la DIAN (Dirección de Impuestos y Aduanas Nacionales) al ejecutar el formato 1005 en Magnetic Media compliance report.

ATRIBUTO | DENOMINACION CASILLA | TIPO | LONGITUD | CRITERIOS
tdoc | Tipo de documento | Int | 2 | De acuerdo a los definidos por resolución. Siempre debe diligenciarse.
nid | Número identificación | string | 20 | Diligenciar sin guiones, puntos, comas o espacios en blanco. Siempre debe diligenciarse.
dv | Dígito de Verificación | Int | 1 | Para el tipo de documento 31 – Nit, si se conoce debe diligenciarse.
apl1 | Primer apellido del informado | string | 60 | En caso de ser una Persona Natural siempre debe diligenciarse.
apl2 | Segundo apellido del informado | string | 60 | En caso de ser una Persona Natural y si se conoce debe diligenciarse.
nom1 | Primer nombre del informado | string | 60 | En caso de ser una Persona Natural siempre debe diligenciarse.
nom2 | Otros nombres del informado | string | 60 | En caso de ser una Persona Natural y si se conoce debe diligenciarse.
raz | Razón social informado | string | 450 | En caso de ser una Persona Jurídica siempre debe diligenciarse.
vimp | Impuesto descontable | double | 20 | El valor debe ser positivo, entero y no debe incluir ni puntos ni comas. Siempre debe diligenciarse.
ivade | IVA resultante por devoluciones en ventas anuladas, rescindidas o resueltas | double | 20 | El valor debe ser positivo, entero y no debe incluir ni puntos ni comas. Siempre debe diligenciarse.

Usted encuentra más informaciones sobre los detalles técnicos de los formatos en el portal de la DIAN: https://www.dian.gov.co/.

Cómo configurar el formato 1005

Antes de ejecutar el formato 1005, usted necesita configurar las fuentes de datos para extracción. Para hacerlo, necesita ingresar todos los códigos Indicadores de IVA y codigos clave de operación, relevantes para el formato 1005 en la actividad de customizing Actualizar códigos de identificación de IVA.

Para acceder esa actividad, ir a:  Guía de implementación de Customizing SAP > Gestión financiera > Contabilidad principal > Operaciones periódicas > Declarar > Reporting legal Colombia > Soporte de datos magnético (DIAN) > Configurar fuentes de datos para formatos de informes > Actualizar códigos de identificación de IVA.

En la actividad de customizing Actualizar códigos de identificación de IVA, usted encuentra la siguiente pantalla:

En esa pantalla, usted ingresará todos los indicadores de IVA relevantes para formato 1005. Usted también informa:

  • Esquema
  • Indicador IVA
  • Clave de Cuenta
  • Fecha de Validez Inicial y Final.

Usted no necesita indicar nada en la columna Clasificación de Importe, porque los valores son automáticamente clasificados de acuerdo con su tipo de impuesto (valores de entrada van para la columna VIMP, mientras los valores de salida van para la columna IVADE).

¡Listo! Después de hacer esta configuración, el Magnetic Media compliance report extrae y organiza automáticamente las informaciones relevantes en el archivo XML cuando usted ejecuta el formato 1005.

Mire abajo la secuencia de ejecución del formato 1005:

[Capturas de pantalla: Advanced Compliance Reporting (ACR) – Magnetic Media compliance report; ejecución del formato 1005; analíticos en el formato 1005; archivo XML de salida del formato 1005]

Formato 1006

¿Qué es el formato 1006?

Usted utiliza el formato 1006 para reportar los siguientes valores:

  • El valor de impuesto generado sobre las ventas
  • El valor de IVA recuperado en devoluciones en compras anuladas, rescindidas o resueltas, IVA resultante por devoluciones en venta anulada, rescindidas o resueltas
  • Impuesto al consumo

En la tabla abajo usted encuentra las informaciones reportadas en la versión 7 del archivo XML que usted declara a la DIAN (Dirección de Impuestos y Aduanas Nacionales) al ejecutar el formato 1006 en Magnetic Media compliance report.

ATRIBUTO | DENOMINACION CASILLA | TIPO | LONGITUD | CRITERIOS
tdoc | Tipo de documento | Int | 2 | De acuerdo a los definidos en la resolución. Siempre debe diligenciarse.
nid | Número identificación | string | 20 | Diligenciar sin guiones, puntos, comas o espacios en blanco. Siempre debe diligenciarse.
dv | Dígito de Verificación | Int | 1 | Para el tipo de documento 31 – Nit, si se conoce debe diligenciarse.
apl1 | Primer apellido del informado | string | 60 | En caso de ser una Persona Natural siempre debe diligenciarse.
apl2 | Segundo apellido del informado | string | 60 | En caso de ser una Persona Natural siempre debe diligenciarse.
nom1 | Primer nombre del informado | string | 60 | En caso de ser una Persona Natural siempre debe diligenciarse.
nom2 | Otros nombres del informado | string | 60 | En caso de ser una Persona Natural siempre debe diligenciarse.
raz | Razón social informado | string | 450 | En caso de ser una Persona Jurídica siempre debe diligenciarse.
imp | Impuesto generado | double | 20 | El valor debe ser positivo, entero y no debe incluir ni puntos ni comas. Siempre debe diligenciarse.
iva | IVA recuperado en devoluciones en compras anuladas, rescindidas o resueltas | double | 20 | El valor debe ser positivo, entero y no debe incluir ni puntos ni comas. Siempre debe diligenciarse.
icon | Impuesto al consumo | double | 20 | El valor debe ser positivo, entero y no debe incluir ni puntos ni comas. Siempre debe diligenciarse.

Usted encuentra más informaciones sobre los detalles técnicos de los formatos en el portal de la DIAN: https://www.dian.gov.co/.

Cómo configurar el formato 1006

Usted también necesita configurar fuentes de datos para extracción en el formato 1006. Para hacer eso, usted necesita ingresar todos los códigos Indicadores de IVA y códigos clave de operación, relevantes para el formato 1006 en la actividad de customizing Actualizar códigos de identificación de IVA.

El camino para acceder a esa actividad es el mismo de formato 1005 (mirar las imágenes). Ir en Guía de implementación de Customizing SAP > Gestión financiera > Contabilidad principal > Operaciones periódicas > Declarar > Reporting legal Colombia > Soporte de datos magnético (DIAN) > Configurar fuentes de datos para formatos de informes > Actualizar códigos de identificación de IVA.

En este punto, después de indicar los indicadores de IVA – uno a uno – en la columna Esquema, usted tiene que especificar si cada uno de los códigos de esquema es un valor de impuesto al consumo en la columna Clasificación de Importe.

Valores que sean clasificados como impuesto al consumo en la columna Clasificación de importe van para la tabla ICON. Otros valores son automáticamente clasificados de acuerdo con su tipo de impuesto (valores de entrada van para la tabla IMP, mientras valores de salida van para la tabla IVA).

¡La configuración del formato 1006 está lista! Ahora el Magnetic Media compliance report extrae y organiza automáticamente las informaciones relevantes en el archivo XML cuando usted ejecuta ese formato.

Mire abajo la secuencia de ejecución del formato 1006:

[Capturas de pantalla: Advanced Compliance Reporting (ACR) – Magnetic Media compliance report; ejecución del formato 1006; analíticos en el formato 1006; archivo XML de salida del formato 1006]

 

Esperamos que este post le sea útil.

¿Usted siente que falta alguna información de Magnetic Media compliance report? Déjanos saber en los comentarios.  Su duda puede ser el asunto de nuestro próximo post.

Y no olvides seguir el tag SAP S/4HANA Finance para saber todas las noticias de Magnetic Media compliance report.

 

Hasta lo próximo post,

Rosana

 

¿Quiere leer este post en inglés?

 

Si te gustó este post, te puede gustar este otro:

Reporte de Medios Magnéticos: Lo que usted socio tiene que saber

 

It is now quite common for SAP Inside Tracks to happen on a Saturday. The Friday before is often used for a code jam.

Attending an event on a Saturday is convenient and cheap for the participants: there is no need to use vacation days or to arrange the free time with the company or customer, and often the event is hosted in one's own city.

This is different from the perspective of a speaker who attends multiple SAP Inside Tracks during the year. She or he has to pay for hotel and transportation out of their own pocket.

For some speakers a further problem comes up: they have to explain to their family that they will not be at home on the weekend, and this for every SAP Inside Track they want to participate in. In my case this limits the number of SAP Inside Tracks I am able to attend.

So I propose:

  • SAP Inside Tracks should more often take place on working days
  • SAP Kids Tracks should remain on Saturdays

If an SAP Kids Track is held on the Saturday after the SAP Inside Track, it is possible to attend both events. I will do this as often as I get a free Saturday.

What do you think?