It has been quite a while since something as interesting as this came up. Old-school ABAP developers like myself, who have found comfort working in ABAP for a decade, become so ingrained in it that we fail to think outside the box and find solace in what we know. Even treading into OO ABAP feels uneasy and is best avoided, but as time ticks on we try to keep pace with changing technology, lest we be left miles behind.

Sometimes we need to come out of our shell and explore the new possibilities of combining technologies to reach a solution. Recent times have felt like a storm passing over, and I am lucky to still be alive 🙂. With clients demanding more bang for the buck and being difficult to appease, we need to find some innovative ways to stay afloat.

Of late, there was a bolt-from-the-blue requirement from the client which sent a chill down the spine, as we were on the verge of completing the build. The requirement from their perspective was simple: they wished, or rather wanted, all the ALV reports to be displayed as PDF. Though we initially cowered a bit, we decided to take it as a challenge, as it hurts the pride of an ABAPer when a comparison is drawn with other legacy software applications which can do such things immaculately. There were also possibilities within SAP, such as spooling the data and converting it to PDF, or using a PDF printer, but those solutions had some inherent shortcomings: as developers we have no control over the form per se, and many finer requirements could not be met using that approach.

From my previous experiences with Adobe forms I knew that such a requirement could not be met by ABAP alone; we needed a multi-faceted approach, more arrows in the quiver. I was aware that Adobe forms can be dynamically manipulated using JavaScript or FormCalc, and I had also come across the concept of dynamically rendering subforms using instance manager scripts. Browsing the Adobe forums helped a lot in coming up with a solution. I came to know that JavaScript Object Notation (JSON) is a way of representing data as a string object model, and that the same can be parsed using JavaScript. Now I needed to find a way to convert SAP data to JSON format. After some browsing and searching we found a class in ABAP which converts an internal table into a JSON string, which was exactly what we wanted, but we had to clone and modify the class a bit, as the JSON it produced was not being interpreted correctly by JavaScript.

So, as the first step, we cloned the class CL_TREX_JSON_SERIALIZER to ZCL_TREX_JSON_SERIALIZER and modified the RECURSE method to wrap the data values in single quotes. We tested the class, and it produced exactly what we required.

Sample Output String

[{'pernr': '14900001', 'begda': '20140922', 'endda': '20140923'}, {'pernr': '14900002', 'begda': '20140922', 'endda': '20140924'}]
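Note that the single quotes make this a JavaScript object literal rather than strict JSON, which is precisely why the standard serializer had to be modified and why the form script can rely on eval instead of a strict JSON parser. A minimal sketch of parsing such a string, using the sample rows above:

```javascript
// Serializer output is single-quoted, so strict JSON.parse would reject
// it, but eval can interpret it as a JavaScript array literal.
var jsonString =
  "[{'pernr': '14900001', 'begda': '20140922', 'endda': '20140923'}," +
  " {'pernr': '14900002', 'begda': '20140922', 'endda': '20140924'}]";

// eval turns the string into an array of plain JS objects
var data = eval(jsonString);

// data.length is the number of rows from the internal table;
// data[0].pernr is the first row's personnel number.
```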

The next step was to design the form, which would render dynamically based on the JSON string. The form is a very simple one: it just has a subform with a table contained in it. The table has one row and a column cell, which is wrapped in a repeatable subform named Column. We also made certain other dress-ups, like the header, footers, and the logo.

Form Layout

Form.PNG

The form has a very simple interface where the data is passed as a JSON string. We did incorporate a few more fields in the interface to help us manipulate the form using JavaScript.

Form Interface

Interface.PNG

Having done this, we needed a small but somewhat involved piece of JavaScript code to parse the JSON string and then build the table automatically using instance managers.

The JavaScript code is fired in the form:ready event of the DIS subform.

FormJS.PNG

The createTable function does the following:

1. It calculates the cell width based on the page width and the number of columns.

2. It splits the JSON string into an array of JavaScript objects; you can simply use the eval function in JS, without any custom library functions.

3. It counts the data length, which is basically the number of rows in the internal table.

4. It counts the number of columns in the internal table; this will be the number of attributes of each JS object in the array.

5. It starts rendering using instance manager commands.
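Sticking only to the portions that run as plain JavaScript, the steps above can be sketched roughly as follows. The helper names are illustrative, not the actual ones from the form script, and the instance manager calls of step 5 are omitted because they exist only inside the XFA environment:

```javascript
// Step 2: parse the JSON string into an array of row objects. eval is
// used because the serializer emits single-quoted, non-strict JSON.
function parseRows(jsonString) {
  return eval(jsonString);
}

// Step 3: number of data rows = length of the parsed array.
function countRows(rows) {
  return rows.length;
}

// Step 4: number of columns = number of attributes on one row object.
function countColumns(rows) {
  var count = 0;
  for (var key in rows[0]) {
    if (rows[0].hasOwnProperty(key)) { count += 1; }
  }
  return count;
}

// Step 1: cell width = usable page width divided evenly by the columns.
function cellWidth(pageWidth, columnCount) {
  return pageWidth / columnCount;
}

var rows = parseRows(
  "[{'pernr': '14900001', 'begda': '20140922', 'endda': '20140923'}]");
```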

ZigZag Rendering

Since we have just one cell, which has to be instantiated as many times as there are cells, we start rendering column by column in a zig-zag fashion, as shown below.

The JS code which instantiates the individual cells is built from dynamically constructed instance manager commands, such as:

xfa.resolveNode(t).Column.instanceManager.addInstance(0);
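Since the SOM path passed to xfa.resolveNode is just a string, it can be assembled in a loop. A small sketch of the idea, where the node hierarchy and the helper below are illustrative rather than the exact names used in the form:

```javascript
// Build the SOM expression for a given row, so that the repeatable
// Column subform inside it can be instantiated once per cell.
// "Table.Row" is an assumed node hierarchy, not the exact one.
function buildRowPath(rowIndex) {
  return "Table.Row[" + rowIndex + "]";
}

// Inside the form script this would then be used as:
//   var t = buildRowPath(i);
//   xfa.resolveNode(t).Column.instanceManager.addInstance(0);
```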

Rendering.PNG

Since the idea of this blog is only to explain the concept, I have not pasted the entire JS code here.

Our next major challenge was how to plug this PDF into all the reports. Fortunately for us, we had developed a reusable ALV display function module to minimize the coding effort for developers on simple ALVs, and this function module was being used in most of the reports. We enhanced the function module's GUI status and created a PDF function which responds by passing the data to the form and displaying it. So once we had enhanced this function module, all the reports got the PDF function with no extra effort; interestingly, this idea was conceptualized and executed in less than a week.

Report and PDF display.

Report.PNG

Clicking on Generate PDF

Formdata.PNG

Some of the challenges we faced after initially implementing the concept:

1. The heading on each page was not getting displayed, as it was dynamically rendered; we overcame this by having separate logic for the header on the master page.

2. Modifying header labels – we replaced the generated headers with labels derived from the data.

3. The next major challenge was to display only the filtered rows or columns in the generated PDF – using ALV-related functions we were able to trace which columns and rows needed to be displayed, and based on that we built the internal table dynamically before converting it to JSON and transferring it to the PDF form.

4. Font type and size control – we were able to dynamically control the font size and type using JavaScript, based on the number of columns to display.
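As an illustration of that last point, the font size can be derived from the column count with a simple threshold function. The cut-offs and sizes below are assumptions for the sketch, not the values used in the actual form:

```javascript
// Pick a font size that shrinks as the number of columns grows.
// The thresholds (5, 10) and sizes (10pt, 8pt, 6pt) are illustrative.
function fontSizeForColumns(columnCount) {
  if (columnCount <= 5) { return 10; }
  if (columnCount <= 10) { return 8; }
  return 6;
}

// In the form script the result would then be applied to the generated
// cells, e.g. through the cell field's font properties.
```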

We still have challenges on the performance front when the data is huge, and also with customizing, due to its generic behaviour, but at the end of the day I believe we were able to conquer some unexplored frontiers; we analysed and joined the dots to come up with a decent solution. Thanks for reading this blog, and apologies for not being technically very detailed.

My sincere thanks to my fellow developer Rajiv Kumar, whose steadfastness made this happen.

Regards,

Raghav
