
I covered the basics of SAP BusinessObjects Business Intelligence platform publications in an earlier blog post. This post outlines a few useful tips to consider when working with publications. I will look at three different aspects: source documents, dynamic recipients, and publication performance.

Source documents

It is recommended to view and schedule dynamic content documents individually before you add them to a publication. Publications use scheduling to create the personalized documents, so ensure that you can view, refresh, and schedule each publication document on its own. If you can view and schedule dynamic content documents successfully, the data source connection is working properly and the source document data can be refreshed when the publication is scheduled. If you cannot, check that the data source connection settings are correct.

Use publication log files to troubleshoot errors in failed publications. When you schedule publications to run, log files are generated that record any errors that may occur when the publications are processed. To view all log files for a publication instance, click Actions > History. On the “History” page, click the instance link in the Instance Time column.

Try to avoid unnecessary data refreshes. If a data refresh is unnecessary for a dynamic content document, clear the Refresh At Runtime check box for that document in the “Source Documents” section. This will improve overall publication performance. Also consider using the most suitable bursting mode for your publication. For more information on bursting modes, read my previous blog entry.

If you are using parameter-based personalization for Crystal reports, set parameters to default. Parameter-based personalization may lead to slower publication performance. It is highly recommended that you personalize Crystal report publications by mapping fields to Enterprise recipient profiles or to dynamic recipient personalization values. However, if you need to personalize Crystal reports using parameters, in the “Personalization” section, set parameters to Default.

Dynamic Recipients

In general, it is recommended that you sort dynamic recipient sources according to the recipient ID column. This is especially important when you are running a high-volume publication or when you enable One database fetch for each batch of recipients, because it can reduce the number of deliveries for recipients who have multiple personalization values.
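To illustrate why sorting helps, here is a minimal Python sketch with hypothetical recipient data (this is not platform code): when rows are sorted by recipient ID, all personalization values for a recipient form one contiguous run and can be collapsed into a single delivery.

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical dynamic recipient rows: (recipient_id, personalization_value)
rows = [
    ("r2", "EMEA"),
    ("r1", "APAC"),
    ("r2", "AMER"),
    ("r1", "EMEA"),
]

def deliveries(rows):
    """Group contiguous rows per recipient; each group becomes one delivery."""
    return [(rid, [value for _, value in grp])
            for rid, grp in groupby(rows, key=itemgetter(0))]

# Unsorted: recipients alternate, so each value triggers its own delivery.
print(len(deliveries(rows)))          # 4 deliveries
# Sorted by recipient ID: one delivery per recipient with all its values.
print(len(deliveries(sorted(rows))))  # 2 deliveries
```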

For Crystal report dynamic recipient sources, ensure the database configuration information is correct. In the CMC, select the dynamic recipient source and go to Manage > Default Settings to ensure the following:

  • In the “Database Configuration” section, the database logon information is correct and Use same database logon as when report is run is selected.
  • In the “Parameters” section, all parameters have parameter values, and all Prompt when viewing check boxes for parameters are cleared.


Also, if you use Crystal report dynamic recipient sources, consult your administrator to ensure the Report Application Server (RAS) is configured correctly. The RAS must be configured to read at least the same number of database records as the number of recipients in the dynamic recipient source. For instance, to process a dynamic recipient source with data for 100,000 recipients, the RAS must be set to read more than 100,000 database records.

Publication Performance

Next, I want to talk about how you can improve the performance of a publication.

Let’s start with the Adaptive Processing Server (APS). If both CPU and memory for the Adaptive Processing Server are heavily utilized during publication runs, move the Adaptive Processing Server to a faster machine that has more available CPUs and SAP BusinessObjects Business Intelligence platform 4.0 SP4 or later installed; the server will automatically scale to use the additional CPUs. It is also recommended to isolate the Publishing Service and the Publication Post Processing Service on dedicated Adaptive Processing Server instances and to remove unused services hosted on the server. Each additional service consumes shared resources (request thread pool, memory, and CPU) on the Adaptive Processing Server, so removing unused services can improve publishing performance.

There are a few considerations to keep in mind regarding the Publishing Service on the APS. Horizontally scaling out the Publishing Service across multiple APS instances (on one or multiple machines) enables more publication instances to be processed concurrently. In contrast, a single publication job (for example, one with 1,000,000 recipients) is not shared across Publishing Services hosted on different APSs, so horizontally scaling out the Publishing Service will not improve processing time for a single publication, regardless of the number of recipients. For publications with many recipients, vertically scale the APS on machines that have more CPUs and RAM. This enables the Publishing Service to concurrently process more recipients and the APS to generate more jobs.

Because publishing is a disk-heavy process, use a machine with fast I/O or SAN disks for the FRS. For a large publication that does not need to be redistributed or have its artifacts viewed later, use the publishing cleanup option: to clean up artifacts automatically, do not select the default destination.

For Crystal report publications, select One database fetch for each batch of recipients if you do not need to apply unique refresh security for each recipient. Database access will then be batched into multiple concurrent, smaller queries.
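The batching idea can be sketched as follows (an illustrative Python sketch of the concept, not how the platform implements it; all names are hypothetical):

```python
def batches(recipients, batch_size):
    """Yield fixed-size batches; each batch maps to one smaller database query."""
    for i in range(0, len(recipients), batch_size):
        yield recipients[i:i + batch_size]

recipients = [f"user{n}" for n in range(10)]
for batch in batches(recipients, 4):
    # In the platform, each batch would translate into one concurrent query
    # restricted to just these recipients' personalization values.
    print(batch)
```

With ten recipients and a batch size of four, the list is split into three smaller queries instead of one large one.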

For Web Intelligence publications, select One database fetch for all recipients or One database fetch per recipient. When you select One database fetch for all recipients for a large publication, you can break up the database query into multiple smaller atomic queries by entering

                -Dcom.businessobjects.publisher.scopebatch.max.recipients=<integer>

on the command line of all APSs that host the Publishing Service.
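Assuming the flag caps the number of recipients served by each atomic query (my reading of the flag name; consult the official documentation for exact semantics), the number of atomic queries for a run can be estimated as:

```python
import math

def query_count(total_recipients, max_per_query):
    # Each atomic query covers at most max_per_query recipients,
    # so the run needs ceil(total / max_per_query) queries.
    return math.ceil(total_recipients / max_per_query)

print(query_count(1_000_000, 50_000))  # 20 queries
print(query_count(100, 30))            # 4 queries
```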

Last, I want to mention a few tips regarding the Publishing Post Processing Service. The Publishing Post Processing Service is called when the Package as ZIP File check box and/or the Merge Exported PDF check box is selected, or when custom post-processing plugins are enabled on a publication. For publications with both check boxes selected, you must create additional Publishing Post Processing Services to improve publication processing time. Also, the amount of work the Publishing Post Processing Service receives is limited by how the Publishing Service is scaled. Horizontally scaling out the Publishing Post Processing Service spreads the ZIP- and PDF-merging workload across multiple Publishing Post Processing Services hosted on different Adaptive Processing Servers.

An interactive diagram showing the interaction of platform components when a scheduled publication of a Crystal Reports 2011 report is run can be found in this BI tutorial.


9 Comments


  1. Former Member

    Hi,

    Once again excellent work!!!

    Where can I find the document with an interactive diagram showing the interaction of platform components when a scheduled publication of a Webi report based on UNX (not BICS) is run?

  2. Andreas J A Schneider

    It would be nice to have

    1. A sizing guideline for the platform when using publications, e.g.
      I have 1000 recipients, 1 Webi document, returning a total of 100k data rows (one database fetch), which are then filtered according to recipient. Output as PDF.
      How large should my SAPBI4 server be with respect to RAM, SAPS/CPU cores?
    2. A detailed process flow as already mentioned for Webi publications would be most helpful.

    Otherwise thanks for your blog : )

      1. Former Member

        Hi Derrick,

        the Sizing Guide states that:

        For Publishing, it is especially important to understand how many concurrent personalization jobs will be running as part of each publication. For example, if you need to serve three regions in a Publishing job and thus have three separate database queries, that would be equivalent to three active concurrent users. If you are publishing with personalizations that require a query-per-recipient, you need to determine the number of queries that might be able to be processed at once and use that number as the active concurrent user count.

        Now, if I understand it right, when you’re dealing with webi publications with dynamic recipients, you always have one query for all recipients. Therefore, in a webi bursting scenario the active concurrent user count would amount to one.

        Any comments?

  3. Former Member

    Hi Christina,

    in your blog you wrote:

    For Web Intelligence publications select One database fetch for all recipients or One database fetch per recipient

    You can choose one database fetch per recipient only when dynamic recipients are not used, right? So, one database fetch per recipient would be an option only when you personalise through enterprise recipients?

    As, in your other blog on bursting, you cleared up that:

    Webi publication with dynamic recipient only supports “One database fetch for all recipients” bursting mode.


  4. Raghavendra Hullur

    Hi Christina,

    Nice Document.

    I am trying to create a publication for a Crystal Report (created in CR XI R2) and trying to publish it on BO CMC.

    When I have date range parameters (I have Start Date and End Date as two separate date parameters in my source and dynamic reports), I am facing an issue: under Personalization, I am not getting the database objects (the whole command object itself) in the Report Fields dropdown, which is necessary to create the Dynamic Recipient Mapping.

    The same works when I remove the date range parameters and retain other parameters.

    I have created a question http://scn.sap.com/thread/3581373 on the same, but haven’t got any response so far.

    Any help in this regard would be greatly appreciated.

    Thanks,

    Raghavendra

  5. Former Member Post author

    Hi,

    I moved to new responsibilities within SAP a year ago. I am blocking new comments on this thread; please post questions to the BI Platform forum so the community can help further.

    Thanks,

      Christina


Comments are closed.