
The Case for Package Software

Before you even start considering package software, you need a good understanding of the business process you are trying to enable. More importantly, you need to know whether this business process is a key differentiator that makes your company successful or a supporting function like accounting. Package software for the most part excels at enabling these supporting functions. Many business processes fall in between: they are more than support functions, but not so highly specialized that they are truly differentiators. These often appear in supply chain, sales force automation, call center management, and product lifecycle management. Finally, there are industry-specific processes. Many large package software companies like Oracle and SAP have industry vertical solutions that attempt to provide standard software for these processes.

In this first part we will dig into the following topics: Standard Processes, Speed-to-Market, Legacy Systems, On-premise Software, and Software as a Service.

Standard Processes
Package software is well suited to enable standard processes such as finance or human resources. As organizations start to consider package software, the first exercise they need to go through is a classification of their business processes into categories such as differentiating, competitive, and non-competitive. Enabling competitive and non-competitive processes with package software is a great place to start. Most differentiating processes will be too unique to enable easily in a package software solution. While there are reasons to leverage package software for differentiating processes as well, that is a topic for another post.

Speed-to-Market
Package software promises speed-to-market if you are able to accept its standard, out-of-the-box approach to your processes. This puts the onus of change on the business rather than the IT organization. Package solutions that let you tweak their standard processes through configuration can still enable your business process while staying closer to how you work today. Either way, there are several key areas you must think through in order to move fast.

The first is data. Will you convert data or start your transactional history from scratch? Do you know where your master data lives, and will you enable it in the new system or integrate with another system?

The second is integrations. How many upstream and downstream systems are required to integrate into the new solution? How flexible are those systems in changing their data structures to match the new package system? Will you need to create a translation layer to manage the data between the systems?
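The translation layer mentioned above can be as simple as a field-mapping step between the legacy schema and the package system's schema. Here is a minimal sketch in Python; all field names are hypothetical:

```python
# Minimal sketch of a translation layer between a legacy system and a
# new package system. All field names here are hypothetical examples.

LEGACY_TO_PACKAGE = {
    "cust_no": "customer_id",
    "cust_nm": "customer_name",
    "crdt_lim": "credit_limit",
}

def translate(legacy_record):
    """Map legacy field names to the package system's schema,
    dropping fields the new system does not know about."""
    return {
        new_field: legacy_record[old_field]
        for old_field, new_field in LEGACY_TO_PACKAGE.items()
        if old_field in legacy_record
    }

record = {"cust_no": "C-1001", "cust_nm": "Acme Corp",
          "crdt_lim": 50000, "legacy_flag": "X"}
print(translate(record))
```

In practice this mapping lives in middleware or an integration platform rather than application code, but the core decision is the same: the map itself becomes the contract between the two systems, so changes to either side show up in one place.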

The third is performance. Do you have the internal infrastructure to manage the peak loads of the new system? Have infrastructure timelines been aligned to the overall implementation timelines? Do you know how the new system will scale?

Understanding these three components of data, integrations, and performance is critical to moving fast. The good news is that many software vendors deal with these issues in every implementation and in some cases have built methods to address them. Unlike custom-developed software, package software gives you a starting point for managing these risks to your speed.

Legacy Systems
We touched upon legacy systems a bit in the speed-to-market section above. So let's dig a bit deeper into Data, Integration, Performance, and Reporting.

Starting with data, you will need to understand the quality of the data in the legacy environment. If you are converting from a custom-developed application or even an old package solution, you most likely have more flexible data rules than the new system does, which will inevitably lead to challenges during data conversion. The payoff is structured data that ensures quality and consistency, easing future data consumption. Start data activities in parallel with the rest of the implementation; otherwise this track will lag and ultimately hold up your go-live.
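One way to surface the conversion challenges early is to run legacy records through the new system's stricter rules before loading anything. A minimal sketch, with hypothetical field names and rules standing in for the package system's actual validations:

```python
# Hypothetical sketch: pre-check legacy records against the stricter
# data rules of a new package system, before attempting conversion.

def validate(record):
    """Return a list of rule violations for one legacy record.
    An empty list means the record would convert cleanly."""
    errors = []
    # Example rule: the new system requires a customer ID on every record.
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    # Example rule: credit limit must be a non-negative number.
    limit = record.get("credit_limit")
    if not isinstance(limit, (int, float)) or limit < 0:
        errors.append("invalid credit_limit")
    return errors

legacy_extract = [
    {"customer_id": "C-1001", "credit_limit": 50000},
    {"credit_limit": -5},  # fails both example rules
]
for rec in legacy_extract:
    print(rec, "->", validate(rec))
```

Running this kind of check against a full legacy extract gives you a defect count per rule, which is far cheaper to work through before go-live than during a failed load.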

Reporting should also start in parallel with data and the rest of the program. Understanding what the new software package enables is important, and educating users on what is available out of the box is critical. What is usually not available from a software package is ad-hoc reporting. If this is a requirement, you will need to move data between the transaction system and your data warehouse. Likewise, if analytical reports are required beyond standard historical reporting, these too may need to be built off your data warehouse, although many software vendors are moving to in-memory systems that allow very fast analysis without migrating data between systems. These challenges are similar for custom solutions and packages alike, because running data analysis directly against the transaction system creates performance problems.

Integration is another challenge in making software packages work in a heterogeneous architecture. Embedding your integration team in the overall implementation team is a great way to ensure integrations are completed on time. Cataloguing your integrations and determining whether you can leverage a services architecture instead of point-to-point integration may ease development and simplify your architecture in general. The biggest challenges in developing integrations are incompatibilities in data structures between systems, performance of the interface, and access to knowledgeable subject matter experts on both systems. These challenges hold true for custom and package software solutions alike.

Finally, the performance of legacy systems, and more importantly of the interfaces, is often overlooked. For example, if there is an expectation of real-time data but the legacy source system operates on a nightly batch, there is no way to deliver real-time updates. Interface performance tends to be a more common challenge with legacy systems: when data flowing through an interface is disrupted, the new software package is left without the data the business needs to do its job. Fault tolerance in your interfaces is required to ensure a seamless experience for your users. Finally, the package itself needs to be performance tested. The software vendor will have statistical data demonstrating performance, but you must plan to validate it in your own environment.
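A basic form of the interface fault tolerance described above is retrying a failed transfer with backoff, so a transient disruption does not silently leave the package system missing data. A minimal sketch, assuming the transport raises `ConnectionError` on a transient failure:

```python
import time

def send_with_retry(send, payload, retries=3, backoff_s=1.0):
    """Call a flaky interface, retrying with exponential backoff.

    `send` is whatever function pushes one payload across the
    interface; it is assumed to raise ConnectionError on a
    transient failure. The last failure is re-raised so it can be
    logged and the payload queued for later replay.
    """
    for attempt in range(retries):
        try:
            return send(payload)
        except ConnectionError:
            if attempt == retries - 1:
                raise  # exhausted; let monitoring catch it
            time.sleep(backoff_s * 2 ** attempt)
```

Production integration platforms add persistent queues, dead-letter handling, and alerting on top of this pattern, but retry-with-backoff is the core mechanism that keeps a briefly unavailable legacy system from becoming a data gap for the business.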

On-Premise Software vs. Software as a Service (SaaS)
Software as a Service has grown tremendously over the past decade, including traditional software vendors adding SaaS offerings to their product lists. The hard part is deciding whether SaaS, hosted, or on-premise software is the way to go. In evaluating these options, the same themes emerge as points of consideration: Security, Performance, Customizations, and Support.

Let’s start with on-premise software. If your business is concerned about data security, hosting your solution within your own data center, or on dedicated hardware within your data center provider’s offering, is a must. By hosting the solution yourself, you control who has access to the data, control your network and physical access, and avoid the risk of security breaches while data is in transit. Many SaaS providers do not segregate your data from that of their other customers, increasing the risk that your data could be exposed inappropriately.

On-premise software also puts you in control of performance. You are able to scale as you need to. If you have integrations from your software package to legacy systems, keeping data movement within your network is often faster than moving it over the Internet to your SaaS provider.

If you require customizations, having the source code on premise lets you control and manage them as you please. SaaS providers gain their scale by not allowing customizations and typically prohibit customers from making any change to the core code, limiting the extensibility of the solution.

Finally, there is support. On-premise software supported by your company falls within your own support SLAs. SaaS providers may not align to your internal SLAs, and for upgrades and outages their business models often do not require them to avoid your business-sensitive times. For example, a SaaS provider may have an eight-hour SLA for a severity-one issue while your internal SLA for severity-one issues is four hours.

SaaS solutions have many benefits around Speed-to-Market, Scalability, Costs, and Performance. Because customizations and configurations are limited, SaaS is often very turnkey, which drives speed-to-market. Pricing tends to cost less at the start because of the subscription model versus the license model of traditional software, and you do not have to invest in costly infrastructure or in the time to order, install, and configure hardware.

From a performance perspective, many SaaS providers invest heavily in their data centers beyond what most companies may invest in their own data centers, leading to higher availability. The issue is that when they do have an outage, you don’t have control over recovery.

When considering SaaS solutions focus on the following business functions as a place to start: travel and expense management, training, knowledge management, sales force automation and talent management.

Please add your own perspectives on your experiences with Package Software. In the next installment we will discuss the Secrets of Making Packages Work in more depth. Topics will include a deeper dive on integration, data conversion, user interfaces, customizations, skills, and system integrators.

