Forecast Accuracy: It’s All About the Data

A key characteristic of any high-performing organization is its ability to deliver consistent, accurate forecasts to internal and external stakeholders. Spot-on forecasts demonstrate organizational alignment, strong leadership, and an attention to detail that boosts the confidence of employees and investors alike.

Driving your organization to improve forecast accuracy requires a culture of accountability, which in turn can drive financial performance. AMR Benchmark data suggests that companies with the most accurate forecasts have:

  • 10% higher revenues
  • 5% to 7% better profit margins
  • 15% lower overhead costs

To achieve this competitive differentiation, companies need a foundation of relevant and accurate data, a dedicated data integration approach, and appropriate process automation.

Accurate, Relevant Data is a Must

As the philosopher Santayana said, “Those who cannot remember the past are condemned to repeat it.” In this spirit, a strong forecasting process starts with consistent, accurate, and relevant historical data; without this information, bad assumptions can become embedded in the forecast and magnified over time. To support good strategic and operational decisions, forecast participants need high-quality, timely, and relevant data. Any imperfections lead to distractions – for example, users may disproportionately focus on a single piece of erroneous data, such as a missing reversal in January. And if the data is not accurate, business logic based on historical data will propagate errors into the go-forward forecast.
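To see how a single bad data point carries forward, consider the minimal Python sketch below. The trailing-average forecast logic and all of the figures are illustrative assumptions, not any specific SAP calculation.

```python
# Minimal sketch: how one erroneous historical value, such as a missing
# January reversal, skews a naive forecast. Figures are illustrative.

def trailing_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` actuals."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Monthly expense actuals: January is overstated because a reversal was never posted.
actuals_with_error = [120_000, 95_000, 98_000]   # Jan (reversal missing), Feb, Mar
actuals_corrected  = [90_000, 95_000, 98_000]    # Jan after the reversal is applied

print(trailing_average_forecast(actuals_with_error))  # ~104,333: the error is carried forward
print(trailing_average_forecast(actuals_corrected))   # ~94,333: forecast built on clean data
```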

A forecasting process must also allow relevant data to be refreshed and updated during the forecasting cycle. Without this, users tend to collect updated data on their own and enter it into the system manually, which often introduces errors – or, worse, they ignore recent events and let dated assumptions persist. Another concern is data “dumped” in front of users replete with useless fields, codes, and descriptions – for example, field names found only in source systems. This practice greatly decreases the signal-to-noise ratio, forcing users to spend inordinate amounts of time trying to make sense of irrelevant or poorly labeled data. This is not only a data sourcing issue, but also a master data definition problem.
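As one illustration of the master data point, the short sketch below keeps only the fields planners actually use and relabels cryptic source-system codes with business-friendly names. The field names and the mapping are hypothetical examples chosen for this sketch, not a prescribed model.

```python
import pandas as pd

# Sketch: strip source-system noise before data reaches forecasters.
# The technical field codes and the friendly labels are illustrative assumptions.
FIELD_MAP = {
    "KOSTL": "cost_center",
    "HKONT": "gl_account",
    "WRBTR": "amount",
    "BUDAT": "posting_date",
}

def prepare_for_forecasters(raw: pd.DataFrame) -> pd.DataFrame:
    """Keep only the fields planners need, under business-friendly names."""
    slim = raw[list(FIELD_MAP)].rename(columns=FIELD_MAP)
    slim["posting_date"] = pd.to_datetime(slim["posting_date"])
    return slim
```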

Less is Sometimes More

Sometimes forecasters need to focus on less data than they have – an idea that may seem counterintuitive, and can be more difficult to contend with than it sounds. Many financial planning and analysis departments are swimming in data; planners are asked to forecast in excruciating detail. In some cases, the culprit is a philosophy that there must be a forecast number wherever actuals are tracked. Other companies lack either the will or the expertise to streamline information properly.

In either circumstance, analysts spend time chasing minutiae instead of focusing on the important issues: where the business is headed and what, if any, corrective action should be taken. A program of rationalizing the underlying data to support nimble forecasting algorithms and processes can contribute to overall forecasting precision. Best practices include the ability to:

  • Map away one-time events, irrelevant artifacts, and obsolete financial activity
  • Aggregate items that can be forecasted better in total (do you really need 10 separate travel accounts in your forecast?)
  • Develop “80/20” rules for forecasting; sweat the details on the big revenue and expense drivers, then apply generic business logic to the rest
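A minimal pandas sketch of these three steps follows. The account naming convention, the one-time flag, and the 80% cutoff are assumptions made purely for illustration.

```python
import pandas as pd

# Sketch of the three rationalization steps above. Account names, the
# one_time flag, and the 80% cutoff are illustrative assumptions.

def rationalize(actuals: pd.DataFrame) -> pd.DataFrame:
    """Expects columns: account (str), amount (float), one_time (bool)."""
    # 1. Map away one-time events and obsolete activity.
    clean = actuals[~actuals["one_time"]].copy()

    # 2. Aggregate items that forecast better in total, e.g. many travel accounts.
    travel = clean["account"].str.startswith("Travel")
    clean.loc[travel, "account"] = "Travel (total)"
    totals = clean.groupby("account", as_index=False)["amount"].sum()

    # 3. 80/20 rule: forecast the large drivers in detail and apply generic
    #    business logic to the long tail.
    totals = totals.sort_values("amount", ascending=False)
    share = totals["amount"].cumsum() / totals["amount"].sum()
    totals["treatment"] = ["detailed" if s <= 0.8 else "generic" for s in share]
    return totals
```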

Don’t Allow CPA to Stand for “Copy, Paste, Append”

Disparate source systems can pose a challenge for companies. The sheer effort required to acquire, merge, reconcile, and purge all the necessary data can become overwhelming, leaving little time for analysis and action. And each manual touch point is a potential point of failure and a threat to accuracy.

A best-practice forecasting process requires a dedicated data integration approach as well as process automation at every possible step. Extract, transform, and load (ETL) strategies, master data harmonization, and data staging should all be designed and executed with the same attention to detail as the forecast itself.
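To make "automation at every possible step" concrete, here is a small Python sketch of a scheduled load into a staging table. The CSV extracts, the account harmonization map, and the SQLite target are assumptions for illustration only, not the tooling the post refers to.

```python
import sqlite3
import pandas as pd

# Sketch of an automated extract-transform-load step into a staging area.
# File layout, harmonization map, and the SQLite staging table are assumptions.
ACCOUNT_HARMONIZATION = {"41000": "Revenue", "REV-US": "Revenue"}  # hypothetical codes

def load_to_staging(extract_paths, staging_db="staging.db"):
    frames = []
    for path in extract_paths:
        df = pd.read_csv(path)                                        # extract
        df["account"] = df["account"].replace(ACCOUNT_HARMONIZATION)  # harmonize master data
        frames.append(df)
    combined = pd.concat(frames, ignore_index=True)                   # merge once, automatically
    with sqlite3.connect(staging_db) as conn:                         # stage for the forecast model
        combined.to_sql("forecast_actuals", conn, if_exists="replace", index=False)
```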

Building a Foundation for Improved Forecasting

Quick access to clean, logical, and accurate data that reflects your forecasting philosophy is a prerequisite to improving forecast accuracy. SAP consultants have helped companies of all sizes improve their forecasting accuracy and can help your organization:

  • Determine the data needed to support your desired forecasting approach
  • Retrieve, cleanse, integrate, and stage data from multiple source systems
  • Build connections to systems and automate data updates
  • Design and build forecasting systems that capitalize on all of the above

Consultants with Business Analytics Services from SAP have deep experience with SAP technology and can help solve even the most complicated business problems. They routinely work across multiple industries and recognize how forecasting processes differ by industry vertical or line of business (LOB). Forecasting initiatives are defined up front, providing the cost and scope clarity that brings confidence to these types of projects.

For more information on Business Analytics Services from SAP, please visit us online.