The banking industry is in the midst of a significant regulatory overhaul. In the wake of the financial crisis, the management of scarce financial resources, such as capital and liquidity, is becoming an increasingly important source of competitive differentiation. Banks need to develop a more accurate reading of the capital and liquidity implications of their business models and product portfolios. The path to compliance is rapidly taking shape, and pressure is building for banks to comply, in relatively short order, or risk their reputations and ratings. Banks must figure out how to create value in a new regulatory environment that has set a much higher standard for bank capital and liquidity in pursuit of greater stability.
IT Requirements Stemming from the New Regulations
The new rules, particularly the capital and liquidity requirements imposed by Basel III and Dodd-Frank, will change the competitive dynamics of the industry and compel many banks to revisit their business strategies. While assessing the impacts on the business, banks must not lose sight of the technical side of reform, which is critical to their ability to stay compliant and competitive in the new regulatory environment.
New demands on functionality, including calculations/valuations and reporting/analytics, and increased pressure on the underlying infrastructure – especially data storage and computing capacity – are stressing existing IT architectures. Three areas, in particular, will require significant upgrades:
- Data management and quality
- Models and calculation/valuation engines
- Reporting/analytics capabilities
Demands for better data management and quality, governance, and transparency are being made by both external and internal stakeholders. Moreover, we are seeing rapidly increasing amounts of structured and unstructured data to integrate and consolidate, greater demands on data availability, and a pressing need to ensure consistency across expansive data sets. Specific requirements for static, position, and market data include the following:
- Integrate all static, transactional, and position data into a unified data repository covering finance and risk relevant data. Users should be able to aggregate transaction data at any level and generate multi-dimensional views of data.
- Integrate data derived and calculated from position data (e.g., present value, sensitivities, exposure, credit risk parameters, and other key figures) into the data repository. Calculated data, including cash flows at the single-transaction level, must be explicitly linked to their underlying transaction data.
- Integrate present and historical market data, including macroeconomic data and bank-specific data, into the data repository.
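The aggregation requirement above can be illustrated with a minimal sketch. The record fields, names, and figures here are hypothetical; a real implementation would sit on the bank's unified finance and risk repository rather than in-memory dictionaries.

```python
from collections import defaultdict

# Hypothetical position records; field names and values are illustrative only.
positions = [
    {"id": "T1", "business_unit": "Rates", "portfolio": "Swaps", "pv": 2.5},
    {"id": "T2", "business_unit": "Rates", "portfolio": "Swaps", "pv": -1.0},
    {"id": "T3", "business_unit": "Credit", "portfolio": "Bonds", "pv": 0.8},
]

def aggregate(records, dimensions, measure):
    """Sum a measure over any combination of dimensions, producing the
    multi-dimensional views of transaction data the requirements call for."""
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[d] for d in dimensions)
        totals[key] += rec[measure]
    return dict(totals)

# A view by business unit, and a finer drill-down by unit and portfolio:
by_bu = aggregate(positions, ["business_unit"], "pv")
by_bu_pf = aggregate(positions, ["business_unit", "portfolio"], "pv")
```

Because each aggregate is derived from the single-transaction records, any summary figure can be traced back to its underlying transactions, which is the explicit-linkage property described above.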
These requirements demand more sophisticated and transparent models and calculation engines, and these engines must be housed in a high-integrity, tightly controlled, and transparent platform. Regulators will expect banks to provide:
- Calculation engines for risk assessment (for example, present value, sensitivities, and exposure), capital ratios, leverage ratio, and liquidity ratios (LCR and NSFR). The engines must be implemented within the risk IT infrastructure and be audit-proof, including versioning, and should enable the monitoring, planning, and forecasting (through scenario analyses) of figures.
- Enhanced predefined and ad-hoc stress-testing capabilities that include both sensitivity and scenario analysis.
- Support for multiple concurrent cash flow scenarios (contractual, expected, and so on).
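The headline Basel III ratios named above reduce to simple quotients; the engineering challenge lies in sourcing and versioning their inputs, not in the arithmetic. The sketch below shows the ratio definitions together with a simple scenario-shock stress test; all numeric inputs are illustrative.

```python
def liquidity_coverage_ratio(hqla, net_outflows_30d):
    """Basel III LCR: high-quality liquid assets divided by total net
    cash outflows over a 30-day stress horizon (regulatory minimum 100%)."""
    return hqla / net_outflows_30d

def net_stable_funding_ratio(asf, rsf):
    """Basel III NSFR: available stable funding over required stable
    funding (regulatory minimum 100%)."""
    return asf / rsf

def leverage_ratio(tier1_capital, total_exposure):
    """Basel III leverage ratio: Tier 1 capital over the total
    (non-risk-weighted) exposure measure."""
    return tier1_capital / total_exposure

# Illustrative stress scenario: a hypothetical 25% shock to cash outflows.
base_hqla, base_outflows = 120.0, 100.0
lcr_base = liquidity_coverage_ratio(base_hqla, base_outflows)
lcr_stressed = liquidity_coverage_ratio(base_hqla, base_outflows * 1.25)
```

In this example the bank passes the 100% LCR floor in the base case (1.2) but breaches it under the outflow shock (0.96), which is exactly the kind of forward-looking scenario analysis the requirement describes.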
Enhanced reporting capabilities will be critical to allocating financial resources. IT infrastructures will need to provide dynamic information at any level and across any dimension, with the capability to drill from summary information down to the supporting details:
- Provide near real-time capital and liquidity reports (for example, capital ratios, leverage ratio, LCR, NSFR), including limits and limit utilization for all relevant figures.
- Provide consolidated reports on both the accounting and economic views of expected losses, valuations, and hedges, including hedge effectiveness analyses for single risk types.
- Provide full access to all portfolio data (master, transactional, position, and derived or calculated data on all financial instruments) at the single-transaction level as well as at different aggregation levels (for example, portfolios, business units, or enterprise). The results of calculations, scenarios, and stress tests should be accessible on different aggregation levels, as required.
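The drill-down requirement can be sketched as a single access path that serves both the aggregated and the single-transaction views. The data and field names are hypothetical placeholders for the repository described earlier.

```python
# Hypothetical trade records; a real report would query the unified repository.
trades = [
    {"id": "T1", "portfolio": "Swaps", "pv": 2.5},
    {"id": "T2", "portfolio": "Swaps", "pv": -1.0},
    {"id": "T3", "portfolio": "Bonds", "pv": 0.8},
]

def drill_down(records, portfolio=None):
    """Return a portfolio-level summary when no portfolio is selected,
    or the underlying single-transaction detail when drilling into one."""
    if portfolio is None:
        summary = {}
        for t in records:
            summary[t["portfolio"]] = summary.get(t["portfolio"], 0.0) + t["pv"]
        return summary
    return [t for t in records if t["portfolio"] == portfolio]

summary = drill_down(trades)           # portfolio-level totals
detail = drill_down(trades, "Swaps")   # the underlying transactions
```

Serving both views from the same records guarantees that summary figures always reconcile to transaction-level detail, which is what makes reports defensible to auditors and regulators.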
Are you prepared for the future of financial services? Learn how to compete and stay compliant by registering for the SAP Financial Services Forum in New York City, September 18-19.
Mike Russo, Senior Industry Principal – Financial Services, SAP