Business-Centric Testing: A Key Requirement for New Regulatory-Driven Systems

PCQ Bureau

Prakash Kini, Director-Technology at Sapient Global Markets

Governing bodies for the capital and commodity markets and trading are striving to make the markets, their instruments and products more transparent and less susceptible to fraud, manipulation and systemic risk. Along with increasing transparency, these measures also aim to improve competition, efficiency and liquidity. As a result, market participants are navigating several new regulations.

New regulations

Let's look at three regulations that are impacting core trading strategies and processes: the Dodd-Frank Act, Basel III and UCITS IV.

The Dodd-Frank Act

The Dodd-Frank Act mandates the reporting of information by all funds that can be used to determine whether the fund poses systemic risk to the industry. It also prevents banks from participating in proprietary trading and investing in hedge funds. It mandates central clearing of swaps, electronic trading (execution) of OTC derivatives using swap execution facilities (SEFs) and real-time reporting of swaps information to central swap data repositories (SDRs). Finally, it attempts to promote consumer financial protection and tight scrutiny of government emergency lending procedures.

Basel III

Basel III strengthens capital adequacy requirements and credit exposure risk management. It regulates the build-up of leverage, strives to reduce risk measurement errors and promotes central clearing.

UCITS IV

UCITS IV focuses on improving the quality of investor information and the diversity of investor products. It facilitates European cross-border market access, fund management and asset pooling and, therefore, economies of scale. It also seeks to strengthen the supervision of, and enhance cooperation between, supervisors.

The business impact of regulations

The directives outlined in these new regulations impact the business processes of today's market participants. There is a renewed need to prove that trading activities are within boundaries and initiated by a customer. This requires additional metrics to be reported, and safeguards against prohibited investments to be installed through pre- and post-trade compliance checks and quantitative analytics.

There is greater scrutiny around the measurement and reporting of forward price calculations; enterprise-wide trading mark-to-market (MTM) and profit and loss (P&L) exposure; risk and accounting numbers; consumer product design; and fairness in execution procedures.

Further, there is a need for improved enterprise risk management, including systemic and liquidity risk measurement and management, capital adequacy provisions and counter-party collateral and credit exposure management.

The key to meeting regulatory requirements is high-quality data, precise and timely data exchange, and strong data governance. There can no longer be silos between internal desks and operational groups. Instead, tighter integration between internal trading desks and associated risk and trading operations will provide the enterprise-wide information views needed for decision making, risk control and regulatory reporting.

In light of all these new requirements, it will be important for clearing houses, exchanges and inter-dealer brokers to develop the ability to handle the electronic clearing and execution of OTC products across such asset classes as credit, equity, FX and commodities. These bodies must be able to publish market prices for the products traded on their platforms. What's more, these platforms must build client-facing information portals that provide real-time information to market participants about orders and their clearing and execution status.

The regulatory changes require market participants, service providers, regulatory bodies and central banks to make major investments in the following areas:

Reference data systems: With a renewed focus on data governance, data quality, data flow and cleansing processes, there is a greater need for explainable forward prices and tighter contract and collateral management.

Valuation and risk engines: Market participants need to build up their ability to explain key financial numbers and measurement calculations used. Plus, they need to invest in better asset optimization tools to ensure that the best possible service is being provided to customers at optimal cost levels.

Data warehouse, data flows and data aggregation: Because data quality is important, organizations must produce an enterprise-wide view of exposure, risk, credit risk and collateral. For many firms, this will require an investment in data warehouses to handle large, global, enterprise-wide data volumes. The new systems must maintain longer histories and audit trails, explain information flows and record data at more granular or denser levels (e.g., time series data).

Trading systems (front, middle and back office): Tighter straight-through processing (STP) will be needed to handle electronic OTC clearing and execution, along with tighter integration with central swap clearing, execution and depositories. Plus, market participants will need to set up stricter post-trade compliance alerts and enterprise-wide limit checks, and deepen their ability to audit and prove customer initiation of trading activity.

Real-time surveillance monitoring tools: Additional investments must be made in pre-trade compliance systems to enable the aggregation of trading data/events into real-time in-memory caches, the identification of patterns (complex event processing), the production of real-time alerts, and the blocking of prohibited investments or trades that violate set limits.
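The kind of pre-trade limit check described above can be sketched in a few lines. The order fields, limit values and prohibited-instrument list below are hypothetical assumptions; a real system would pull limits and restricted lists from central services and evaluate them against an in-memory cache for speed.

```python
from dataclasses import dataclass

@dataclass
class Order:
    trader: str
    symbol: str
    notional: float  # order value in USD

# Hypothetical limits and restricted list; real systems load these
# from central limits and compliance services.
TRADER_LIMITS = {"desk_a": 5_000_000.0}
PROHIBITED_SYMBOLS = {"RESTRICTED_CORP"}

def pre_trade_check(order: Order, current_exposure: float) -> tuple[bool, str]:
    """Return (allowed, reason): block prohibited names and limit breaches."""
    if order.symbol in PROHIBITED_SYMBOLS:
        return False, "prohibited instrument"
    limit = TRADER_LIMITS.get(order.trader, 0.0)
    if current_exposure + order.notional > limit:
        return False, "limit breach"
    return True, "ok"
```

A production check would also emit a real-time alert on every block, since the regulations require the violation itself to be reported, not just prevented.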

Business Intelligence (BI) reporting tools: With a strong focus on information accuracy, transparency and timeliness, there is a need for improved analytics and risk management facilities for consumer products that involve quantitative analytics and modeling, what-if analysis and stress testing.

Data interfaces and information portals: Under the new regulations, market participants must be able to support larger data volumes, finer information granularity and real-time data transfers to regulatory and central service providers, such as clearing houses and central swap data repositories. Service providers need to build client-facing information portals and information feeds that aggregate large amounts of data from multiple sources, rendered in a user-friendly, rich graphical user interface (GUI). Examples include intuitive charting and graphing, drill-down, slice-and-dice and 3D visualization.

These requirements could necessitate a full or partial re-platforming of existing systems, resulting either in the selection and implementation of new cross-asset trading packages and consolidation of trading processes onto the package, or in major bespoke development to enable data integration.

With the many system changes and enhancements required based on these new regulations, market participants will need to use a wide range of verification and validation testing techniques and tools including:

Data foundation: There is a need for data analysis and testing, including data quality analysis and cleansing, data issue investigation and resolution that involves tracing through all the upstream systems and downstream data aggregations, transformations and calculations.
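A minimal sketch of the first step, the regular data quality report, might look like the following. The record structure, required fields and range checks are illustrative assumptions, not any firm's actual rules.

```python
def data_quality_report(records, required_fields, range_checks):
    """Scan trade records and report data quality issues.

    records: list of dicts (one per trade/reference row)
    required_fields: field names that must be populated
    range_checks: {field: (lo, hi)} plausibility bounds (illustrative)
    Returns a list of (record_index, field, issue) tuples for
    business users to investigate and trace upstream.
    """
    issues = []
    for i, rec in enumerate(records):
        for f in required_fields:
            if rec.get(f) in (None, ""):
                issues.append((i, f, "missing"))
        for f, (lo, hi) in range_checks.items():
            v = rec.get(f)
            if v is not None and not (lo <= v <= hi):
                issues.append((i, f, "out of range"))
    return issues
```

Each flagged issue becomes the starting point for the collaborative investigation the article describes: tracing the bad value back through upstream sources, transformations and calculations.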

Trade processing systems: The focus is on tighter business process (functional) testing of data capture, business workflows and underlying business validations and business rules. Large re-platform and system/package upgrade programs require considerable regression and parallel testing to prove that system changes have “zero impact” (numbers mostly tie out and residual differences are justified and can be explained) on existing functionality.

This would also involve the following: testing of data migration to the new, re-platformed or consolidated trading systems and to the data warehouse; testing of new or modified business processes (front to back) and underlying system and system-interaction changes; and interface testing of all incoming and outgoing third-party interfaces, as well as all inter-system interactions (messaging, feeds and SOA testing).

To ensure that systems can meet the new transaction and data loads and can scale to future loads, participants will need to conduct robust capacity planning, performance testing and tuning.
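As a minimal illustration of the measurement side of performance testing, the harness below times a callable and reports latency percentiles. The function under test and the sample counts are assumptions; real capacity tests run against the deployed system under representative background load.

```python
import time

def measure_latency(fn, n=200):
    """Call fn() n times and report p50/p95 latency in milliseconds.

    A minimal sketch: real performance tests would exercise a full
    end-to-end scenario (e.g., order entry) rather than a local call.
    """
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1000.0)
    samples.sort()
    return {"p50": samples[len(samples) // 2],
            "p95": samples[int(len(samples) * 0.95) - 1]}
```

Percentiles matter more than averages here: a p95 that breaches the latency budget under load is exactly the kind of capacity problem the article warns about.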

Trade surveillance, data interfaces and regulatory reporting: There is a need for rigorous tie-out testing of business intelligence (BI) report numbers. BI tools and monitors typically become rich internet or thick GUIs and, therefore, testing (and its automation) must be geared up to handle these rich applications.

Testing typically accounts for 30 percent or more of the overall effort in large programs, and if it is not made a priority, it can become a bottleneck. This creates operational risk: critical issues may surface in production and impede trading operations, whether through the need for workarounds, delays in getting the numbers, or penalties resulting from decision-making errors or the calculation and reporting of wrong numbers.

Testing techniques and tools

The unique challenges posed by trading in the capital and commodity markets require trading business-focused testing tools and techniques. For example, data quality testing involves deep data analysis and providing business users with regular data quality reports that include data analysis results, identified data issues and their initial diagnosis. This is followed by collaborative issue investigation and resolution, which involves tracing back through upstream data sources, transformations and calculations. High-volume data testing requires a high-performance data comparison engine that handles large data volumes in a top-down manner: aggregated data (up to a portfolio, product type or region level) is compared first, using error thresholds (e.g., a P&L difference of less than $1 is acceptable). Not only does this speed up data comparison, it also helps identify root causes faster.
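The top-down tie-out described here can be sketched as follows. The trade records, the "portfolio" bucketing key and the $1 threshold are illustrative assumptions; a real comparison engine would work against warehouse-scale data and drill into flagged buckets at finer granularity.

```python
from collections import defaultdict

def aggregate(trades, key):
    """Sum P&L per bucket (e.g., portfolio, product type or region)."""
    totals = defaultdict(float)
    for t in trades:
        totals[t[key]] += t["pnl"]
    return dict(totals)

def tie_out(old_trades, new_trades, key="portfolio", threshold=1.0):
    """Compare aggregated P&L between the old and new systems.

    Buckets whose absolute difference meets or exceeds the threshold
    are flagged as breaks for drill-down; smaller residual differences
    are treated as acceptable (e.g., under $1).
    """
    a, b = aggregate(old_trades, key), aggregate(new_trades, key)
    breaks = {}
    for bucket in set(a) | set(b):
        diff = abs(a.get(bucket, 0.0) - b.get(bucket, 0.0))
        if diff >= threshold:
            breaks[bucket] = diff
    return breaks
```

Comparing at the aggregate level first is what makes the technique fast: only breaking buckets need trade-level investigation.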

Trading systems and BI tools have very high information density (e.g., blotters and grids), complex user interactions (e.g., pivots, slicing, dicing, drag and drop), graphics-rich charting and real-time messaging updates (e.g., monitors and tickers). Testing them requires not only deep functional and usability testing, but also knowledge of how to automate testing of such GUIs (.NET, Silverlight, Java Swing, Yahoo or Google UI toolkits, Delphi, PowerBuilder) and of any third-party controls that have been used, such as pivots and grids.

Business-tier and middleware testing involves exercising complex business rules and inter-system handshakes using messaging (e.g., JMS, TIBCO) and service-oriented architecture (SOA)-based testing tools. Trading systems must also be tested for high transaction volume against a large background of ambient traffic from other transactions. For example, testing the performance of an order-entry scenario in an order management system (OMS) will need to be done alongside such ambient "noise" as incoming executions and allocations. This requires focused test harnesses to be created and utilized, such as a deal/price pump that generates a large load of trades and/or market prices.
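A deal pump of this kind can be sketched as below. The trade fields and the send() callback are hypothetical; a real harness would feed the OMS entry point (e.g., a FIX session or message bus) and scale the volume and pacing to match production traffic profiles.

```python
import random

def deal_pump(send, n_trades=1000, symbols=("EURUSD", "USDJPY"), seed=42):
    """Push a stream of synthetic trades at the system under test.

    send: caller-supplied callable that delivers one trade (hypothetical;
          in practice this would write to a FIX session or message queue).
    The seeded generator makes each load run reproducible.
    """
    rng = random.Random(seed)
    for i in range(n_trades):
        trade = {
            "id": i,
            "symbol": rng.choice(symbols),
            "qty": rng.randint(1, 100) * 1000,   # round lots
            "price": round(rng.uniform(0.9, 1.5), 5),
        }
        send(trade)
    return n_trades
```

Running the order-entry scenario while such a pump saturates the system with executions and market prices is what turns a quiet-system benchmark into a realistic ambient-load test.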
