Managing Risk Data: Financial Instrument Terms and Conditions

An instrument’s terms and conditions lie at the heart of cash flow generation and valuation. Not surprisingly, errors in terms and conditions can drive errors in valuation. Fortunately, fixing these errors is often straightforward, provided the terms and conditions data is readily available, which is not always the case for private placement instruments.


Security Identification: What’s in a Name?

Over the years, financial markets have adopted several different standards to identify instruments, including:

  • ISIN
  • CUSIP
  • Bloomberg ID
  • RIC

Unfortunately, most of these identifiers cover only listed and registered securities. Over-the-counter derivatives, futures contracts, and other unlisted or private transactions are identified using either exchange-generated symbols or vendor-specific nomenclature.

Risk systems must adapt to these different naming conventions, often without a clear indication in the input data of which convention applies. Intelligent identification can be implemented in a few different ways. Traditionally, a rules-based approach has been used to distinguish naming conventions (e.g., CUSIPs are 9 characters long, while ISINs are 12 characters long).
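
As a minimal sketch, such length-based rules might look like the following (real systems also validate check digits and issuer ranges):

```python
def classify_identifier(raw: str) -> str:
    """Classify a raw security identifier by simple length-based rules."""
    s = raw.strip().upper()
    if len(s) == 12 and s[:2].isalpha() and s[2:].isalnum():
        return "ISIN"   # 2-letter country code + 9 alphanumerics + check digit
    if len(s) == 9 and s.isalnum():
        return "CUSIP"  # 9 alphanumeric characters
    return "UNKNOWN"    # defer to other conventions or human review
```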

Regular expressions improve on a purely rules-based approach by also identifying derivative contracts, whose naming conventions embed terms and conditions. For example, equity options are identified in the form: <stock ticker><expiration><call / put><strike>
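
A sketch of such a pattern for OCC-style equity option symbols follows; the exact convention varies by exchange and vendor, so the format below is an assumption for illustration:

```python
import re

# OCC-style symbol: root ticker, YYMMDD expiration, C/P flag,
# strike price scaled by 1000 and zero-padded to 8 digits.
OPTION_RE = re.compile(
    r"(?P<root>[A-Z]{1,6})(?P<expiry>\d{6})(?P<cp>[CP])(?P<strike>\d{8})"
)

def parse_option_symbol(symbol: str):
    m = OPTION_RE.fullmatch(symbol.replace(" ", ""))
    if m is None:
        return None  # not an option symbol under this convention
    return {
        "root": m["root"],
        "expiry": m["expiry"],                    # YYMMDD
        "type": "call" if m["cp"] == "C" else "put",
        "strike": int(m["strike"]) / 1000.0,      # e.g., 00195000 -> 195.0
    }

# parse_option_symbol("AAPL240621C00195000")
# -> {'root': 'AAPL', 'expiry': '240621', 'type': 'call', 'strike': 195.0}
```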

But these two approaches may not entirely resolve naming conflicts. For instance, some futures contracts are named differently across sources, and certain equity tickers conflict with futures contract symbols. Such conflicts are typically resolved by human intervention and require conversations with the client.

Recent advances in machine learning research can be applied to resolve such conflicts. An approach that learns from the client's portfolio structure and historical trading patterns can identify instruments and catch potential data issues. We cover the application of machine learning in market risk management in a separate blog post.


Field Normalization

Risk management providers and groups source terms and conditions from best-of-breed data vendors. Risk and pricing models require instrument terms in a consistent format, but each data vendor uses its own naming conventions and codes for categorical fields across instruments and markets. Day-count conventions, for instance, are specified in several different ways across vendors, asset classes, and markets.

A typical multi-asset risk management framework may require more than 300 fields to accurately run pricing models and underlying predictive models. Risk systems must not only normalize the incoming sources but also map each field appropriately to each pricing routine. Maintaining these mappings and catching issues can be handled with a rule-based mapping engine.
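
A minimal sketch of such a mapping engine, assuming hypothetical vendor codes (production mappings cover hundreds of fields per asset class):

```python
# Map (vendor, raw code) pairs to the normalized day-count enumeration
# expected by the pricing routines. Vendor names and codes are illustrative.
DAY_COUNT_MAP = {
    ("vendor_a", "ACT/360"): "Actual360",
    ("vendor_a", "30/360"):  "Thirty360",
    ("vendor_b", "A360"):    "Actual360",
    ("vendor_b", "ISDA"):    "ActualActualISDA",
}

def normalize_day_count(vendor: str, code: str) -> str:
    try:
        return DAY_COUNT_MAP[(vendor, code.strip().upper())]
    except KeyError:
        # Surface unmapped codes as data issues instead of guessing.
        raise ValueError(f"Unmapped day-count code {code!r} from {vendor}")
```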


Using a Managed Service

A few risk providers offer a fully managed service, in which the provider also sources and manages the incoming data, including terms and conditions. Using a managed service takes the burden of T&C normalization off the risk team, freeing analysts and developers to work on higher value-added tasks. For example, RiskSpan's RiskDynamic service normalizes data stored in the native source format to a published XML message format. The general-purpose XML processing language XSLT maps these normalized data messages to various pricing engines, and XSL transformation scripts are maintained for the various flavors of pricing engines used by clients.
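
To illustrate the mechanics (element names here are assumptions, not RiskDynamic's actual schema), a transform of this kind can be applied in Python with lxml:

```python
from lxml import etree

# A toy stylesheet mapping a normalized T&C message to a pricing-engine input.
xslt = etree.XML(b"""
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/instrument">
    <pricerInput>
      <id><xsl:value-of select="isin"/></id>
      <dayCount><xsl:value-of select="terms/dayCount"/></dayCount>
    </pricerInput>
  </xsl:template>
</xsl:stylesheet>
""")
transform = etree.XSLT(xslt)

message = etree.XML(
    b"<instrument><isin>US0000000000</isin>"
    b"<terms><dayCount>Actual360</dayCount></terms></instrument>"
)
print(transform(message))  # serialized <pricerInput> document
```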

Normalization and mapping of instrument characteristics ensure a uniform application of scenarios and stress tests across all risk factors. Risk factors for an instrument are easily identifiable when a uniform normalization model is applied.

In addition to integration with pricing engines, normalization of terms and condition is also essential for portfolio reporting.  A normalized data store enables reports to be stratified and aggregated by instrument characteristics.


Data Scrubbing and Validation

Pricing and risk models depend on accurate information about securities, structures, and underlying assets. Data vendors are not always provided with the terms of a new security in digital format, which leads to data entry errors. Additionally, information that is critical for risk may not be essential for booking a trade. As a result, the data in a vendor's security master may need scrubbing and validation.

The RiskDynamic service also uses the XSL-based transformation engine to validate data fields for reasonableness and consistency. The engine is configured to perform basic validations on numerical values, such as coupon rates, and to allow for intelligent ranges for derivative structures. The validation framework also looks for consistency within a structure: for instance, that call schedules or frequencies match the indicator for call options embedded within a bond.
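
The sketch below illustrates the flavor of these checks in Python rather than XSL; the field names and thresholds are assumptions:

```python
def validate_bond(record: dict) -> list:
    """Return a list of validation issues for a bond record (empty if clean)."""
    issues = []
    coupon = record.get("coupon_rate")
    if coupon is None or not 0.0 <= coupon <= 25.0:
        issues.append(f"coupon_rate {coupon!r} outside plausible range")
    # Consistency: the callable flag and the call schedule must agree.
    if record.get("is_callable") and not record.get("call_schedule"):
        issues.append("is_callable set but call_schedule is empty")
    if record.get("call_schedule") and not record.get("is_callable"):
        issues.append("call_schedule present but is_callable not set")
    return issues
```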


Data Source Management

Risk and valuation models require complete and validated structural data for each instrument. It is difficult for any one vendor to provide the necessary level of detail across all asset classes. Although basic indicative information for reporting can be obtained from a single source, the data required for valuation is often available only from specialized vendors.

In addition to managing contractual relationships with multiple vendors, firms must also maintain download processes and API integrations.

Information for the same instrument is sometimes available from multiple sources. A source hierarchy per asset class is necessary to ensure that the best available information is used by valuation and risk models. For instance, collateralized loan obligations (CLOs) are sometimes modeled as level-pay corporate bond structures. Most CLOs, however, have a structural credit hierarchy and payment triggers; using level-pay bond cash flows would lead to erroneous valuation and misleading risk assessment.
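
A minimal sketch of a per-asset-class source hierarchy; the vendor names are placeholders:

```python
# Preferred sources, in order, for each asset class.
SOURCE_HIERARCHY = {
    "CLO":  ["structured_finance_specialist", "generic_vendor"],
    "RMBS": ["loan_level_specialist", "structured_finance_specialist"],
    "CORP": ["generic_vendor"],
}

def best_record(asset_class: str, records_by_source: dict) -> dict:
    """Pick the record from the highest-ranked source that provided one."""
    for source in SOURCE_HIERARCHY.get(asset_class, []):
        if source in records_by_source:
            return records_by_source[source]
    raise LookupError(f"No usable source for asset class {asset_class!r}")
```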

Data sources also update at different times and frequencies. Ideally, risk valuations should be performed with the most recent information. However, it is critical to ensure that the portfolio is valued using consistently updated terms and conditions; all updates should occur before the risk process starts to avoid inconsistency within a risk run.


Data Enrichment

Along with using best-of-breed sources for each asset class, predictive models require enrichment from additional sources. Some use cases are described below:

For structured asset-backed securities, such as mortgage-backed securities, data on the underlying assets is not always as complete and granular as predictive models require. In 2008, when the credit crisis made it necessary for investors to value the credit risk of subprime mortgage-backed securities, the loan-level data attached to RMBS structures was found to be less than adequate. Loan-level data provided by specialized vendors is necessary for credit and prepayment models to generate projections.

For funds that invest in index arbitrage strategies or exchange-traded funds (ETFs), risk metrics will be incorrect if the index or ETF members are not independently evaluated. Members and their corresponding weights must be sourced from multiple providers, and the timing of updates should be coordinated to avoid inconsistencies with portfolio updates.
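
A sketch of look-through decomposition for a single ETF position, assuming member weights have already been sourced:

```python
def look_through(position_value: float, member_weights: dict) -> dict:
    """Decompose an ETF/index position into member-level exposures."""
    total = sum(member_weights.values())
    # Renormalize in case sourced weights do not sum exactly to 1.
    return {member: position_value * w / total
            for member, w in member_weights.items()}

# look_through(1_000_000.0, {"AAPL": 0.07, "MSFT": 0.065, "NVDA": 0.06})
```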

Information is not always available in structured, digital formats; critical data elements are sometimes buried in unstructured notes. Non-performing loans and distressed debt are often modified and restructured, and the revised structural terms are not always compatible with the existing data model, so note or comment fields are used to describe the changes. Natural language processing techniques can be used to extract this information from the notes.
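
As a simplified stand-in for a full NLP pipeline, a pattern-based extractor might pull a modified rate out of a servicer note; the note wording and field are assumptions:

```python
import re

# Matches phrases like "rate reduced to 3.25%" in free-text notes.
RATE_NOTE = re.compile(r"rate (?:reduced|modified) to (\d+(?:\.\d+)?)\s*%", re.I)

def extract_modified_rate(note: str):
    """Return the modified coupon as a decimal, or None if not found."""
    m = RATE_NOTE.search(note)
    return float(m.group(1)) / 100.0 if m else None

# extract_modified_rate("Loan restructured 2019-03; rate reduced to 3.25%.")
# -> 0.0325
```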


Historical Analysis

Data integrity and the ability to audit results are key requirements of any market risk system. Recreating an analysis performed in the past, or evaluating the risk of a strategy as of a prior period, depends on maintaining a snapshot of instrument definitions and market data.

Security masters must track changes to an instrument and provide an as-of view based on the analysis date. Maintaining a historical view also requires the system to support backward compatibility with vendor APIs and data formats.
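
A minimal sketch of an as-of view over dated term snapshots (a real security master would also version the schema itself):

```python
from bisect import bisect_right
from datetime import date

class SecurityMaster:
    """Keeps dated snapshots of each instrument's terms for as-of queries."""

    def __init__(self) -> None:
        self._dates: dict = {}  # sec_id -> sorted list of effective dates
        self._terms: dict = {}  # sec_id -> term dicts, parallel to dates

    def record(self, sec_id: str, effective: date, terms: dict) -> None:
        # Assumes versions arrive in chronological order.
        self._dates.setdefault(sec_id, []).append(effective)
        self._terms.setdefault(sec_id, []).append(terms)

    def as_of(self, sec_id: str, analysis_date: date) -> dict:
        # Latest snapshot effective on or before the analysis date.
        i = bisect_right(self._dates.get(sec_id, []), analysis_date)
        if i == 0:
            raise LookupError(f"No version of {sec_id} on or before {analysis_date}")
        return self._terms[sec_id][i - 1]
```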

Most security masters can track varying coupons for floating-rate notes or amortizing principal for mortgage-backed securities. Restructured distressed instruments, modified mortgage-backed securities, and waterfall rules amended after legal action are some of the more complex changes to track.


User Defined Instruments

Over-the-counter (OTC) derivatives and certain private structures cannot be sourced through vendor data feeds. These instruments must be modeled and stored to facilitate risk valuations.

The International Swaps and Derivatives Association (ISDA) supports a data standard for derivative transactions. Financial Products Markup Language (FpML) is the XML format standard recommended by ISDA for exchanging information on derivatives. Several portfolio management and transaction management systems do not support FpML, however, and investment firms use flat data structures to store OTC derivative terms. As a result, CSV files remain a common medium of data exchange.
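
A sketch of loading such a flat CSV representation of an interest rate swap into structured records; the column names are illustrative, not an FpML or ISDA standard:

```python
import csv
import io

SWAP_CSV = """trade_id,notional,fixed_rate,float_index,effective,maturity
IRS001,10000000,0.0325,SOFR,2024-01-15,2029-01-15
"""

def load_swaps(text: str) -> list:
    """Parse flat CSV swap records, converting numeric fields."""
    swaps = []
    for row in csv.DictReader(io.StringIO(text)):
        row["notional"] = float(row["notional"])
        row["fixed_rate"] = float(row["fixed_rate"])
        swaps.append(row)
    return swaps

# load_swaps(SWAP_CSV)[0]["float_index"] -> 'SOFR'
```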

Risk systems must also provide data templates and user interfaces for clients to model private structures. This feature also helps with proxying of certain asset classes that don’t lend themselves to traditional modeling.


Conclusion

There are many challenges to managing Terms and Conditions data in a multi-asset environment. In the past, risk teams have committed time and personnel to address these data sources with an eye toward minimizing error. This process requires ongoing attention and staffing and can represent a significant cost of managing risk analytics.

In today's environment, many risk management teams are turning to a managed risk service. A managed risk service such as RiskSpan's RiskDynamic leverages this data management across many users, allowing for economies of scale and permitting the risk management team to devote its staff to its primary mission: the identification, quantification, and management of risk.