Reference Data Management in Financial Services


Increasing global regulatory pressure, coupled with a fragmented regulatory landscape, is making financial institutions realize the value of putting a data governance strategy in place. With effective reference data management (RDM), companies can facilitate the seamless flow of clean, consolidated, and accurate data throughout the enterprise. In doing so, they stand to save millions of dollars every year, strengthen the value chain, improve operational efficiency, manage risk effectively, improve customer loyalty, and support sound corporate governance.

The Case for Reference Data Management (RDM)

Even as financial institutions, exchanges, and market participants undergo a fundamental transformation, data management is becoming increasingly challenging. In this context, it is extremely important to manage the creation and maintenance of data to ensure its relevance and to mitigate the risks arising from data inconsistency. Data accuracy and reliability are mission-critical and a key enabler of all business operations, including trade execution, risk management, and compliance reporting.

Data management is the development and execution of architectures, policies, practices, and procedures to manage the information lifecycle needs of an enterprise in an effective manner. Effective data management calls for seamless integration between all elements of the overall data management lifecycle—strategy, governance, operations, review, analysis, and actions.

In most financial institutions, data is spread across multiple regions, departments, and systems. Many of these entities need to reference data pertaining to the parent company, but they cannot do so easily when there is no central source of data. Instead, each entity maintains its own nomenclature and data sources in silos, with redundant systems designed to extract and process data for individual requirements. Apart from being an inefficient design, this approach is extremely costly and prone to data inconsistency.

RDM addresses all the above stated issues. It is a methodology of managing the creation and maintenance of data that can be shared across multiple regions, departments, and systems. It collates data from multiple sources, normalizes it into a standard format, validates the data for accuracy, and consolidates it into a single consistent data copy for distribution.
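The collate, normalize, and consolidate flow described above can be sketched in a few lines of Python. The field names, source identifiers, and priority scheme below are illustrative assumptions, not any particular vendor's schema:

```python
from dataclasses import dataclass

# Hypothetical standard schema; field names are illustrative, not from any vendor feed.
@dataclass(frozen=True)
class SecurityRecord:
    isin: str
    name: str
    currency: str
    source: str

def normalize(raw: dict, source: str) -> SecurityRecord:
    """Map a source-specific dict onto the standard schema."""
    return SecurityRecord(
        isin=raw["id"].strip().upper(),
        name=raw["name"].strip(),
        currency=raw.get("ccy", "USD").upper(),
        source=source,
    )

def consolidate(records, source_priority):
    """Build a golden copy: one record per ISIN, chosen by source priority."""
    golden = {}
    for rec in records:
        held = golden.get(rec.isin)
        if held is None or source_priority.index(rec.source) < source_priority.index(held.source):
            golden[rec.isin] = rec
    return golden
```

A real implementation would add per-field survivorship rules rather than picking whole records, but the shape of the pipeline is the same.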

This white paper analyses the need for reference data management in the financial services industry and delves into the challenges in the absence of effective RDM, critical elements of RDM implementation, and some of the major benefits an organization can derive by implementing a robust RDM solution.

Why Data Management cannot be Ignored Anymore

Fundamental changes in the financial services industry have created a significant impact on data management platforms. Some of the key drivers of change are:

  • Diverse Instruments: In the quest to offer compelling products to customers, brokers/dealers have created many innovative financial instruments. Currently, there are more than eight million instruments, each requiring a firm to maintain detailed, timely, and accurate information. Derivative issues are only one example of financially engineered securities that did not exist a few years ago. These new financial products and their complex terms have become a challenge for executives managing financial information.

Changes in Market Mechanism: Trade execution mechanisms have been altered by the shifting composition of market participants. For example, there has been a rapid increase in the number of hedge funds and the emergence of mega “buy-side” firms, many of which use program trading and other algorithmic execution models. Decimalization and program trading have led to a reduction in the trade size with a corresponding increase in volume. These factors have put a strain on data management platforms as they are required to deliver high volumes of data with low latency to black-box trading systems.

Regulations and Compliance: Regulation and compliance are also key drivers in the march toward an improved data management platform. The emergence of Basel III, Sarbanes-Oxley, and other key risk and compliance considerations has forced firms to place high priority on production of accurate and timely data to feed internal risk management systems. As a result, institutions must now meet a more stringent fiduciary responsibility to provide correct data to regulatory agencies. Faulty information can result in dire consequences and catastrophic financial exposure.

Expanding Role of Data Aggregators: The industry’s demand for a wide range of security attributes and pricing information has given rise to an entire sub-industry populated by vendors who specialize in financial data capture and distribution. These vendors are playing an increasingly significant role in managing and providing data. However, managing multiple sources of data creates cost and consistency issues that must be fixed.

Data Classification: Recognize, Categorize, Then Analyze

Data is not a homogeneous entity. It consists of different categories, each with its own set of characteristics. These categories may have strong dependencies on one another, yet failing to recognize their differences is risky. Projects that do not address the unique nature of each data category will invariably encounter problems and are likely to fail.

Financial services data can be categorized into the following types:

  • Transaction Activity Data: It represents the transactions that operational systems are designed to automate.
  • Transaction Audit Data: It is the data that tracks the progress of an individual transaction such as web logs and database logs.
  • Enterprise Structure Data: This is the data that represents the structure of an enterprise, particularly for reporting business activity by responsibility. It includes organizational structure and charts of accounts.
  • Master Data: Master Data represents the parties to an enterprise's transactions. It describes the entities that interact when a transaction occurs.
  • Reference Data: Reference Data is any kind of data that is used solely to categorize other data found in a database, or solely for relating data in a database to information beyond the boundaries of the enterprise. In financial services, it includes descriptive information about securities, corporations, and individuals.
  • Market Data: In financial services, market data refers to real-time or historical information about prices.
  • Derived Data: Derived data refers to data that is derived from other data. It is calculated by various calculators and models made available to a wide range of applications.
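The distinction between reference data and transaction activity data can be shown with a small example: a static lookup table categorizes trade records by exchange code instead of embedding descriptive attributes in every row. The codes and field names below are illustrative only:

```python
# Reference data: a static lookup that categorizes other data.
# Codes and descriptions are illustrative examples.
EXCHANGE_REF = {
    "XNYS": {"name": "New York Stock Exchange", "country": "US"},
    "XLON": {"name": "London Stock Exchange", "country": "GB"},
}

# Transaction activity data references the table by code rather than
# carrying the descriptive attributes itself.
trades = [
    {"trade_id": 1, "mic": "XNYS", "qty": 100},
    {"trade_id": 2, "mic": "XLON", "qty": 250},
]

def enrich(trade: dict) -> dict:
    """Resolve the reference code into its descriptive attributes."""
    ref = EXCHANGE_REF[trade["mic"]]
    return {**trade, "exchange_name": ref["name"], "country": ref["country"]}
```

Because the reference table changes far more slowly than the transaction stream, the two categories call for different update, validation, and distribution regimes.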

The Many Challenges of RDM in Financial Services

Improving data quality is an ongoing effort, and financial institutions face the challenge of upgrading their technology infrastructure to address it. Reference data management projects are major technology investments aimed at improving data quality. Data integration and the concept of a single source remain a massive challenge, especially in Asia-Pacific (APAC) banks, where data is still managed in silos.

An increasing volume of data means working with multiple data sources. Client data and the single view of the customer are a critical area, driven by regulations such as Anti-Money Laundering (AML) and Know Your Customer (KYC).

Historically, firms have built, maintained, and managed their own security and client master databases in isolation from other market participants. As these organizations expanded, organically or through acquisition, data silos matching each line of business emerged. Most of these data platforms are similar in style and content within and across firms. Typically, they are maintained through a combination of automated data feeds from external vendors, internal applications, and manual entries and adjustments. It is not uncommon for these platforms to rely on aging infrastructure and disparate, highly decentralized data stores.

Some of the common challenges financial institutions face in reference data management are:

  • Exponential increase of asset classes, new securities, and volume
  • Duplicate data vendor purchase, expensive manual data cleansing, and poor data management, leading to high aggregate costs
  • Management of multiple securities masters, repositories, and different sources of asset classes across different geographical markets
  • Prevalence of different identifiers like Committee on Uniform Securities Identification Procedures (CUSIP), International Securities Identification Number (ISIN), Stock Exchange Daily Official List (SEDOL), and internal identifier used by front- and mid-offices
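The identifier problem in the last bullet is concrete: the same security carries a CUSIP, an ISIN, a SEDOL, and often an internal code, and each must be validated before cross-referencing. As one example, an ISIN's embedded check digit can be verified with the well-known Luhn algorithm applied to its base-36 digit expansion:

```python
def isin_is_valid(isin: str) -> bool:
    """Validate a 12-character ISIN via its Luhn check digit."""
    isin = isin.upper()
    if len(isin) != 12 or not isin[:2].isalpha() or not isin[-1].isdigit():
        return False
    # Expand letters to two-digit numbers (A=10 ... Z=35); digits pass through.
    expanded = "".join(str(int(c, 36)) for c in isin)
    # Luhn checksum: double every second digit from the right, sum digit-wise.
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

CUSIP and SEDOL use different (but similarly mechanical) check-digit schemes, so an RDM platform typically validates each identifier type independently before building its cross-reference table.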

Get a Grip on Reference Data Management

Coforge deploys new methodologies, proprietary software, and tools from industry-leading software vendors to tackle reference data management challenges. There are many third-party product providers who focus on specific elements in the chain of reference data management without having a holistic view of the complexities surrounding the entire lifecycle of reference data. Our RDM process focuses on these complexities and is divided into four critical stages:

Data Acquisition

  • Data is acquired via robust market-facing interfaces such as Bloomberg, Reuters, and JJ Kenney
  • Acquired data is continuously updated and monitored, which is critical to successful data acquisition

Data Validation and Mapping

  • Validation and mapping are automated via rule engines; exception management support is provided where manual data mapping is required

Data Enrichment and Transformation

  • Reference data is enriched and standardized
  • A golden copy of data is created for instrument pricing

Data Distribution

  • Golden data is distributed to external third-party systems
  • An audit trail and action tracking are maintained, which is extremely important at this stage
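The validation-and-mapping stage above is typically rule-driven, with failing records routed to an exception queue for manual review. A minimal sketch, assuming illustrative rule names and record fields:

```python
# Hypothetical rules; rule names, fields, and the currency set are illustrative.
RULES = [
    ("missing_isin",   lambda r: bool(r.get("isin"))),
    ("positive_price", lambda r: r.get("price", 0) > 0),
    ("known_currency", lambda r: r.get("currency") in {"USD", "EUR", "GBP", "JPY"}),
]

def validate(records):
    """Split records into a clean set and an exception queue for manual review."""
    clean, exceptions = [], []
    for rec in records:
        failures = [name for name, check in RULES if not check(rec)]
        if failures:
            exceptions.append({"record": rec, "failed_rules": failures})
        else:
            clean.append(rec)
    return clean, exceptions
```

Keeping the failed-rule names alongside each exception record is what makes the downstream audit trail and action tracking possible.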

The Road to Effective RDM: One Step at a Time

Based on the fundamental components of the data lifecycle, we have developed a comprehensive solution for end-to-end reference data management. Our reference data management solution enables firms to manage the entire reference data environment from vendor data rationalization to enterprise reference data architecture design and integration, and from indexing to automated data cleansing and distribution.

Our reference data management offering includes the following elements:

  • Reference Data Rationalization: This process workflow creates a cross-reference of each data element and rationalizes reference data spend by identifying duplicate purchases.
  • Enterprise Data Architecture Assessment and Package Implementations: Evaluates current architecture, aligns it with future growth plans, and identifies constraints for the enterprise reference data architecture.
  • Index and Normalize Securities Data: Uses a set of industry-standard tools to create a consistent and single enterprise-wide key matrix for all securities.
  • Automated Data Cleansing System: Supports rule-based commercial reference data cleansing systems to process reference data.
  • Data Validation and Mapping: Automates data mapping and validation based on a rules engine and prevents automatic overrides.
  • Corporate Actions Processing: Helps maintain security reference data by automatically applying corporate actions with manual support for complex electives.
  • New Securities Setup: Enables continuous monitoring of security masters and sets up new securities on demand.
  • Enterprise Reference Data Distribution: Enables BOCADE (buy once, clean and distribute everywhere) reference data distribution across the enterprise and builds audit capability for price requests.
  • Instrument Pricing: Provides timely and accurate instrument pricing data to bankers and financial advisors.
  • Reference Data Efficiency Dashboard: Makes RDMS black box transparent by monitoring reference data consumption, quality, and cleansing status.
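The rationalization element above, identifying duplicate vendor purchases, can be sketched as a simple grouping exercise. The subscription tuple shape and the idea of counting everything beyond the cheapest feed as avoidable spend are assumptions for illustration:

```python
from collections import defaultdict

def duplicate_purchases(subscriptions):
    """Group vendor subscriptions by data item to flag duplicate spend.

    `subscriptions` is a list of (vendor, data_item, annual_cost) tuples;
    the shape is illustrative, not any vendor's billing format.
    """
    by_item = defaultdict(list)
    for vendor, item, cost in subscriptions:
        by_item[item].append((vendor, cost))
    # A duplicate is any item sourced from more than one vendor; the
    # avoidable spend is everything beyond the cheapest subscription.
    report = {}
    for item, vendors in by_item.items():
        if len(vendors) > 1:
            costs = sorted(c for _, c in vendors)
            report[item] = {
                "vendors": [v for v, _ in vendors],
                "avoidable_spend": sum(costs[1:]),
            }
    return report
```

In practice the comparison runs over normalized data-item descriptions rather than exact keys, since vendors label the same content differently.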

The Coforge Thought Board: The Holy Grail of Data Singularity

Financial services organizations deal with numerous financial instruments, ranging from stocks and funds to derivatives, to meet the ever-increasing demands of the global securities marketplace. They need to tackle a huge amount of data to trade and keep track of these instruments.

Coforge’s Reference Data Management solution helps financial services institutions rationalize the process of reference data consumption. It is designed to consolidate, cleanse, govern, and distribute these key business data objects across the enterprise. It includes pre-defined, extensible data models and access methods, along with powerful applications to centrally manage the quality and lifecycle of business data. The solution is augmented by our implementation know-how in developing and applying data management best practices, backed by proven industry knowledge. These strengths have led to a large data ecosystem with numerous specialist partners.

We deliver a single, well-defined, accurate, relevant, complete, and consistent view of data across multiple regions, departments, and systems. Companies that have implemented our solution are successfully achieving the elusive goal of a consolidated version of data across the enterprise.

More Value at the Heart of Partnerships

As a partner to financial services institutions, Coforge brings new ideas and more value to every engagement. Our strengths include:

  • Strong Industry Focus: Coforge has several thousand person-years of experience in designing, building, and maintaining large-scale applications for day-to-day business, with considerable experience in front-, mid-, and back-office operations. In the overall satisfaction ratings of the Datamonitor Black Book of Outsourcing 2010 survey, Coforge ranked number one in Data Management Services. Our team has expertise in tools such as Charles River, Calypso, Advent Moxy, Linedata Longview, MacGregor ITG, Eze Castle, Omgeo, Bloomberg, Reuters, and Yodlee account data gathering.
  • Technology Depth: Our offerings span business and technology consulting, application development and management services, IT infrastructure services, and business process outsourcing. Our services are underpinned by a strong value-optimizing framework with a cost-effective delivery model, which can be used in single-shore, dual-shore, or multi-shore formats.
  • Mature, Best-in-class Process Framework: Coforge is ISO 27001, CMMI Level 5, and PCMM Level 5 accredited. Our resources are, therefore, well-versed in operating in a highly mature, process-oriented, and secure environment, and they bring this expertise to all client engagements.
  • Ability to Scale: With a large resource base of over 5,000 analysts and consultants, Coforge can quickly source professionals with the skill sets required for a project and ensure rapid ramp-up of project resources as needed.