
White Paper

The Service Value Chain

This white paper explores how incremental data improvements can support vast reductions in organizational costs.


Every financial institution relies on good data, but not every institution is aware of the total effort necessary to procure it. This white paper explores how incremental data improvements can support vast reductions in organizational costs, and shows that by engaging a proven, best-in-class data management service, firms can differentiate their business as nimble financial institutions that quickly adapt to current and future challenges. An effective strategy to manage reference data will not only create cost savings and consistency in the creation and maintenance of the data; the ripple effects across the organization can be enormous.


INTRODUCTION

Changes in regulatory requirements, exponentially increasing data volumes and complexity, and evolving client sophistication are forcing financial institutions to rethink business strategies and processes. For many institutions, the struggle is not about innovation but about adapting to exogenous forces that continue to squeeze margins. Greater efficiency and improved compliance are ongoing requirements. Growth into new investment strategies, and the technology necessary to manage those new products, has created an internal quagmire.

There is one process that can help institutions address all of this: evolving reference data management.

Data management has always been a foundational process of financial services, but it is also imperative to regulatory compliance, client demands and growth into new markets. Because of this, the means of acquiring, cleansing, normalizing, managing, governing and distributing reference data has compounding effects on downstream costs. An effective strategy to manage reference data will not only create cost savings and consistency in the creation and maintenance of the data; the ripple effects across the organization can be enormous.

“Chartis Research estimates that 60-70% of project costs go to data readiness, before analytics can even begin.”

THE RECENT EVOLUTION OF DATA MANAGEMENT

In virtually every financial institution, the reference data management architecture was built internally and piecemeal throughout the enterprise. Created primarily through departmental initiatives and agendas, the systems were built to meet numerous regulatory and client demands, with little thought to enterprise suitability. These disparate data platforms were often created with separate technologies and staffs, and perhaps even more alarmingly, with separate connectivity to internal trade processing and risk systems.

As firms developed more centralized or shared services, infrastructure wasn’t initially replaced; instead, fixes were bolted on to allow commonality of end-data processing. From there, the next step was typically to initiate a common processing architecture across an institution’s different platforms and staffs. However, because none of the internal systems were initially built with the needs of the enterprise in mind, this proved a difficult integration for many banks. The “last mile” work, distributing aggregated data to all systems and consuming applications throughout the enterprise, has perhaps been the largest obstacle to re-engineering cumbersome legacy platforms and processes. Very few firms have been able to centralize the acquisition and cleansing of data in a common enterprise repository (a “Golden Copy”), adding to the complexity and burden of business operations and regulatory requirements.

“CEB TowerGroup finds that almost 70% of capital markets firms either do not have a data management strategy or have not invested in it.”

INDUSTRY CHALLENGES

These piecemeal, incongruous legacy systems have not only impeded efficiency and data quality within an institution; they also burden a firm’s ability to comply with regulators and generate revenue in light of many new industry challenges. These challenges manifest themselves in many areas, but are acutely felt in five core themes:

Enterprise-Wide View: The ability to view full asset coverage across an enterprise is becoming a competitive necessity. Additionally, regulatory stress tests require a common set of information and a common process to produce those reports, a capability that did not exist previously.

Data Governance and Lineage: Upcoming data governance and data lineage requirements are producing obstacles for banks’ legacy infrastructures. Going forward, regulators are requiring that institutions know the source of their data and how it has been modified, a requirement that did not exist previously. Banks will need to recreate past states of reference data, yet virtually none of the infrastructures in use today were built to accommodate this. A platform’s ability to store both past and present states of reference data is therefore paramount to meeting regulatory requirements going forward (a brief sketch of this idea appears after these five themes).

Risk Management: Understandably, increased attention on risk management has put much higher demands on the data infrastructure. The largest challenge in risk implementation, and in the accuracy of risk reporting, is a function of the underlying reference data and pricing used to drive it. According to Chartis Research, a typical Tier-1 bank has between 2,000 and 3,000 separate risk models in use throughout the institution; just getting an inventory of those models and their data is a huge undertaking.

Data Quality: Achieving high data quality is an enormous challenge. The Data Warehousing Institute concluded that the cost of data quality problems exceeds $600 billion annually. Institutions are challenged to come up with a common and objective methodology that allows them to benchmark and improve. The data that financial services firms consume arrives continuously from disparate sources and in different formats, and some of the most challenging information to understand and govern comes from internal sources. Firms must first apply tools to “normalize” the data so that standard metrics can be applied and meaningful results can be used to improve the rules and exception management processes (a second sketch after these themes illustrates this normalize-and-measure step).

Data Expense: The expense of data has become a large cost component for most businesses. Therefore, banks are looking to engage third-party experts to help them understand and analyze the costs associated with their data supply chain. This effort is leading firms to the conclusion that better data provides a huge cost advantage over their peers.
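On the data governance and lineage theme above, the requirement amounts to bitemporal storage: for any business date, a firm must be able to reconstruct both what the data was and when it learned it. The following is a minimal sketch of that idea in Python; the record fields and the in-memory history are hypothetical illustrations, not a description of any particular platform.

```python
# Minimal sketch of bitemporal reference data, under assumed field names.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class AttributeVersion:
    security_id: str
    field: str             # e.g. "issuer_rating"
    value: str
    valid_from: datetime   # when the fact became true in the market
    recorded_at: datetime  # when the firm captured it (lineage)
    source: str            # originating vendor or internal system

def as_of(history: List[AttributeVersion], security_id: str, field: str,
          business_date: datetime, knowledge_date: datetime) -> Optional[AttributeVersion]:
    """Return the value believed true on business_date, using only
    what had been recorded by knowledge_date."""
    candidates = [v for v in history
                  if v.security_id == security_id and v.field == field
                  and v.valid_from <= business_date
                  and v.recorded_at <= knowledge_date]
    # The latest applicable version wins; ties are broken by recording time.
    return max(candidates, key=lambda v: (v.valid_from, v.recorded_at), default=None)
```

Because both timelines are kept, the same query can answer a regulator’s question (“what did you believe on that date, and when did you learn it?”) without rebuilding history by hand.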
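On the data quality theme, the normalize-and-measure step might look like the sketch below. The vendor field names, the common schema and the two rules are hypothetical examples, chosen only to show how a standard metric is produced and how exceptions feed the exception management process.

```python
# Assumed example: normalization onto a common schema, then standard quality rules.
from typing import Dict, List, Tuple

def normalize_record(raw: Dict[str, str]) -> Dict[str, str]:
    """Map vendor-specific field names onto one common internal schema."""
    return {
        "isin": (raw.get("ISIN") or raw.get("isin_code") or "").strip().upper(),
        "currency": (raw.get("CCY") or raw.get("currency") or "").strip().upper(),
    }

def run_quality_rules(records: List[Dict[str, str]]) -> Tuple[float, List[Dict[str, object]]]:
    """Apply the same rules to every normalized record; return a quality
    score and the exceptions routed to exception management."""
    exceptions = []
    for rec in records:
        if len(rec["isin"]) != 12:
            exceptions.append({"record": rec, "rule": "isin_length"})
        elif rec["currency"] not in {"USD", "EUR", "GBP", "JPY"}:
            exceptions.append({"record": rec, "rule": "unknown_currency"})
    score = 1.0 - len(exceptions) / max(len(records), 1)
    return score, exceptions
```

Only after records share one schema can the same rules, and therefore the same benchmark, be applied to every source, internal or external.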


THE ECONOMICS OF REFERENCE DATA MANAGEMENT

Cost is a huge aspect of reference data management that can no longer be ignored. Few institutions truly understand the cost of their reference data infrastructure; most see their systems as working, albeit not always together, and that has been enough. However, the expense of running tactical systems and operational services in separate business units, geographies, asset classes and functional areas is proving that assumption very costly.

THE COST OF BAD DATA

Bad data, and poor management of that data, is a competitive disadvantage and a regulatory risk, and one that must be addressed enterprise-wide. An institution’s systems and infrastructure are generally not designed to illuminate the problems in day-to-day business operations and tie them back to the root cause for data reconciliation. Oftentimes the parties responsible for resolving data inconsistencies don’t realize that the root problem can be three or four systems away, in the reference data repository. In fact, it is estimated that poor data quality can cost companies 15% to 25% (or more) of their operating budget1. Although not all bad data can be eliminated, incremental improvements in data quality have large impacts on staffing levels and operations.

THE COST OF INCONGRUITY

The industry consensus that internal acquisition and cleansing of data is not necessarily a differentiating advantage is challenged by the escalating costs of regulatory compliance, the time needed to support new products and business, margin compression, and quality improvement. Efficient and effective data management should be thought of not as a competitive differentiator, but as a profitability differentiator. The example below demonstrates how many different pockets of data management may exist in an institution and how many different systems reference data touches.

Compartmentalized technology is being used by hundreds of people globally in duplicative functions and redundant data operations. Broadridge estimates that banks employing segmented management of reference data may be spending $75 to $125 million annually just on the acquisition, cleansing, normalization and distribution of reference data for their enterprise. Other industry experts have estimated that the effort in price management is even higher. The economics of acquiring, cleansing, normalizing and distributing reference data therefore lend themselves to a mutualized function across multiple banks. Depending on the maturity of a bank’s infrastructure, it can be an intimidating conversion, but one that may offer estimated savings of up to $50 million annually.

EXAMPLE: POOR DATA MANAGEMENT

Firm X has a centralized data management product for its investment bank and runs a separate system for its derivatives business. The asset management and trust and custody departments each have their own infrastructures. Each group runs its own trade support, general operations and dedicated data management functions. And, because each department stores data pertinent to its particular business, every data-dependent department has its own security master system. Additionally, the operations team is split between the groups and systems. Vendor feeds arrive in multiple places, and only about half of the downstream systems are actually connected to the central data store.

EVALUATING THE DATA OPERATING MODEL

Obtaining the highest quality reference data is a key competitive advantage, but without a holistic solution to manage that data, the cost of maintaining it is sky-high. Institutions are realizing that outsourcing internal data operations frees up resources and reduces costs, allowing them to truly differentiate through product and client-centric services. Institutions are beginning to understand that by employing not just technology to run reference data but also a managed service, they can gain a competitive advantage in the industry.

But a wholesale move from an internally controlled to an externally managed reference data solution is daunting for many institutions. Some of the apprehensions stalling the engagement of a managed service are:

  1. Institutions fear making the wrong choice in a managed service provider.
  2. The implementation, integration and conversion may appear too complex, costly or overwhelming.
  3. The potential savings from employing a managed service are high (although empirical supporting data is scarce); however, firms are distrustful of the estimates and have difficulty validating them internally.
  4. Corporate culture may be disrupted by moving to a managed service, and the trust-building and responsibility shift may be hard for some institutions to accommodate.
  5. Most managed service solutions only offer pieces or components rather than a complete or an integrated solution, which may spawn different but equally frustrating issues.
  6. The specter of previous large and expensive failures both in setting up service bureaus and in implementing reference data solutions weighs heavily.
“For those banks that produce reference data internally without managed support, very few internal operations or technology departments provide their user communities with real service level agreements, and virtually none have any penalties attached to them. From a service level perspective, this is a competitive disadvantage.”

NEW STANDARDS: EVOLVING THE SERVICE VALUE CHAIN

These reservations are real, but neither clients nor regulators are willing to wait for these hesitations to be resolved. Trends have emerged that point to a growing need for a managed reference data solution, and they will make that transition easier for many institutions:

The Emergence of Chief Data Officers (CDOs): In the past, data was handled by either the operations department that ran the process or the technology staff that owned the infrastructure. Oftentimes these departments had difficulty overcoming the fear of change and letting go of the infrastructure they had built and for which they were responsible. However, with the emergence of CDOs, new resources are looking at infrastructure and processes with an open mind and from a business perspective, devoid of the protective motivations of a platform creator or owner. Options for managing data have increased, and many banks have begun conversations about innovative approaches to enterprise-wide efficiency.

The Cost of Data: Banks are becoming aware of the enterprise-wide cost of internally acquiring, cleansing and distributing data. They are also recognizing the relevance of data quality: even incremental increases in data quality can significantly reduce an enterprise’s operational costs. Higher quality data also allows banks to better serve clients and provide regulatory consistency.

Complexity of Data: The complexity of products and global markets, as well as the need to link products, has stressed institutions’ legacy infrastructure. Reference data platforms must be able to interface successfully with upstream applications. Failure can arise at any interface point, and testing must be constant to reduce the occurrence of errors.

Client Sophistication: Client customization demands have grown so substantially that part of a client’s selection of a bank, administrator, prime broker or custodian is influenced by how well that institution can provide targeted, individualized data in real time. Clients may want their own asset classification, assets placed in specific liquidity buckets, or their risk, performance, portfolio, and profit and loss (P&L) reports all grouped and sorted the same way based on their custom requirements. Paramount is the ability of an institution’s reference data and pricing infrastructure to provide client-specific information, categorization and a higher level of service (a sketch of this idea follows this list). Providing this capability is obviously a complex initiative, and currently very few banks have the internal infrastructure to offer this level of service to their clients.

Mutualization: Mutualization of costs and expenses has become perhaps one of the most relevant conversations about cost savings and efficiency within the industry, and it is discussed in more detail below. However, mutualizing costs may create challenges, one of which is building consensus across multiple banks, with infrastructures at different stages of maturity, to agree on priorities and vision.
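As an illustration of the client sophistication point above, client-specific classification and liquidity bucketing can be thought of as per-client mappings applied on top of a shared golden copy. The sketch below assumes an illustrative in-memory golden copy and one hypothetical client rule set; none of the identifiers or categories come from a real platform.

```python
# Hypothetical golden-copy records and one client's custom rules.
GOLDEN_COPY = [
    {"security_id": "SEC-1", "internal_class": "corporate_bond", "days_to_liquidate": 2},
    {"security_id": "SEC-2", "internal_class": "equity", "days_to_liquidate": 1},
]

CLIENT_RULES = {
    "client_a": {
        "class_map": {"corporate_bond": "Credit", "equity": "Listed Equity"},
        "liquidity_buckets": [(1, "T+1"), (5, "T+5"), (float("inf"), "Illiquid")],
    },
}

def client_view(client: str) -> list:
    """Re-label and re-bucket golden-copy records using the client's own scheme."""
    rules = CLIENT_RULES[client]
    view = []
    for rec in GOLDEN_COPY:
        bucket = next(label for limit, label in rules["liquidity_buckets"]
                      if rec["days_to_liquidate"] <= limit)
        view.append({
            "security_id": rec["security_id"],
            "client_class": rules["class_map"].get(rec["internal_class"], "Other"),
            "liquidity_bucket": bucket,
        })
    return view

print(client_view("client_a"))  # same golden copy, this client's labels and buckets
```

The point of the design is that the golden copy stays single and consistent; only the presentation layer changes per client.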

FROM CONSIDERATION TO ACTION: SELECTING A MANAGED SERVICE PROVIDER

Navigating the reference data managed service market can be daunting. The quality of the reference data provided is not the only factor in evaluating a third-party provider’s offering. Considerations based on the maturity level of a firm, its regulatory complexity, its transition appetite, and how and by whom its data will be used will help ascertain the best fit for an institution. Emerging industry trends may play into an institution’s deliberation process as well; among these are:

SERVICE LEVEL AGREEMENTS (SLAs)

When outsourcing internal processes or functionality, it is vital to have strong SLAs in place, with meaningful penalties for failure to perform, as well as a mature governance model. A compelling attribute of some managed services is the ability to conduct business under very granular SLAs; as the managed service business has matured, a few best-in-class providers have begun to offer customized SLAs.

MULTIPLE, DIFFERENTIATED SLAS

A single SLA may not be enough when engaging with a managed service. While many service providers guarantee service levels to an institution, an emerging trend is the ability of a managed service to provide distinct, individualized SLAs for different constituents and consumers within a bank. For example, where the trading department may require five-minute commitments because of trade processing requirements, an hour or two may be adequate for a department that is only looking for end-of-day reporting. The ability to set one service level for a trading desk and a separate SLA for an asset management department will be a differentiator for the institution as well as an indicator of the maturity and quality of the managed service provider.
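To make the idea concrete, differentiated SLAs can be expressed as per-consumer commitments that each delivery is measured against. The sketch below is a simplified, assumed configuration; the departments, thresholds and breach handling are illustrative only.

```python
from datetime import timedelta

# Hypothetical per-consumer SLA commitments within one institution.
CONSUMER_SLAS = {
    "trading_desk":     {"max_latency": timedelta(minutes=5), "coverage": "intraday"},
    "asset_management": {"max_latency": timedelta(hours=2),   "coverage": "end-of-day"},
}

def check_delivery(consumer: str, request_to_delivery: timedelta) -> bool:
    """Return True if this data delivery met the consumer's own SLA."""
    sla = CONSUMER_SLAS[consumer]
    met = request_to_delivery <= sla["max_latency"]
    if not met:
        # In practice a breach would feed the governance model and any agreed penalties.
        print(f"SLA breach for {consumer}: {request_to_delivery} exceeds {sla['max_latency']}")
    return met

check_delivery("trading_desk", timedelta(minutes=7))      # breach of the 5-minute commitment
check_delivery("asset_management", timedelta(minutes=7))  # well within the 2-hour commitment
```

The same delivery time can be a breach for one consumer and comfortably within tolerance for another, which is exactly what differentiated SLAs are meant to capture.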


FLEXIBLE SERVICE MODEL

Another important consideration is flexibility in the managed service model, which means that the managed service provider must be able to support multiple customers with multiple service needs. Again, a high quality managed service provider recognizes that support must be provided essentially on a function-by-function basis. Some institutions may require that jobs be performed by a managed service team working across departments within an institution. Others may contractually require a dedicated group working only on their processes. Still others may need very flexible and customizable models where work can be shared between the managed service provider and the institution’s own staff. Regardless of the need, the managed service provider must offer the customer a high level of transparency.

PEOPLE, TECHNOLOGY, HARDWARE

The managed service provider must be flexible and holistic in its offering, provide the tools and infrastructure to do as much or as little as a client requires, and share the workload with the bank’s staff as required. Tools to monitor and measure data quality need to be incorporated into every aspect of the managed service, as it is counter-productive for institutions to seek multiple providers to perform different functions separately. A best-in-class managed service should provide the technology, staff and tools so that a bank doesn’t need to be a system creator or integrator.

SCALABILITY

When investigating managed services, institutions should also look carefully at the breadth and flexibility of the offering from both a present and future perspective. Does the managed service offer a phased approach to implementation? Can an institution modify its relationship with the managed service as future needs arise? The agility of a managed service to adapt to an institution’s evolving business strategies should be a vital factor in determining the best fit of a reference data managed service. For example, if an institution wants to buy a hosted service now and upgrade to a managed service in the future, this option should be achievable. If an institution chooses to employ a partial managed service, it should be able to move to a fully managed service in the future, if so desired. If an institution wants to create hybrid models where a portion of the service is hosted and maintained and the rest is run locally in its internal infrastructure, this should be another acceptable option. Customizable, phased implementation and service options should be a large consideration when performing due diligence on managed service solutions.

CONTENT-PROVIDER NEUTRALITY

An additional consideration when choosing a managed service is to ensure the provider is content-provider neutral. Alliances and partnerships between content providers form and dissolve on a regular basis, so when an institution buys a managed service or reference data product, it is exposed to the risk that not every provider will integrate with its platform. The best way to future-proof a managed service solution and mitigate that risk is to ensure the product that is core to the infrastructure and service is content-provider neutral and agnostic, able to work with all providers in a non-threatening way.
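One common way to achieve content-provider neutrality is a thin adapter layer, so the core platform only ever sees a common schema and swapping vendors means swapping adapters. The sketch below is an assumed illustration; the vendor names and field layouts are invented and no real vendor API is implied.

```python
from abc import ABC, abstractmethod
from typing import Dict

class ContentProviderAdapter(ABC):
    """Every content provider is wrapped behind the same interface."""
    @abstractmethod
    def fetch(self, security_id: str) -> Dict[str, str]:
        """Return a record already mapped to the firm's common schema."""

class VendorAAdapter(ContentProviderAdapter):
    def fetch(self, security_id: str) -> Dict[str, str]:
        raw = {"IdIsin": security_id, "Crncy": "USD"}       # stand-in for a vendor call
        return {"isin": raw["IdIsin"], "currency": raw["Crncy"], "source": "vendor_a"}

class VendorBAdapter(ContentProviderAdapter):
    def fetch(self, security_id: str) -> Dict[str, str]:
        raw = {"isin_code": security_id, "ccy": "usd"}       # stand-in for a vendor call
        return {"isin": raw["isin_code"], "currency": raw["ccy"].upper(), "source": "vendor_b"}

def load(security_id: str, adapter: ContentProviderAdapter) -> Dict[str, str]:
    """Switching providers means switching adapters, not re-plumbing the platform."""
    return adapter.fetch(security_id)
```

Because downstream systems consume only the common schema, a change of content provider is contained within a single adapter rather than rippling through the enterprise.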

MUTUALIZATION

Again, mutualizing work across multiple banks provides economic incentives to all participants. Because many large banks have already found ways to leverage offshore resources to provide economies of scale on manual tasks, the benefit of mutualization is not labor arbitrage; rather, it is the ability to deliver best practices, to maintain primary and redundant subject matter expertise across all products and asset classes, and to provide highly automated infrastructure that is less dependent on labor and intervention. Mutualization must also not interfere with the managed service provider’s ability to deliver bespoke SLAs and configured golden or silver data copies. Service will remain the most important component of a supplier’s ability to support the ever-changing challenges at a financial services firm, and a single flavor of data will add little value to the supply chain.

CONCLUSION

As data floods institutions with greater complexity and higher volumes, as the need to use that data for vastly different and customized purposes grows, and as the requirement for greater transparency into the past and present states of reference data becomes reality, financial institutions are realizing that they are at a competitive disadvantage by acquiring, cleansing, normalizing and distributing data in-house. For true differentiation, financial institutions are exploring how engaging a proven reference data management platform and managed service may proactively address challenges inherent in the current and future financial landscape.

Every financial institution relies on good data. Not every institution is aware of the total effort necessary to procure it. Incremental data improvements can support vast reductions in organizational costs, and by engaging a proven, best-in-class data management service, firms can differentiate their business as nimble financial institutions that quickly adapt to current and future challenges. Firms that adopt best-in-class managed reference data services will find themselves with a lower cost structure, a scalable data service architecture, and the transparency to meet the next series of regulatory hurdles, whatever they may be.

