Realizing the Enormous Untapped Potential of Your Data

Tim Versteeg discusses how firms have yet to leverage the potential of all their data.

From our vantage point, the industry is awash with data. Broadridge processes more than US$5 trillion in fixed income and equity trades per day, serves 6 of the top 10 global banks and every mutual fund company, and provides investment servicing for 70% of the North American market. That’s a lot of data. We devote considerable time and effort to data management, and we maintain an ongoing dialogue with our clients, and with the capital markets industry as a whole, about managing, optimizing, and capitalizing on their data. What’s surprising, though, is that firms have yet to leverage the full potential of all this data.

According to new CEB research, almost 70% of capital markets firms either don’t have a data management strategy or haven’t invested in one. That’s a staggering and unacceptable figure for an industry so dependent on data quality. It’s not that executives don’t understand the importance of data; most do, but they can’t agree on what their strategy should be. Sixty-three percent of surveyed firms have only partial alignment between IT and business lines on data management strategy. As CEB’s research points out, this misalignment has a documented 20% negative impact on the insights that investment staff can draw from data. It also causes firms to react to data instead of planning for it.

Leading firms recognize the pitfalls of a reactive approach to data management and are building internal consensus right now. To move strategy beyond the level of individual items or technologies, CEB recommends grounding strategy efforts in four target states for data management:

“One Data” Approach for Market and Reference Data. Firms should bridge the divide between market and reference data, which have historically been managed separately. This dividing line is antiquated, and closing it will alleviate the risks and costs associated with both data types. Seven in ten firms say this convergence is already happening at the level of business usage, operations, and procurement.

Best-in-Class Data Usability and Context. Firms should provide better visualization and data context to staff. To make sense of an increasingly noisy data ecosystem, staff need tools that help them explore, analyze, and make predictions from data. Highly effective visualizations can have more than 2.7 times the impact of analytics on employee insight.

Fast Time-to-Value of New and Unstructured Data Types. Firms should speed the acquisition, onboarding, and mapping of new data feeds into core applications. In the past, this process has taken upwards of 90 days. If that can be reduced to five days or fewer, firms can capitalize on a much broader data ecosystem.

Centralized Analytical, Quality, and Supporting Data Management Functions. For data activities such as analytics, staff training, data quality control, and user support, centralization means greater staff insight. Firms that centralize analytics teams achieve 13% higher employee insight than those that don’t. Centralization also means efficiency, because it reduces duplication of effort across product and asset-class lines.

A common thread across these target states is the need for a cohesive, firm-wide approach. Over time, many product lines have sprouted cottage-industry functions to manage data and analytics. This patchwork is not only inefficient; it also produces inconsistent, low-quality data. The same holds true at the industry level: managed services for data and processes can mutualize costs across firms, yield more consistent data, and improve the scalability of the data process model.