Complexity is a real enemy for any organization. It creates risks, saps efficiency, introduces redundancy and duplication, and ultimately squanders precious resources. Across the financial services industry, firms are battling complexity by simplifying technology and streamlining operations, and as they work to renovate existing internal systems, these firms are bumping up against the limitations of outdated applications. For that reason, technology modernization has become a strategic priority for virtually all financial service firms.
However, technology modernization poses challenges and risks of its own. Technology upgrades have a reputation for being expensive, time-consuming and potentially disruptive to the business if they go wrong. Over the past 20 years, many financial services firms have tried to modernize their technology infrastructure through large transformational projects, and it’s not uncommon to see major banks announce significant programs to upgrade back-office applications or other core infrastructure, or talk about IT budgets measured in the billions of dollars. That’s execution risk at scale.
Recent innovations may be making those large-scale projects a thing of the past. Using a combination of new technologies and strategies, financial services firms can now take an incremental approach to modernization. This evolutionary approach lowers risk by allowing firms to modernize one function or application at a time, without impacting or jeopardizing other systems. It is based on four key concepts: componentization, data integrity, interoperability and scalability. By focusing on these four areas, firms are building a solid but flexible technology foundation that streamlines and simplifies existing systems and makes future modernization easier, cheaper and less risky.
In a previous article in this series, we traced the development of the ecosystems that financial services firms run today. In particular, we highlighted how the ad hoc creation of these ecosystems, through a series of business expansions and acquisitions, resulted in extremely high levels of fragmentation. In a somewhat ironic twist, these disjointed ecosystems consist of applications that have been stitched together so tightly, and so inflexibly, that they now form something close to super-monoliths. As firms look to simplify, they find it can be incredibly difficult to change one application without affecting the rest. For that reason, one of the first and most important priorities in technology modernization is componentization. Under this approach, firms build an architecture out of a series of discrete components, each of which can be deployed and upgraded without impacting the others. This is a fundamental shift that allows for efficiency and extensibility while helping ensure ecosystems do not fall back into obsolescence.
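To make the idea concrete, here is a minimal sketch in Python, using hypothetical component names such as `SettlementEngine` (not drawn from any particular firm's architecture), of how a discrete component can sit behind a stable interface so that its implementation can be swapped without touching the rest of the ecosystem:

```python
from typing import Protocol


class SettlementEngine(Protocol):
    """Stable interface: callers depend on this contract, never on an implementation."""

    def settle(self, trade_id: str, amount: float) -> str: ...


class LegacySettlementEngine:
    """Existing component, e.g. a wrapper around a mainframe batch process."""

    def settle(self, trade_id: str, amount: float) -> str:
        return f"legacy-settled:{trade_id}"


class ModernSettlementEngine:
    """Replacement component, e.g. a cloud-native service."""

    def settle(self, trade_id: str, amount: float) -> str:
        return f"cloud-settled:{trade_id}"


def process_trade(engine: SettlementEngine, trade_id: str, amount: float) -> str:
    # The rest of the ecosystem calls only the interface, so the engine
    # can be upgraded or replaced without impacting other components.
    return engine.settle(trade_id, amount)


print(process_trade(LegacySettlementEngine(), "T-1001", 5_000.0))
print(process_trade(ModernSettlementEngine(), "T-1001", 5_000.0))
```

Because both engines satisfy the same interface, migrating from one to the other is a local change rather than an ecosystem-wide project.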
A reliable data architecture is a core requirement of any modernized infrastructure. High-quality data is already essential to financial services operations, and it will become exponentially more valuable as data-driven artificial intelligence and machine learning solutions advance and proliferate. In legacy architectures, data is widely distributed, often existing in multiple locations and models across the organization. This dispersion makes it difficult to ensure that data is consistent, accessible and reliable, and almost always requires harmonization and rationalization steps before the data becomes usable at an enterprise level. For that reason, one early goal of most technology modernization efforts is to build a framework that provides for data integrity, including the adoption of enterprise-wide data models, common data warehousing approaches, and consistent transport and messaging mechanisms. In general, modern ecosystems feature a single point at which any piece of data is entered, housed or calculated. Data is fed from this single golden source into the various applications that make up the ecosystem. Ensuring the quality, consistency and reliability of data reduces the time and resources spent on data reconciliation, facilitates the interoperability of solutions, and provides a solid platform for all other modernization and simplification initiatives.
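The golden-source pattern can be sketched in a few lines. The example below is purely illustrative (the `GoldenSource` class and the risk/reporting consumers are hypothetical): each data element is entered once, and every consuming application is fed from that single point, so downstream views agree by construction and no reconciliation step is needed:

```python
class GoldenSource:
    """Hypothetical sketch: the single point at which data is entered and housed."""

    def __init__(self):
        self._records = {}
        self._subscribers = []

    def subscribe(self, callback):
        """Register a consuming application to be fed every update."""
        self._subscribers.append(callback)

    def upsert(self, key, value):
        # Data is entered exactly once, here, then distributed to every consumer.
        self._records[key] = value
        for notify in self._subscribers:
            notify(key, value)

    def get(self, key):
        return self._records[key]


# Two consuming applications (e.g. risk and regulatory reporting) receive
# the same record from the same source, so their views cannot diverge.
risk_view, reporting_view = {}, {}
source = GoldenSource()
source.subscribe(lambda k, v: risk_view.update({k: v}))
source.subscribe(lambda k, v: reporting_view.update({k: v}))

source.upsert("ISIN:US0378331005", {"price": 189.25})
assert risk_view == reporting_view  # consistent by construction
```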
There’s a reason financial services technology infrastructure is so complex: these are large, diverse organizations with multiple, complicated businesses. If the infrastructure that supports these organizations is to be simplified, the streamlined systems that run these businesses must be able to interact seamlessly. For that reason, systems consistency and interoperability must be guiding principles for any simplification effort. At the same time, that interoperability must preserve the independence of the various applications, so that true componentization is maintained.
From a systems perspective, consistency begins with data integrity and then extends to messaging. Modern API frameworks make it possible for systems to interact in a near frictionless manner, providing tremendous flexibility to both the IT professionals who design systems and the financial professionals who use them.
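One common way to achieve messaging consistency, sketched below with a hypothetical `Envelope` structure, is to wrap every payload in a standard envelope that all internal APIs share, so any two systems can exchange messages without bespoke adapters:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class Envelope:
    """Hypothetical common message envelope shared by every internal API."""

    source: str       # originating system
    event_type: str   # e.g. "TradeBooked"
    timestamp: str    # ISO-8601, UTC
    payload: dict     # business content


def publish(source: str, event_type: str, payload: dict) -> str:
    """Serialize a message in the common format."""
    env = Envelope(
        source=source,
        event_type=event_type,
        timestamp=datetime.now(timezone.utc).isoformat(),
        payload=payload,
    )
    return json.dumps(asdict(env))


def consume(message: str) -> Envelope:
    """Any consuming system parses the same envelope, regardless of sender."""
    return Envelope(**json.loads(message))


msg = publish("trade-capture", "TradeBooked", {"trade_id": "T-1001"})
received = consume(msg)
print(received.event_type, received.payload)
```

Because producer and consumer agree on the envelope rather than on each other's internals, new applications can join the ecosystem without pairwise integration work.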
API consistency isn’t just important on the back end. Systems designers know that consistency in the look, feel and functionality of user interfaces increases adoption and allows users to move easily from one application to the next, which in turn makes user communities more interchangeable across the organization.
Many legacy applications, especially those built for post-trade processing, were not designed to process all transactions in real time. This was a deliberate decision to take advantage of the batch compute capabilities of mainframe infrastructures. However, firms increasingly need access to important data elements in real time, and as the industry shifts toward same-day settlement for more asset classes, real-time transaction processing becomes a requirement. The applications that power financial services firms, and the infrastructure on which those applications run, will therefore require flexible scalability.
As firms modernize their applications, upgrades to existing mainframe-based infrastructures become increasingly problematic. Firms are instead shifting to scalable, distributed cloud-based environments that employ on-demand compute and readily support continuous integration / continuous delivery by making capacity easily accessible when required. Not only can this approach keep pace with the demands of real-time processing, it also creates invaluable flexibility, allowing firms to pay for only the compute power they need, when they need it.
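The pay-for-what-you-use principle reduces, at its core, to a simple scaling rule. The sketch below is a hypothetical illustration (the function and its parameters are not from any specific cloud platform): provision only the workers the current load requires, within a minimum floor and a maximum capacity cap:

```python
def workers_needed(pending_messages: int, per_worker_throughput: int,
                   min_workers: int = 1, max_workers: int = 50) -> int:
    """Hypothetical autoscaling rule: match compute to real-time load,
    bounded by a floor (availability) and a cap (cost control)."""
    needed = -(-pending_messages // per_worker_throughput)  # ceiling division
    return max(min_workers, min(max_workers, needed))


print(workers_needed(0, 100))        # quiet period: pay for the minimum only
print(workers_needed(950, 100))      # intraday burst: scale out on demand
print(workers_needed(100_000, 100))  # extreme spike: capped at the maximum
```

In a batch-oriented mainframe model, capacity for the extreme spike would have to be owned permanently; in the on-demand model, it is rented only for the minutes it is needed.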
Progress across these four areas will help firms simplify technology and operations by reducing the fragmentation of legacy ecosystems and unlocking efficiencies throughout the organization. Just as importantly, advancements in componentization, data integrity, interoperability and scalability achieved through cloud migration will facilitate increased, and less risky, modernization now and in the future.
And as modernization eradicates complexity, it will ultimately free up the precious resources being squandered today, allowing firms to focus them on the next wave of innovation in financial services, further driving digitization and automation.