By Justin Llewellyn-Jones, head of capital markets for North America at Broadridge.
Complexity in data and data management is like a fault line running through the organizations and strategies of capital markets firms. It consumes resources, introduces dangerous risks, and limits firms’ ability to capitalize on artificial intelligence and other new technologies. For that reason, simplifying the process of acquiring, managing and employing data has become a top priority for firms around the world.
The IT infrastructure that runs most large financial services organizations can be sprawling and disjointed. Systems that power individual businesses and functions operate in silos. As a result, within one organization, a single piece of data can reside in multiple locations. Often, data housed in different systems exists in different formats. Complicating this situation is the fact that these various data iterations are not static. On the contrary, they are constantly changing as they move throughout the organization.
Throughout the course of the trade cycle, the same data point is used by many different functions. At each transition point across front-, middle- and back-office systems, data must be transferred. In many cases, it is also altered. At every step there is some chance of introducing an error. Even if everything goes correctly, the sheer number of transitions can make it difficult or even impossible to track when and where all the changes were made. Without reliable traceability, organizations cannot guarantee the accuracy of their data.
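To make the traceability problem concrete, consider a minimal sketch in which a data point carries its own lineage as it moves between systems. This is an illustration only, not an actual vendor implementation; the system names ("OMS", "risk-engine", "settlement") and field names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TracedValue:
    """A value that records every change made to it: by whom, when, from what."""
    value: float
    lineage: list = field(default_factory=list)

    def update(self, new_value: float, system: str) -> None:
        # Append an audit record before overwriting the value.
        self.lineage.append({
            "system": system,
            "old": self.value,
            "new": new_value,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.value = new_value

# Illustrative hops across the trade cycle:
price = TracedValue(101.25)
price.update(101.30, "OMS")          # front office amends the price
price.update(101.30, "risk-engine")  # middle office re-stamps it unchanged
price.update(101.28, "settlement")   # back office applies a correction
```

With lineage attached, the question "when and where did this number change?" has a definite answer at every transition point, which is precisely what legacy handoffs between siloed systems tend to lose.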
Due to this complexity, reconciling data consumes an inordinate amount of resources and time. To address this inefficiency and mitigate resulting risks, many capital markets firms have embarked on sweeping efforts to transform or simplify their data management infrastructure. All firms will be forced to address this issue soon, as the compression of the trade settlement cycle to T+1 and eventually to T+0 will reduce the window for data reconciliation and ultimately require real-time data management.
Real-time data management will also be a prerequisite for financial services firms competing in an age of automation, data analytics, cloud computing and artificial intelligence. These new technologies are the centerpieces of future financial services strategy. Already, they are unlocking previously unattainable efficiency enhancements while revolutionizing investment processes and strategies. However, none of these benefits will be possible without robust, reliable data that is available to applications on a timely, and eventually real-time, basis.
Achieving this level of performance will require a broad simplification of legacy IT infrastructures. This will include both horizontal simplification, in which firms consolidate processes and systems like order management systems (OMS) across asset classes and geographies, and vertical simplification, in which firms streamline and integrate applications from front to middle to back office. It will also require the wholesale modernization of legacy technology, with firms utilizing the cloud and componentized applications to remake the IT framework incrementally, function by function and system by system.
What does a simplified data management infrastructure look like? For starters, it features a single point of entry and storage for all data. This single source feeds data into the various applications and systems that make up the organization. All these applications and systems operate on a common messaging protocol that allows easy movement of data. Achieving seamless and fully traceable data flows across the organization is a monumental step. In legacy infrastructure, different systems speak different languages, and moving data from one to another requires a translator in the middle. Introducing a common messaging protocol is like getting a team of French, Russian and German speakers to agree to use English as a common language for business.
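The single-source-plus-common-protocol pattern described above can be sketched in a few lines. The envelope fields and system roles here are hypothetical, chosen only to illustrate the idea of one authoritative store and one shared message format that all consumers accept without translation.

```python
import json

GOLDEN_SOURCE = {}  # single point of entry and storage for all data

# The common "messaging protocol": one envelope every system agrees on.
REQUIRED_FIELDS = {"entity_id", "field", "value", "version"}

def publish(record: dict, subscribers) -> None:
    """Validate against the shared envelope, store once, fan out everywhere."""
    if not REQUIRED_FIELDS <= record.keys():
        raise ValueError(f"missing fields: {REQUIRED_FIELDS - record.keys()}")
    key = (record["entity_id"], record["field"])
    GOLDEN_SOURCE[key] = record   # one authoritative copy...
    payload = json.dumps(record)  # ...serialized in one shared format
    for deliver in subscribers:
        deliver(payload)          # no per-system translator in the middle

# Illustrative consumers, e.g. a risk feed and a settlement feed:
received = []
subscribers = [received.append, received.append]
publish({"entity_id": "TRD-1", "field": "qty", "value": 500, "version": 1},
        subscribers)
```

Because every consumer reads the same validated envelope from the same store, a downstream discrepancy can only mean a downstream bug, not a translation mismatch; that is the property that eliminates most reconciliation work.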
Implementing these changes is no small task. The transformation of a firm's data management must be approached as just one component of a bigger initiative to simplify and modernize the entire infrastructure. However, the potential benefits make it more than worthwhile. By establishing a single source of data and facilitating the traceable movement of data across the organization whenever and wherever needed, firms will eliminate much of today's heavy data reconciliation burden. They will also greatly reduce the risk that incorrect or out-of-date data results in faulty investment or business decisions, or even causes the firm to run afoul of regulators or suffer reputational damage. Finally, they will put in place the foundation for next-generation strategies built on automation, cloud computing and artificial intelligence that will power financial services firms in the future.