The knowledge platform for the financial technology industry

A-Team Insight Blogs

Case Study: Henderson Global Centralizes Using DISL Integral


Henderson Global Investors, a mid-sized U.K. investment management firm, is about a third of the way through an almost three-year project to revamp its data management procedures across the board. With its back office outsourced to Cogent, Henderson’s focus was on the data used to support its front and middle offices, including the investment management, performance measurement and risk functions.

According to Dominic Shine, head of IT products and development at Henderson, the project’s chief aims were to improve data quality; rationalize and reduce market data spending, especially through softing; improve timeliness of delivery, particularly overnight processing time; and generate consistent daily and month-end reporting data. Shine was speaking at Osney Media’s Financial Information Management conference a few weeks ago. Henderson was also seeking to centralize risk, performance measurement and client reporting data. Significantly, it wanted to shift responsibility for, and ownership of, data from the IT function to the business.

Certainly, the business side understood the value of the data it used in its operations. Shine quoted a Henderson fixed-income fund manager: “Without accurate data, we might as well put it all on the 2.30 at Cheltenham!” (Cheltenham, for those of you who don’t know, is a British racecourse.)

The task facing Shine and his team was daunting. “Every front-office system had its own database and associated write-arounds,” he said. The target data architecture would transform this organically grown state of affairs into a centralized single repository that would be home to all the local data stores required to run Henderson’s various processes. By managing the data centrally, Shine said, Henderson hoped to improve service to the applications.

Buy vs. Build

To achieve its aims, Henderson opted to buy rather than build the solution itself. In its evaluations, Henderson sought to ensure that the chosen solution had a track record in the investment management data model area, complete with appropriate interfaces, a centralized data storage capability, the possibility of high levels of customization and flexible reporting capabilities, all at a reasonable cost. Henderson selected the Integral platform from Digital Innovation Systems Ltd. (DISL), which it felt met its requirements.
Now approximately six to nine months into the project, with 18 to 24 months to go, a number of goals have been achieved. According to Shine, Henderson has implemented the central repository and basic reporting capability. It has completed the asset matching engine and developed an enhanced attribute population engine.

As the project progresses, Shine said, Henderson aims to interleave the best attributes of the various services to create a golden copy. “All the services are good at different things,” he said. “We are able to set rules to optimize the data, and then allow manual overrides. Then, these are stored as rules, which leads to the best possible solution for downstream applications.”

The asset matching system takes data from sources such as Bloomberg, various index data providers and internal systems. The asset matching engine checks the identity of data elements and throws up any exceptions for the administration team to handle.

Henderson is now focusing on a number of outstanding items, including improved data feed delivery to front-office systems, enhanced reporting capability, centralized fund and benchmark holdings data, and the repointing of front-office applications to the DISL Integral central repository. It also remains to centralize historical transaction data and historical risk and performance data, and to consolidate data stores for client reporting.

Senior management support for the project has been essential, Shine said. From a governance perspective, a steering group was assembled to make the big decisions, comprising the chief operating officer, the IT director, the operations director and Shine himself. The steering group reports into the Henderson executive. A data working group, comprising representatives from the front office, compliance and data administration, performed the work on the project.
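The golden-copy process Shine describes — per-attribute rules that pick the best source, stored manual overrides that take precedence, and identifier mismatches raised as exceptions for the administration team — follows a common pattern. The sketch below is purely illustrative Python, not DISL Integral’s actual design; all names (`RULES`, `OVERRIDES`, `build_golden_copy`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SecurityRecord:
    source: str          # e.g. "bloomberg", "index_vendor", "internal"
    identifier: str      # e.g. an ISIN
    attributes: dict

# Per-attribute source priority: the first listed source holding a value wins.
RULES = {
    "coupon":   ["bloomberg", "internal"],
    "maturity": ["bloomberg", "index_vendor"],
    "sector":   ["index_vendor", "bloomberg"],
}

# Manual overrides, stored so they persist and are reapplied on each rebuild.
OVERRIDES: dict = {}   # keyed by (identifier, attribute)

def build_golden_copy(records):
    """Merge vendor records for one security into a single golden record."""
    identifiers = {r.identifier for r in records}
    if len(identifiers) != 1:
        # Asset matching failure: throw up an exception for the admin team.
        raise ValueError(f"identifier mismatch across sources: {identifiers}")
    isin = identifiers.pop()
    by_source = {r.source: r.attributes for r in records}
    golden = {}
    for attr, priority in RULES.items():
        if (isin, attr) in OVERRIDES:        # stored manual override wins
            golden[attr] = OVERRIDES[(isin, attr)]
            continue
        for source in priority:              # otherwise apply source priority
            value = by_source.get(source, {}).get(attr)
            if value is not None:
                golden[attr] = value
                break
    return isin, golden

records = [
    SecurityRecord("bloomberg", "GB0001",
                   {"coupon": 5.0, "maturity": "2030-06-01"}),
    SecurityRecord("index_vendor", "GB0001",
                   {"sector": "Utilities", "maturity": "2030-07-01"}),
]
isin, golden = build_golden_copy(records)
# "maturity" is taken from Bloomberg because the rule ranks it first
```

The key design point is that an override is not a one-off edit but a stored rule, so each daily rebuild reproduces the same corrected golden copy for downstream applications.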
Of the total resource allocated to the project, 20%, or approximately 10 people, was sourced from the data area, 25% from distribution and corporate services, 50% from investments and the remaining 5% from other departments.

In conclusion, Shine said the project will support future developments at the firm, based on the growing trends he sees in the business. These include demands to increase client reporting frequency, demands to move from period-end to daily performance attribution (which may have implications for fund managers’ bonus calculations), increasing demand for CRM and investment management data, new modeling tools for specialist fund manager boutiques, increased demands for better compliance and regulatory reporting, demand for data for client-facing websites, and the ongoing drive toward STP and the reduction of operational errors. All of these trends require accurate, timely and well-organized data, Shine said. The project allows the firm to assemble the data required quickly and leaves application development resources free to focus on functionality.

