About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Daiwa Securities Completes Reference Data Project


Daiwa Securities SMBC Europe has completed the reference data phase of its ongoing project to modernise its middle and back office, resulting in a single consolidated data model for counterparty and securities data. The project has been underpinned by the extended use of the GoldenSource data management platform.

During the first phase of the three-year project, the London operation put in place an infrastructure to support systems integration, data integration and business process automation, as well as data consolidation. Future phases, should they be approved, will include consolidating all P&L, positions and trades into one database, generating general ledger postings, and replacing a 20-year-old, internally developed mainframe GL, accounting and settlement system.

Graeme Muirhead, executive director at Daiwa, says an important conclusion of the strategic review of middle and back office systems undertaken in 2001/2002 was that, where possible, it would buy systems off the shelf. “We decided we needed to become excellent at systems and data integration. We also concluded that we need to be focused on data, not functionality, because we can generally buy functionality off the shelf. A focus had to be business process automation, for those processes that make us unique in the marketplace, rather than the commonplace processes such as, for example, settling UK equities.”

The project was initiated in order to cope with rising volumes, the addition of new products and the increasing demand for improved process control and transparency. The creation of a single data model has resulted in a number of benefits for Daiwa, says Muirhead. “Securities and counterparties are linked by the issuer,” he says. “An additional driver for the project was to facilitate more accurate credit risk calculation; to have a greater ability to create a hierarchy of the legal entities that we are dealing with. This required us to fully appreciate all of the roles that our counterparties are playing – customer, issuer, broker, custodian, et cetera. The GoldenSource database has allowed us to create that. We have made extensive use of the GoldenSource data model for counterparties. All data – customers, issuers, settlement instructions, ETC details – is stored in the one system, and relates very closely to the securities traded. It makes sense to bring them together. If you have counterparty and securities data in two different systems, that becomes harder to do.”
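
The structure Muirhead describes can be sketched roughly as follows. This is a minimal illustration, not GoldenSource's actual data model: the class names, fields and role labels are all hypothetical, but they show the key idea that one legal entity can play several roles, securities link back to their issuer, and a parent chain supports group-level credit risk aggregation.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LegalEntity:
    entity_id: str
    name: str
    parent_id: Optional[str] = None          # legal-entity hierarchy for credit risk
    roles: set = field(default_factory=set)  # e.g. {"customer", "issuer", "broker"}

@dataclass
class Security:
    isin: str
    issuer_id: str                           # links the instrument to its issuer entity

def exposure_chain(entities: dict, entity_id: str) -> list:
    """Walk up the parent chain so exposure can be aggregated at group level."""
    chain = []
    current = entities.get(entity_id)
    while current is not None:
        chain.append(current.entity_id)
        current = entities.get(current.parent_id) if current.parent_id else None
    return chain
```

With counterparty and security records in one store, resolving a security's issuer and that issuer's full legal-entity chain is a pair of lookups rather than a cross-system reconciliation.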

Having selected middleware from Tibco, Daiwa reviewed several standing data systems before selecting technology from what was then FTI, now GoldenSource. At that time, the GoldenSource solution was not “fully developed”, says Muirhead, but it was the system with the data model that best met Daiwa’s requirements. Daiwa’s decision to consolidate counterparty, client and securities master data in one system was relatively unusual then, he says. “We wanted a consolidated system, which would be the only place in the company where static data could be changed, with Tibco managing the publishing of that data out to trading, middle and back office systems.”

The project commenced in April 2003. Daiwa first consolidated counterparty data from numerous front and back office systems. It also created a new, paperless account opening and counterparty management process, with electronic workflow provided by Tibco, and the GoldenSource system managing the data in its various states as it goes through the process. The counterparty phase of the project went live in May 2005, and as that phase drew to a close, Daiwa and GoldenSource began work on the instrument side.

Daiwa has created a three-way hierarchy system to manage data feeds from its Tokyo head office, Bloomberg and Telekurs. The Tokyo feed contains consolidated Japanese data. Telekurs was chosen for its strength in corporate actions, and Bloomberg for the quality of its descriptive and pricing data. “The business wanted to implement scenarios whereby if a piece of data is not available from the primary feed, the system automatically goes to the back-up,” Muirhead says. In a phase of the project which began in Summer 2005 and finished in June 2006, Daiwa worked with GoldenSource to enable a three-way data feed hierarchy to be established according to instrument type. For example, for Japanese equities, the Tokyo feed is at the top of the hierarchy, followed by Bloomberg and then by Telekurs. Daiwa also worked with GoldenSource to develop the connectors to Bloomberg and Telekurs. “At the outset, the GoldenSource product was new and wasn’t perfect but we overcame the problems together,” Muirhead says.
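The fallback behaviour Muirhead describes – go to the back-up feed when the primary does not supply a piece of data – can be sketched as below. The vendor names come from the article; the hierarchy table, field names and function signature are illustrative assumptions, not Daiwa's implementation.

```python
# Hypothetical per-instrument-type vendor preference, highest priority first.
# Per the article, the Tokyo feed leads for Japanese equities.
HIERARCHY = {
    "japanese_equity": ["tokyo", "bloomberg", "telekurs"],
    "corporate_bond": ["bloomberg", "telekurs", "tokyo"],
}

def resolve_field(instrument_type: str, field_name: str, feeds: dict):
    """Return (vendor, value) from the highest-priority feed that has the field.

    `feeds` maps vendor name -> dict of field values received for an instrument.
    Falls through the hierarchy until a vendor supplies the field.
    """
    for vendor in HIERARCHY.get(instrument_type, []):
        value = feeds.get(vendor, {}).get(field_name)
        if value is not None:
            return vendor, value
    return None, None  # no vendor in the hierarchy supplied this field
```

The same resolution logic serves every instrument type; only the ordering in the hierarchy table changes, which is what makes adding a further vendor technically straightforward even if, as Muirhead notes, it is unlikely to be needed.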

While it would be possible to add another data vendor into the hierarchy, Muirhead doubts Daiwa would ever want to. “What we have discovered is that the two vendors are generally very similar. Where the main differences take place tends to be at the very beginning of the life of a security or corporate action: Bloomberg and Telekurs act differently at that stage and the system has to be cognizant of that, but after a few days, the data tends to settle down.”

Neill Vanlint, head of EMEA professional services at GoldenSource, says one challenge was learning about the quality of the data that comes in from the different vendors. “A lot of it is a question of timing. The data ends up accurate, but sometimes a vendor announces something before it has all the details, and this can make the data hard to match. We had to get used to the data matching, the timing issues and the data quality issues, and establish which problems were being caused by the system and which by the data.” Matching rules were adjusted based on operational realities. “You might get data from one vendor two days before you get it from the other. A way to manage this is to allow only the primary vendor to create a new instance; otherwise, you risk creating an exception from the outset.”
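Vanlint's rule – only the primary vendor may create a new instance, so that an early-arriving secondary record does not spawn an unmatched duplicate – can be sketched as follows. The function and its states ("created", "matched", "exception") are an illustrative reading of the article, not GoldenSource's API.

```python
def apply_vendor_update(master: dict, key: str, vendor: str, record: dict, primary: str) -> str:
    """Apply one vendor record to the security master.

    - If the instrument already exists, the vendor's data is matched in.
    - Only the primary vendor may create a new instrument instance.
    - A secondary vendor arriving first is flagged as an exception for review,
      rather than creating a record that would later fail to match.
    """
    if key in master:
        master[key][vendor] = record
        return "matched"
    if vendor == primary:
        master[key] = {vendor: record}
        return "created"
    return "exception"
```

This ordering rule absorbs the timing differences Vanlint mentions: if one vendor announces a security two days before the other, the second feed simply matches into the existing record when it arrives.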

In order to adhere to its budgets for buying data from Bloomberg and Telekurs, Daiwa has opted to operate a securities universe comprising only instruments traded during the past three months. Each day, the position-keeping system notifies the GoldenSource system of the securities with open trades or positions, and these are added to the universe; at the end of the month, instruments that have not been traded in the past three months are dropped and no longer maintained daily. Vanlint says GoldenSource already had the concept of the securities universe within its application, but that Daiwa and the vendor worked together to fully develop the capability.
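
The daily-add, month-end-prune cycle can be sketched as below, assuming a roughly 90-day window for "the past three months". The function signature and data shapes are illustrative, not part of the GoldenSource product.

```python
from datetime import date, timedelta

def refresh_universe(universe: set, open_positions: set, last_traded: dict,
                     today: date, window_days: int = 90) -> set:
    """Maintain the rolling securities universe.

    Daily: instruments with open trades or positions join the universe.
    Month-end: instruments not traded within the window (and with no open
    position) are dropped, so data is no longer bought for them daily.
    """
    universe = universe | set(open_positions)
    if (today + timedelta(days=1)).day == 1:  # last calendar day of the month
        cutoff = today - timedelta(days=window_days)
        universe = {isin for isin in universe
                    if last_traded.get(isin, date.min) >= cutoff or isin in open_positions}
    return universe
```

Capping the universe this way keeps per-security vendor charges proportional to what the firm actually trades, which is the budget discipline the article describes.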

Muirhead admits that consolidating and cleansing static data is a “very unglamorous” job – “it’s like fixing the plumbing in your house; no one wants to do it, but nonetheless it has to be done in order to allow the business to develop in the future,” he says. However, having addressed standing data, Daiwa can now leverage its efforts to improve its capabilities in the middle and back office, allowing the company to become more responsive when taking on new business opportunities. “The project has enabled us to transact more volumes because we have automated quite a number of processes that were manual before. We are now considering very seriously going on to the next stage. We have embarked upon a clear strategy, and are moving in a concerted way towards it, and we are taking a long term view of the future,” he concludes.

