
Schroders Takes Citadel’s CADIS For Pricing and Analytics

Global asset manager Schroders plc is implementing Citadel Associates’ CADIS data management system for pricing and analytics. Schroders is the fourth investment manager to take CADIS in recent months, with others including AMVESCAP and the Canada Pension Plan Investment Board (Reference Data Review, November 2006).

Robin Strong, Citadel’s head of sales and marketing for EMEA, says its “Investment Solutions Layer” proposition is finding favour among asset management firms for functions such as pricing and analytics which are not necessarily best supported by centralised data warehouses. “While a data warehouse is good for some requirements, it is not good for others,” he says. “A data warehouse doesn’t enable different parts of the business to operate at different speeds.”

Take pricing as an example: for a trading desk, timeliness is the most important factor, Strong contends. “There usually isn’t time to load everything into a data warehouse and still get the prices to the trader’s desktop for the start of the trading day. Typically, the traders would rather have the best data possible on time, with an indication of anything that is suspect. They can always refresh stale prices themselves.” By contrast, other functions – such as reporting – require exact prices and valuations and will wait until the data is complete and valid.
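To make the “best data possible on time” idea concrete, the sketch below (in Python, with hypothetical names; it is not CADIS functionality) delivers every available price by the desk’s deadline and simply flags anything stale, rather than holding the whole batch back for late or suspect items.

```python
# Illustrative only: deliver whatever prices exist by the trading-desk deadline,
# marking suspect items instead of blocking the whole snapshot.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Price:
    instrument_id: str
    value: float
    as_of: date            # date of the underlying quote

def prices_for_trading_desk(prices: list[Price], business_date: date,
                            max_age_days: int = 1) -> list[dict]:
    """Return every price immediately, flagging stale ones rather than waiting."""
    cutoff = business_date - timedelta(days=max_age_days)
    snapshot = []
    for p in prices:
        snapshot.append({
            "instrument_id": p.instrument_id,
            "value": p.value,
            "suspect": p.as_of < cutoff,   # stale price: the trader can refresh it
        })
    return snapshot
```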

There is therefore a requirement for a data management infrastructure capable of accommodating multiple golden copies and providing different views of data for different destinations – one view for a trader, one for risk management, and another view of data that is validated, signed off and approved for client reporting. “A lot of data management systems don’t consider the destinations for data: the definition of good data is not just dependent on the source, but also on each destination, i.e. whether it’s the trading desk or the client/regulatory reporting function,” Strong says. “Each destination can cope with different levels of timeliness, completeness, accuracy and auditability.”
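One way to read this is as a destination-keyed quality policy. The following sketch is purely illustrative – the destination names, policy fields and thresholds are assumptions, not CADIS configuration – but it shows how the same record can be fit for one consumer and not another.

```python
# Hypothetical destination-aware validation: the same record is judged against
# different tolerances depending on where it is going.
from dataclasses import dataclass

@dataclass(frozen=True)
class QualityPolicy:
    max_staleness_days: int      # timeliness
    require_complete: bool       # completeness
    require_sign_off: bool       # auditability

POLICIES = {
    "trading_desk":     QualityPolicy(max_staleness_days=1, require_complete=False, require_sign_off=False),
    "risk_management":  QualityPolicy(max_staleness_days=1, require_complete=True,  require_sign_off=False),
    "client_reporting": QualityPolicy(max_staleness_days=0, require_complete=True,  require_sign_off=True),
}

def fit_for(destination: str, staleness_days: int, complete: bool, signed_off: bool) -> bool:
    """Decide whether a record may be published to the given destination."""
    policy = POLICIES[destination]
    return (staleness_days <= policy.max_staleness_days
            and (complete or not policy.require_complete)
            and (signed_off or not policy.require_sign_off))
```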

Many investment management firms that have implemented big data warehouses are finding the warehouses are not dynamic or performant enough for some functions, such as getting prices and valuations to the trader’s desktop by the start of the day, he reckons. “Because we can slice up our solution aligned with business functions, for example just pricing or analytics, we can run it in parallel to existing solutions as a ‘fast track’, and back-feed less time-critical data later. So if a data warehouse isn’t delivering in a certain area, the investment manager can identify the highest-priority requirements and put in place a new solution to meet them, while also knowing that solution is scalable and can be extended to cover other functions as required in the future.”
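A rough illustration of the ‘fast track’ pattern described above, under assumed function and field names: time-critical records are published straight to the business function, while everything else is queued to back-feed the existing warehouse later.

```python
# Illustrative sketch: split a batch into records published now and records
# deferred for the warehouse. Field and function names are assumptions.
def route_records(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Separate time-critical records from those that can be loaded later."""
    fast_track = [r for r in records if r.get("needed_by_start_of_day")]
    back_feed = [r for r in records if not r.get("needed_by_start_of_day")]
    return fast_track, back_feed

def run_cycle(records: list[dict], publish, warehouse_load) -> None:
    """Publish time-critical data first; back-feed the rest when convenient."""
    fast_track, back_feed = route_records(records)
    publish(fast_track)          # e.g. push prices to the trading desk now
    warehouse_load(back_feed)    # less time-critical data loaded later
```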

Because projects such as these tend to be aligned to the business structure of the organisation – for example the pricing and analytics team – implementations are less risky, according to Strong. “We don’t have to spend time getting consensus across multiple departments, because all the people involved in each phase of the project already work together, making implementations more succinct and generating better return on investment.”

The trend towards centralising pricing data will continue as firms seek to cut costs and avoid paying data vendors multiple times for the same data, Strong believes, but he adds: “It remains necessary and possible to override prices at multiple different points and to deal with non-listed and internally priced securities. There are strong reasons to centralise in order to achieve consistency, but also a need to decentralise to ensure input from different areas of expertise within the organisation: many firms have specialists on pricing complex bonds, credit default swaps et cetera.”

The correction of prices has to be done in an environment that gives business users visibility over what has been done, plus the ability to vary this processing as requirements change, Strong says. “Many data management systems, both commercial and in-house, end up with the business logic buried in large SQL scripts or C++ code. If prices are wrong, or the logic and workflows need to be changed, it is hard for business users to determine what is done currently and how it can be modified. The real value in a data management platform comes when the business can rapidly adapt to change, and this requires a sophisticated graphical user interface that can be interpreted by users and business analysts, without having to unpick application source code,” he concludes.
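As a closing illustration of that point, here is a minimal sketch – assumed names throughout, not a representation of any vendor’s product – in which price overrides are plain data records carrying who, when and why, and validation rules are declarative entries a business analyst can read and amend without digging into SQL or C++ code.

```python
# Illustrative only: overrides as auditable records, rules as readable data.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Override:
    instrument_id: str
    new_price: float
    user: str
    reason: str
    timestamp: datetime = field(default_factory=datetime.now)

# Declarative rules: each entry can be reviewed and changed without touching
# application code. Each check receives the proposed price and the previous one.
VALIDATION_RULES = [
    ("positive_price",     lambda new, prev: new > 0),
    ("max_day_move_20pct", lambda new, prev: prev is None or abs(new - prev) / prev <= 0.20),
]

class PriceStore:
    def __init__(self) -> None:
        self.prices: dict[str, float] = {}
        self.audit_log: list[Override] = []   # full history of manual corrections

    def apply_override(self, override: Override) -> None:
        """Apply a manual price correction, recording who changed what and why."""
        prev = self.prices.get(override.instrument_id)
        for name, check in VALIDATION_RULES:
            if not check(override.new_price, prev):
                raise ValueError(f"Override rejected by rule '{name}'")
        self.prices[override.instrument_id] = override.new_price
        self.audit_log.append(override)
```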
