About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Schroders Takes Citadel’s CADIS For Pricing and Analytics


Global asset manager Schroders plc is implementing Citadel Associates’ CADIS data management system for pricing and analytics. Schroders is the fourth investment manager to take CADIS in recent months, with others including AMVESCAP and the Canada Pension Plan Investment Board (Reference Data Review, November 2006).

Robin Strong, Citadel’s head of sales and marketing for EMEA, says its “Investment Solutions Layer” proposition is finding favour among asset management firms for functions such as pricing and analytics which are not necessarily best supported by centralized data warehouses. “While a data warehouse is good for some requirements, it is not good for others,” he says. “A data warehouse doesn’t enable different parts of the business to operate at different speeds.”

Taking pricing as an example, Strong contends that for a trading desk timeliness is the most important factor. "There usually isn't time to load everything into a data warehouse and still get the prices to the trader's desktop for the start of the trading day. Typically, the traders would rather have the best data possible on time, with an indication of anything that is suspect. They can always refresh stale prices themselves." By contrast, other functions – such as reporting – require exact prices and valuations and will wait until the data is complete and valid.
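The trade-off Strong describes – deliver everything on time with suspect items flagged for traders, but release only complete, current data to reporting – can be sketched roughly as follows. This is an illustrative sketch only, not CADIS functionality; the `Price` type, `STALE_AFTER` threshold and function names are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Assumed staleness threshold for the example: anything older than
# one business date is flagged as suspect (not a CADIS default).
STALE_AFTER = timedelta(days=1)

@dataclass
class Price:
    instrument: str
    value: float
    as_of: date  # date the price was last observed

def snapshot_for_traders(prices, business_date):
    """Deliver every price immediately, marking stale ones as suspect.

    Returns (instrument, value, is_suspect) tuples so the trader sees
    the best available data on time and can refresh anything flagged.
    """
    return [
        (p.instrument, p.value, business_date - p.as_of > STALE_AFTER)
        for p in prices
    ]

def snapshot_for_reporting(prices, business_date):
    """Reporting waits for validity: only current prices are released."""
    return [
        (p.instrument, p.value)
        for p in prices
        if business_date - p.as_of <= STALE_AFTER
    ]
```

The same underlying price store serves both consumers; only the release rule differs, which is the point Strong is making about destinations.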

There is a requirement, therefore, for a data management infrastructure capable of accommodating multiple golden copies and providing different views of data for different destinations – one view for a trader, one for risk management, and another view of data that is validated, signed off and approved for client reporting. "A lot of data management systems don't consider the destinations for data: the definition of good data is not just dependent on the source, but also on each destination, i.e. whether it's the trading desk or the client/regulatory reporting function," Strong says. "Each destination can cope with different levels of timeliness, completeness, accuracy and auditability."
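The idea that each destination declares its own tolerances could be expressed as a per-destination policy table like the sketch below. The destination names, policy fields and thresholds are all hypothetical illustrations of the concept, not a real CADIS configuration.

```python
# Hypothetical per-destination quality policies: "good data" is defined
# by the destination's own tolerances, not by the source alone.
DESTINATION_POLICY = {
    "trading_desk":     {"max_staleness_days": 1, "require_signoff": False},
    "risk_management":  {"max_staleness_days": 2, "require_signoff": False},
    "client_reporting": {"max_staleness_days": 0, "require_signoff": True},
}

def is_good_for(destination, staleness_days, signed_off):
    """Return True if a record meets the named destination's policy."""
    policy = DESTINATION_POLICY[destination]
    if staleness_days > policy["max_staleness_days"]:
        return False
    if policy["require_signoff"] and not signed_off:
        return False
    return True
```

A day-old, unapproved price would pass for the trading desk under this sketch but fail for client reporting, which demands current, signed-off data.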

Many investment management firms that have implemented big data warehouses are finding they are not dynamic enough or performant enough for some functions, for example getting prices and valuations to the trader's desktop by the start of the day, he reckons. "Because we can slice up our solution aligned with business functions, for example just pricing or analytics, we can run it in parallel to existing solutions as a 'fast track', and back-feed less time critical data later. So if a data warehouse isn't delivering in a certain area, the investment manager can identify the highest priority requirements and put in place a new solution to meet them, while also knowing that solution is scalable and can be extended to cover other functions as required in the future."

Because projects such as these tend to be aligned to the business structure of the organisation – for example the pricing and analytics team – implementations are less risky, according to Strong. “We don’t have to spend time getting consensus across multiple departments, because all the people involved in each phase of the project already work together, making implementations more succinct and generating better return on investment.”

The trend towards centralising pricing data will continue as firms seek to cut costs and avoid paying data vendors multiple times for the same data, Strong believes, but he adds: “It remains necessary and possible to override prices at multiple different points and to deal with non-listed and internally priced securities. There are strong reasons to centralise in order to achieve consistency, but also a need to decentralise to ensure input from different areas of expertise within the organisation: many firms have specialists on pricing complex bonds, credit default swaps et cetera.”
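The balance Strong describes – a centralised price with the ability to override it at multiple points from different areas of expertise – amounts to a source-precedence rule. The sketch below is one assumed way to model it; the precedence order, source names and audit output are illustrative, not the CADIS design.

```python
# Illustrative precedence: a specialist desk's override beats an
# internal model, which beats the centralised vendor feed. Recording
# the winning source preserves auditability over corrections.
SOURCE_PRECEDENCE = ["specialist_override", "internal_model", "vendor"]

def resolve_price(candidates):
    """candidates: dict mapping source name -> price (absent if none).

    Returns (price, source) for the highest-precedence source present,
    so business users can see which input won and why.
    """
    for source in SOURCE_PRECEDENCE:
        price = candidates.get(source)
        if price is not None:
            return price, source
    raise ValueError("no price available from any source")
```

Keeping the precedence in data rather than buried in code speaks to Strong's point in the next paragraph: the rule stays visible and can be changed without unpicking application source.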
The correction of prices has to be done in an environment that gives business users visibility over what has been done, plus the ability to vary this processing as requirements change, Strong says. "Many data management systems, both commercial and in-house, end up with the business logic buried in large SQL scripts or C++ code. If prices are wrong, or the logic and workflows need to be changed, it is hard for business users to determine what is done currently and how it can be modified. The real value in a data management platform comes when the business can rapidly adapt to change, and this requires a sophisticated graphical user interface that can be interpreted by users and business analysts, without having to unpick application source code," he concludes.
