The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit: A Keynote on Data Utilities

Data management utilities could cut the cost of data, ease the burden of in-house data management and deliver data that meets the requirements not only of today’s regulations, but also of emerging ones.

Presenting a keynote speech at this week’s A-Team Group Data Management Summit – which for the first time was linked to the company’s Intelligent Trading Technology Summit – in New York, Joseph Turso, vice president at SmartStream, argued the case for data utilities and detailed the Central Data Utility developed and run by SmartStream in conjunction with Euroclear.

Turso set the scene with a quick review of the past 20 years of data management and its sorry legacy: data stuck in siloed databases that are difficult to integrate and expensive to sustain. He said: “After the financial crisis, the mandate changed. Data management had to be improved and more ETL tools were used, but they didn’t get us where we needed to go. With $125 billion spent on data every year, the drivers behind a new approach to data management include timely trading decisions, best execution and compliance with regulatory requirements. The problem is that the wish list has to be achieved at reduced cost.”

Offering a solution to the problem, Turso suggested data utilities can come into play in data processing and data management operations. Users don’t lose control of their data as they continue to control vendor relationships and contracts; data from different vendors is not commingled but mapped to a common, consolidated data model; and efficiencies are provided by the utility fixing any data issues once for the benefit of all users.

Countering perceptions that utilities curtail the flexibility of data distribution, Turso touched again on user control and the ability to customise data distribution to different platforms, avoiding the complexity of doing this in-house and delivering significant cost savings.

Turning to the structure of the SmartStream Central Data Utility, Turso described a bottom layer that centrally manages data cleansing and mapping for all clients, a variable layer that can be used by clients on an individual basis to define the data integration and cross-referencing they want, and a top layer that can be customised by clients for data distribution.
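The three-layer structure Turso described can be sketched in code. This is an illustrative Python sketch only, not SmartStream’s actual implementation: all class, function and field names here are hypothetical, chosen to mirror the layers described above (a shared bottom layer for cleansing and mapping, a per-client variable layer for cross-referencing, and a customisable top layer for distribution).

```python
from dataclasses import dataclass

@dataclass
class SecurityRecord:
    """Common, consolidated data model; vendor data is mapped, not commingled."""
    identifier: str
    vendor: str
    price: float

def cleanse_and_map(raw_feeds):
    """Bottom layer: cleansing and mapping managed centrally for all clients.

    Data issues are fixed once here, for the benefit of every user.
    """
    records = []
    for feed in raw_feeds:
        for row in feed:
            records.append(SecurityRecord(
                identifier=row["id"].strip().upper(),  # normalise identifiers
                vendor=row["vendor"],                  # vendor lineage preserved
                price=float(row["price"]),             # cleanse to a typed value
            ))
    return records

def cross_reference(records, client_rule):
    """Variable layer: each client defines its own integration rules."""
    return [r for r in records if client_rule(r)]

def distribute(records, fmt):
    """Top layer: client-customised distribution to downstream platforms."""
    if fmt == "csv":
        return "\n".join(f"{r.identifier},{r.vendor},{r.price}" for r in records)
    return records
```

A client would pass raw vendor feeds through `cleanse_and_map` once, apply its own `client_rule` in `cross_reference`, and pick a distribution format in `distribute` — keeping control of its vendor data while sharing the cleansing effort.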

He concluded: “The utility normalises, cleanses, maps and distributes data, providing clients with cost savings, improved time to market and, in turn, improved business performance.”
