
A-Team Insight Blogs

Data Management Summit Innovation Showcase Presents The Investment Data Utility


Join us tomorrow at our London Data Management Summit to find out how to make data an opportunity rather than a risk, what still needs to be done to ensure Markets in Financial Instruments Directive II (MiFID II) compliance, how to make the most of alternative data and next generation analytics, the latest technologies adding a new dimension to data management – and more!

Looking at emerging technologies and their vendors, the Summit will include a data innovation showcase. We caught up with Robin Strong, founder and CEO of The Investment Data Utility, ahead of the event to find out how his company can help resolve some of today’s data management challenges.

Q: What data management problems do financial institutions have that you believe you can solve?

A: Unknown, probably poor, data quality across the business, the inability to measure data quality versus peer groups, and incorrect data that is not spotted until it’s too late – failed trades, skewed risk reports, compliance breaches and so on.

Q: Why do financial institutions have these problems?

A: Typically, because data comes from disparate sources and is processed by different systems in different ways. Even a well-produced ‘gold copy’ lacks any meaningful comparison point to assess its quality.

Q: How do you solve these problems?

A: I have developed crowdsourced, collaborative technology that allows data to be compared across institutions, generating an industry benchmark that identifies incorrect data before the business uses it.
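The interview doesn't disclose the proprietary algorithms, but the core consensus idea can be illustrated with a minimal sketch: each participating firm contributes its value for a given data field, a consensus benchmark is derived from the pooled values, and firms whose value deviates are flagged before the data reaches downstream systems. The function name, firm identifiers, and sample data below are all hypothetical.

```python
from collections import Counter

def benchmark_field(values_by_firm):
    """Derive a consensus benchmark for one data field from the values
    reported by participating firms, and flag firms that deviate from it.

    values_by_firm: dict mapping an (anonymised) firm id to its value.
    Returns (consensus_value, agreement_ratio, outlier_firm_ids).
    """
    counts = Counter(values_by_firm.values())
    consensus, votes = counts.most_common(1)[0]
    agreement = votes / len(values_by_firm)
    outliers = [firm for firm, value in values_by_firm.items()
                if value != consensus]
    return consensus, agreement, outliers

# Hypothetical example: four firms' gold-copy maturity dates for one bond.
reported = {
    "firm_a": "2027-06-15",
    "firm_b": "2027-06-15",
    "firm_c": "2027-06-16",   # deviates from the benchmark - flagged
    "firm_d": "2027-06-15",
}
consensus, agreement, outliers = benchmark_field(reported)
# consensus == "2027-06-15", agreement == 0.75, outliers == ["firm_c"]
```

A real implementation would of course need anonymisation, weighting by data-source reliability, and handling of fields where legitimate variation exists; this sketch only shows why a cross-firm comparison point catches errors that no single firm's internal reconciliation can.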

Q: What technology do you use?

A: A set of proprietary algorithms implemented using low-cost standard software tools linking to a central processing cloud.

Q: How does your solution fit into a financial institution’s architecture and data flows?

A: The beauty of this model is that it does not impinge on existing toolsets, workflows and governance processes. It can be added as an additional layer in the architecture at low cost, resulting in very high ROI.

Q: Which emerging technologies do you see as having the most potential to improve data management and why?

A: I am a big believer in industry standards, yet so many firms reinvent the wheel with proprietary tools to store, reconcile and govern data. Industry collaboration is key: unless operational costs are reduced, new entrants will establish lower-overhead models and undercut the competition.

