
SmartStream’s New Recruit Stewart on DClear’s Performance Thus Far: Metrics and Multi-Tenant Architecture


In his new role as sales director for SmartStream’s DClear business, ex-GoldenSource exec Hugh Stewart has been tasked with raising the profile of the reference data utility solution as it extends its reach into areas such as corporate actions, standing settlement instructions and legal entity data. As revealed by Reference Data Review back in November, the vendor is keen to extend DClear’s coverage beyond its current remit of securities master data, holiday calendar data and some corporate actions data, while also deepening its coverage of those existing data sets.

The initial focus for Stewart in his new role will be on raising DClear’s visibility in the market; quite a challenge when none of the clients currently using the solution are willing to discuss their experiences publicly. However, Stewart indicates that these firms, which include a large sell-side institution, a hedge fund and an exchange, are willing to talk to SmartStream’s prospective customers behind closed doors.

“We have measured the metrics against which these early adopters wished to judge the success of their experience with DClear,” he says, “and this data is available to them to prove the benefits of cleaner data to their organisations.” That experience is now being passed on to DClear’s next tranche of prospects.

Moreover, the vendor is adding new data items, such as new instruments and exchanges, to DClear on a continuous basis in order to increase its appeal to the market. To facilitate all of this, it has created an internal data platform, which sits alongside its data dictionary, to collect data from multiple sources. It has also added algorithms to its data hub to check for data completeness and correctness, for example verifying that dates and times correspond to valid working days in specific countries and correcting holiday calendar data accordingly.
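The article does not spell out how those completeness and correctness checks work, but the working-day validation mentioned above can be illustrated with a minimal sketch along the following lines. The calendar data, record fields and function names are assumptions made purely for illustration, not SmartStream’s implementation.

```python
from datetime import date

# Toy holiday calendars keyed by country code. The dates and the structure
# are illustrative assumptions, not DClear's actual calendar content.
HOLIDAYS = {
    "GB": {date(2011, 4, 29), date(2011, 12, 26)},
    "JP": {date(2011, 1, 3), date(2011, 5, 3)},
}

def is_valid_working_day(d: date, country: str) -> bool:
    """True if the date is a weekday and not a listed holiday for the country."""
    if d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        return False
    return d not in HOLIDAYS.get(country, set())

def check_record(record: dict) -> list:
    """Flag a record whose settlement date is not a valid working day."""
    issues = []
    if not is_valid_working_day(record["settlement_date"], record["country"]):
        issues.append(
            "%s: %s is not a working day in %s"
            % (record["id"], record["settlement_date"], record["country"])
        )
    return issues

# Example: 26 December 2011 is a listed GB holiday, so the record is flagged.
print(check_record({"id": "TRADE-1", "settlement_date": date(2011, 12, 26), "country": "GB"}))
```

In practice such checks would run against every incoming feed behind the data hub, but the principle is the same: validate each field against reference calendars and flag exceptions rather than passing them through silently.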

The addition of legal entity data is bound to be something of a challenge for the vendor, given the requirement for complex hierarchical data sets to be linked to each other, but Stewart indicates that a lot of work is going on in the background to accomplish this. The index space is also on SmartStream’s radar for the near future, although details are not yet forthcoming.
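To give a sense of what linking hierarchical entity data involves, the sketch below shows a hypothetical parent/child structure of the kind legal entity records need to capture; the record layout and identifiers are invented for illustration and say nothing about how DClear actually models legal entities.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LegalEntity:
    """Hypothetical legal entity record linked to its immediate parent."""
    identifier: str
    name: str
    parent: Optional["LegalEntity"] = None
    children: List["LegalEntity"] = field(default_factory=list)

    def ultimate_parent(self) -> "LegalEntity":
        """Walk up the parent links to the top of the hierarchy."""
        node = self
        while node.parent is not None:
            node = node.parent
        return node

# A tiny three-level hierarchy: group -> operating bank -> branch.
group = LegalEntity("GRP-001", "Example Group plc")
bank = LegalEntity("BNK-001", "Example Bank Ltd", parent=group)
branch = LegalEntity("BRN-001", "Example Bank, New York Branch", parent=bank)
group.children.append(bank)
bank.children.append(branch)

print(branch.ultimate_parent().name)  # -> Example Group plc
```

Real entity hierarchies are considerably messier than this, with multiple ownership relationships and frequent corporate restructurings, which is what makes the linkage work non-trivial.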

As noted back in November, the vendor has also opted to include a sub-set of DClear data in the rest of the SmartStream portfolio of products. The plan is to develop a joined-up approach to the reference data space by linking the exceptions-focused solution set with the utility’s data repository by the end of the third quarter of this year.

In terms of DClear’s prospects overall, Stewart is confident that the vendor will end the year with more clients on the platform because of the appeal of a ‘pay as you play’ managed service in the current cost-conscious environment. The shorter implementation cycle for connection to the platform and the ability to measure success with defined metrics should add to its appeal, he continues. “People are chopping enterprise data management (EDM) projects into chunks and dealing with them at the department level, although this needs to be balanced with a view of the organisation’s requirements as a whole,” he says.

As for the bigger picture, Stewart is unfazed by regulatory developments across the globe aimed at compelling greater standardisation of data formats: “Standardisation will always be a pragmatic issue and the environment will remain diverse. The fact of the matter is, standardisation can be talked about endlessly at conferences but downstream applications have codes and data formats hardwired into their systems and this means symbology cross references are needed.”
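Stewart’s point about hardwired codes is, in effect, an argument for symbology cross-reference tables. A minimal sketch of such a mapping follows; the identifiers, scheme names and the translate function are invented for illustration and are not drawn from DClear.

```python
from typing import Optional

# Toy cross-reference table mapping a single instrument across identifier
# schemes. Every value is invented; real mappings come from licensed sources.
XREF = [
    {"isin": "GB0000000001", "sedol": "0000001", "ric": "EXMP.L", "bbg": "EXMP LN Equity"},
]

def translate(value: str, from_scheme: str, to_scheme: str) -> Optional[str]:
    """Return the equivalent identifier in another scheme, if a mapping is held."""
    for row in XREF:
        if row.get(from_scheme) == value:
            return row.get(to_scheme)
    return None

# A downstream application hardwired to RICs can still consume ISIN-keyed data:
print(translate("GB0000000001", "isin", "ric"))  # -> EXMP.L
```

The cross-reference table is what lets downstream systems keep their hardwired formats while upstream sources standardise however they like.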

He adds that many of the recent decisions around standardisation are likely to have been taken as a result of political rather than economic or practical considerations. In the meantime and regardless of the outcome, there is a job to be done: “The introduction of new standards also causes more diversity in data.”

Regulation is also proving to be a positive force, Stewart adds. “External regulatory requirements and internal audit and risk management requirements are asking firms to prove they are in control of their data. The need for audit trails and transparency for analysis means the underlying data is very important.”

Stewart is also a fan of moves towards more openness around proprietary standards (Bloomberg’s open symbology initiative being one example) because they make the process of mapping to those standards much easier. “If the contractual obligations around proprietary data are lessened, that means there is less permissioning required,” he says.

As for the impending sale of SmartStream itself, Stewart indicates that it is very much business as usual for the vendor. “There is a sense of calmness about the sale and SmartStream is continuing upon its own path of momentum,” he says.

